Who are CellVoyant?

CellVoyant is a biotechnology company that predicts stem cell differentiation using live cell microscopy and artificial intelligence. We use this approach to optimise and unlock human tissue manufacturing for research and therapeutic applications. We aim to understand and solve important health issues, make a long-lasting positive impact on society and change the world.

We spun out from the Carazo Salas lab at the University of Bristol in 2022 and are backed by venture capital firms who were the earliest investors in DeepMind, Exscientia, Recursion, Wayve, and Abcam.

What we’re looking for

As a Machine Learning Engineer, you will be an integral part of the Machine Learning team (who build, integrate, test and scale our computer vision for cell fate prediction), taking ML models and bringing them into production. You will work in tandem with our Engineering team (who build our infrastructure and internal applications) and our Biology team (who design and run our cell biology experiments for microscopy data collection and differentiation protocol development).
You will be an experienced Machine Learning Engineer with a solid knowledge of ML models for computer vision and an understanding of modern ML frameworks to deploy these models into the production environment.
You will build software that the Biology team uses to capture, store, and analyse microscopy data, and that the Machine Learning team uses to build, integrate, test and scale ML features for production.

What you’ll do in this role

– Work as an integral member of our AI/ML team to define and build our internal training, testing, and serving infrastructure.
– Establish the best models to apply to our data, and own how we engineer them into production.
– Help decide on the best ML frameworks (e.g. PyTorch, Detectron).
– Work closely and collaboratively with the wider team on project deliverables.
– Define and build internal applications and data products for our Biology team, with a focus on scalable and reproducible workflows. This includes a platform for storing, visualising, and analysing high-resolution microscopy data and tracking all experimental data.

The interview process

– Initial phone screen (30 mins)
– At-home technical test
– Technical interview reviewing the test results and follow-up (90 mins)
– Leadership, Culture & Behaviours Interview (90 mins)
– Meet the Team (onsite, including tour; 60 mins)


What you'll need

– At least 2–3+ years' experience as a Machine Learning Engineer within a real-world data environment, taking ML models all the way into production.
– Solid knowledge of ML models for computer vision, and an understanding of modern ML frameworks to deploy these models into the production environment.
– Experience of working within a collaborative, multi-functional team environment.
– Experience of Python programming and ML frameworks such as PyTorch.
– Expertise with cloud infrastructure, distributed systems, continuous integration systems and unit testing.
– High standards for clear, actionable communication of data-driven processes.

Nice to haves

– Experience in a biotechnology company or a background in biology is not necessary, but a genuine interest in this area will be of benefit.
– Experience working with data at scale.
– Experience of, and a desire for, researching relevant scientific publications.
– Experience working on AI-first software products, data platforms, and machine learning infrastructure tools, ideally in the context of biotechnology.


What we offer

– Join at the ground level to work at the cutting edge of artificial intelligence, stem cell biology, empirical experiment automation, and cell therapy development.
– An inclusive, collaborative and intellectually stimulating culture that puts science at the forefront of everything we do.
– A dynamic, diverse, and inclusive team of experienced and interdisciplinary scientists applying their skills to some of the most impactful problems in human health.
– Competitive salary and founding equity compensation.
– Ability to work remotely or in our Bristol, UK headquarters, and join bi-annual week-long company offsites.