From months to weeks: how The BioForge is using AI to unblock the slowest part of drug development
Interview with Keerthi Prasad Venkataramanan, CEO and founder, The BioForge
Estimated reading time: 9 minutes

Bioprocess development sits between discovery and clinical trials, an iterative, empirical stretch of work that turns a promising molecule into something a manufacturer can reproduce at scale. It rarely makes headlines. It is also where many rare disease therapies quietly stall. Keerthi Prasad Venkataramanan, founder of The BioForge, believes machine learning can compress the bottleneck from months into weeks, making therapies for the smallest patient populations economically viable in the process.
Keerthi Prasad Venkataramanan first encountered a bioreactor as an undergraduate in India, working on a final-year research project. The image that formed then has shaped his career since: a bioreactor as the bridge between an idea on a laboratory bench and a treatment that could one day reach a patient. “Bioreactors provide a controlled, reproducible way of carrying out the production of cells, protein biologics, DNA,” he says. He went on to a PhD in biotechnology focused on process development, a postdoc in systems biology and a career across industrial biotechnology and pharma.
Across that career, one problem kept reappearing. The discoveries that excited the field happened at small, high-throughput scale. Translating them into something manufacturable, at the right quality, the right yield and reliably enough to enter the clinic, was an entirely different challenge, and the field was solving it slowly and expensively. He founded The BioForge two years ago to attack that bottleneck directly, applying artificial intelligence and multi-objective optimisation to shorten the loop between an experiment and a useful answer.
The slowest part of drug development
Bioprocess development is the work of figuring out how to grow, harvest and purify a biological product so that what comes out at the end is the same molecule that worked at the bench, only in far greater quantities. Some of its slowness is unavoidable. “If you are making antibodies with CHO cells, it takes 10 to 14 days in the main tank to produce,” Keerthi explains. “Then there is growing the seed, so the actual process may be more like a month, and then you have to analyse the samples, which could add another week or two.” Microbial systems run faster, but the principle is the same: biology keeps its own clock.
What can be changed, Keerthi argues, is the iterative nature of the work. Process development is empirical. What works for one product rarely transfers to the next. Teams run experiment after experiment, varying media, temperature, pH, oxygen, feed strategy, learning a little each time. The dominant approach is design of experiments (DOE), a family of statistical methods that select factors and levels to test. Keerthi is careful with his criticism: “DOE is rigorous and statistically grounded, but it is not inherently designed to optimise toward a specific outcome under complex biological constraints,” he says. In a system as complex as a living cell, the gap between a sound design and a useful result translates into more experiments, more cost and more time.
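For contrast, the simplest DOE layout, a two-level full factorial design, fixes every run up front before any data comes back. A minimal sketch in Python (the factor names and levels are invented for illustration, not taken from any real process):

```python
from itertools import product

# A two-level full factorial design: every combination of the chosen
# factor levels is tested, and the whole plan is fixed before any
# experimental results arrive.
factors = {
    "temperature_c": [30.0, 37.0],
    "ph":            [6.5, 7.2],
    "feed_strategy": ["bolus", "continuous"],
}

# 2 levels x 3 factors -> 2**3 = 8 runs, chosen in advance.
design = [dict(zip(factors, levels)) for levels in product(*factors.values())]

for run in design:
    print(run)
```

The design is statistically sound, but note that nothing in it responds to results: adding a factor doubles the run count, and a surprising outcome in run 2 cannot redirect runs 3 through 8. That rigidity is the gap Keerthi is pointing at.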

When sound design does not guarantee a useful answer
The BioForge’s platform, BioOptima AI, takes the opposite starting point. Instead of asking which factors to vary, it begins with the objective: produce more cells of the right quality and the right state, given these constraints. Rather than analysing experiments after the fact, BioOptima AI determines what experiment should be run next under real-world laboratory constraints. The optimisation algorithms then work backwards, treating each round of experiments not as a one-shot answer but as a step that informs the next.
“Each set of experiments is not a one-stop solution, but it paves the way for the next best experiment,” Keerthi explains. The platform is dynamic, agnostic to cell type and product modality, and able to work with whatever number of conditions a client can physically test in a given cycle. In many cases, teams converge toward optimal conditions within three adaptive cycles.
It also reframes a common limitation as a feature. Conventional approaches struggle when there is little prior data; BioOptima AI is designed to explore from a cold start.
“Limited data is not a disadvantage,” Keerthi says. “It can explore the conditions and find the optima.”
For contract research organisations, where data from one client cannot legally be reused for another, that capability matters. BioOptima AI operates independently of hardware and integrates across scales, from microtiter plates to bioreactors.
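The adaptive loop Keerthi describes (propose a batch of conditions, test it, let the results steer the next batch) can be sketched in miniature. Everything below is illustrative: the toy yield function, the factor ranges and the simple window-narrowing heuristic are invented for this example, and stand in for The BioForge's proprietary, far more sophisticated optimisation algorithms.

```python
import random

# Toy "bioprocess": an unknown yield as a function of temperature and pH.
# In reality each evaluation is a days-long wet-lab run; here it is a stub
# with a hidden optimum at 31 degrees C and pH 6.8.
def run_experiment(temp_c, ph):
    return 100.0 - ((temp_c - 31.0) ** 2) / 4.0 - ((ph - 6.8) ** 2) * 10.0

def adaptive_cycles(n_cycles=3, batch_size=8, seed=0):
    """Each cycle tests a batch of conditions, then narrows the search
    window around the best result so far: each batch paves the way for
    the next rather than being a one-shot answer."""
    rng = random.Random(seed)
    # Feasible lab ranges for the two factors.
    t_lo, t_hi, p_lo, p_hi = 25.0, 40.0, 5.5, 8.0
    best = None
    for _ in range(n_cycles):
        batch = [(rng.uniform(t_lo, t_hi), rng.uniform(p_lo, p_hi))
                 for _ in range(batch_size)]
        results = [(run_experiment(t, p), t, p) for t, p in batch]
        cycle_best = max(results)
        if best is None or cycle_best > best:
            best = cycle_best
        # Halve the search window around the current optimum, clipped to
        # the feasible ranges, before planning the next cycle.
        _, t_star, p_star = best
        t_half, p_half = (t_hi - t_lo) / 4, (p_hi - p_lo) / 4
        t_lo, t_hi = max(25.0, t_star - t_half), min(40.0, t_star + t_half)
        p_lo, p_hi = max(5.5, p_star - p_half), min(8.0, p_star + p_half)
    return best

yield_, temp, ph = adaptive_cycles()
print(f"best yield {yield_:.1f} at {temp:.1f} C, pH {ph:.2f}")
```

The contrast with the factorial design is the point: the experiment budget per cycle is whatever the lab can physically run, and each cycle's results reshape where the next cycle looks, so useful answers can emerge from a cold start with no prior data.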
From a ten-month problem to a six-week answer
The clearest illustration comes from the company’s first client, a startup that had spent eight to ten months trying to scale up plasmid DNA production for a cell and gene therapy programme. Plasmid DNA carries a particular difficulty. Many of the sequences used in cell and gene therapy come from human or mammalian biology, and the bacterial hosts used to manufacture them tend to delete those sequences, producing something that is not what was designed. The team needed to solve two problems at once: make the right product, and make more of it.
Using BioOptima AI, the client reached a solution in four weeks across three rounds of optimisation, then troubleshot scale-up in two further weeks. Six weeks, in total, against eight to ten months of prior effort. The product subsequently launched and the company expanded its business to operate as a contract development and manufacturing organisation, applying the same scaled platform to other developers. To date, The BioForge has completed seven deployments with design partners, spanning multiple organisms and product modalities, optimising for yield, titre, product quality and impurity profiles. These engagements are structured as platform deployments, where BioOptima AI drives experimental decision-making within the client’s existing lab infrastructure.
A platform built for the batch of one
In rare disease, the implications sharpen. Conventional bioprocess development assumes a patient population large enough to absorb months of optimisation work. For an ultra-rare condition, or for a precision medicine designed for a single patient, that assumption falls apart. Keerthi sees this as a structural opportunity. “What we are doing in process development is already at a personalised medicine level,” he says, “because for each drug molecule you have to develop a process. If you take the same rare disease and you have different patients you have to test, a process developed for one patient need not work for the other.”
A platform that can compress a development cycle from months to weeks, and that improves with each iteration, changes what is feasible. Keerthi points to recent guidance from the US Department of Health and Human Services permitting development pathways for therapies aimed at a single patient as a meaningful shift.
“These efforts, there is nothing in vain,” he says. “It is just finding something that works better for a given patient.”
Keerthi is in active conversation with academic spin-outs and early-stage developers working on cell and gene therapies, including in adeno-associated virus (AAV) production, where he sees particular scope to improve transfection efficiency and reduce the proportion of empty vectors that drive immunogenic responses in patients. This is particularly relevant for modalities like AAV and plasmid DNA, where process conditions directly impact efficacy, safety and manufacturability.

The economics that decide who gets treated
Cost of goods is the other side of the same argument. Cell and gene therapies currently sit at six- and seven-figure prices per dose, putting them out of reach for many patients. The BioForge’s aim of reducing cost of goods by up to 50% rests on two levers: fewer development iterations, which lower the cost of reagents and assays, and improved process efficiency, which reduces the bill of materials per unit produced. Early deployments have shown significant reductions in experimental burden and development timelines. “It really tilts health equity against the people who need them the most,” Keerthi says of current pricing. Reducing the development burden does not solve that problem on its own, but it is one of the few levers that can be pulled early enough to matter.
On data integrity, a non-negotiable in a regulated industry, the platform operates on a cloud architecture with strict client segregation. The processes developed for a given client become that client’s intellectual property; The BioForge retains its algorithms. The work is positioned at the early stage of process development, where the constraints of downstream manufacturing can be built in from the start, reducing the need for re-engineering during tech transfer.
Getting to the success stories faster
Keerthi returns often to the “valley of death” in biotech, the gap between a product that works in a lab and a process that scales to the clinic. It is the gap his platform was built to close.
“AI has been doing a really great job in discovering new therapies, but biology becomes a limit,” he says. “Some of those discovered therapies are not really manufacturable. We have seen AI impact discovery. Now it is time for AI to impact R&D towards bio-manufacturing.”
Even failure, he adds, is acceptable on those terms, as long as it is the right kind. “How can we fail faster, and get to those success stories much quicker?”
For the families waiting at the other end of the development pipeline, that question is not abstract. It is the difference between a therapy that exists and one that does not.
To learn more, please visit: thebioforge.com
Connect with Keerthi
In the Spotlight profiles innovative companies working in the RARE space. To access more In the Spotlight articles, click below.
