Accelerating Drug Development with Microphysiological Systems (MPS)

Current preclinical testing using assays based on cancer cells and animal models often results in high failure rates for drug candidates in the clinic. Microphysiological systems like Altis’s RepliGut™ human intestinal tissue model more accurately reflect human physiological environments, enabling more efficient and cost-effective drug discovery and development.

Using Human Tissue for Drug Screening

The inadequacies of current preclinical drug development methods have left the industry with an unacceptable clinical trial failure rate and nearly $200 billion spent every year on research and development. A key factor driving clinical failures is the reliance on tumor cell lines, such as Caco-2 cells, and on animal models in preclinical testing. As a result, ineffective drugs too often reach clinical trials, while potentially efficacious medicines do not. An alternative and more promising approach to drug discovery, preclinical testing, and toxicity screening is the use of human tissue models.

Combining Engineering, Chemistry, and Physics

Altis Biosystems was spun out from the University of North Carolina at Chapel Hill; our company’s goal is to apply engineering, chemistry, and physics to new strategies that lead to successful, faster drug development. We adapted methods that the semiconductor and electronics industries use to build small-scale devices in an efficient, scalable, and cost-effective manner, seeking to marry this approach with stem cell technology.

We focused on the large and small intestines because they play crucial roles in the absorption and metabolism of drugs. Studies have shown that the bacterial composition of the colon can modulate immune responses and may play a role in many diseases, which also contributed to our interest.

Starting Simple

Capitalizing on research on growing organoids from human intestines, Altis has developed a next-generation intestinal platform for in vitro testing during drug development. The platform produces a layer of human intestinal stem and differentiated cells — either of the large or small intestine — that can be used for compound screening, disease modeling, and microbiome research.

RepliGut™ tissue constructs are polarized monolayers that express tight junction proteins and can be tailored to include stem/progenitor cells, differentiated cells, or both, representing all major cell lineages in physiologic ratios. Each tissue sample in the RepliGut™ kit features a patent-pending biomimetic scaffold that separates RepliGut™ cells from the cassette’s porous membrane and allows the cells to survive for prolonged periods. Luminal and basal reservoirs allow compounds and additional cell types to interact with the epithelial cells for side-specific assays.

Providing Diversity

With our donor bank comprising tissue from multiple donors of different demographic backgrounds, Altis is able to make realistic models that recapitulate many different physiologies and thus provide a faithful representation of the diversity of the human population. We have developed a suite of platforms and assays tailored for specific applications to investigate drug absorption, transport, toxicity, and inflammatory cytokine production.

Our commercially available RepliGut™ system is compatible with the vast majority of assays commonly used in the biopharmaceutical industry. Gene expression, protein expression, cytokine production, permeability, transport, toxicity, inflammatory response, and other attributes can be evaluated using ELISA, PCR, transepithelial electrical resistance (TEER), immunofluorescence, mass spectrometry, microscopy, and other techniques.
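As a point of reference for one of those readouts, TEER values reported in Ω·cm² are conventionally normalized by the insert’s growth area so that measurements are comparable across 6-, 12-, 24-, and 96-well formats. A minimal sketch of that arithmetic follows; the growth area and resistance values are illustrative assumptions, not RepliGut™ specifications:

```python
# Unit-area TEER for a Transwell-style insert: subtract the blank
# (membrane-only) resistance, then multiply by the insert's growth area
# so that readings are comparable across plate formats.

def teer_ohm_cm2(measured_ohms: float, blank_ohms: float, area_cm2: float) -> float:
    """Return transepithelial electrical resistance in ohm * cm^2."""
    if measured_ohms <= blank_ohms:
        raise ValueError("measured resistance must exceed the blank reading")
    return (measured_ohms - blank_ohms) * area_cm2

# Illustrative numbers only: a ~0.33 cm^2 insert reading 600 ohms against
# a 100-ohm blank.
print(round(teer_ohm_cm2(600.0, 100.0, 0.33), 2))  # → 165.0
```

Because the blank membrane’s resistance is subtracted first, the reported value reflects only the epithelial barrier itself.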

Differentiated from Other Organs-on-a-Chip

Unlike most other human intestinal tissue models, the RepliGut™ platform includes both stem cells and differentiated cells. The presence of stem cells is important for evaluating tissue repair and drugs to treat cancer. The RepliGut™ platform can also include cells that secrete hormones, cytokines, and other biochemicals involved in the proper function — and dysfunction — of the large and small intestine. Most other models do not have this capability. We also include mature enterocytes, goblet cells, and enteroendocrine cells. These combined features result in a more sophisticated system than other tissue models, allowing for more faithful predictions of the impact of drug candidates.

Fits into Existing Workflows

In addition to providing a human tissue model for the large and small intestines that faithfully recapitulates the behavior of these important organs from a wide array of demographic groups and for many different applications, Altis has focused on developing a platform technology that is easy for researchers to use. 

Unlike many microfluidic devices that require the purchase of large, specialized instruments and the connection of tubes and pumps, the RepliGut™ system has been designed in a footprint that slots into existing workflows that the biopharmaceutical industry already uses. 

The RepliGut™ kit includes a Transwell multiwell plate comprising 6, 12, 24, or 96 wells, with each well containing a Transwell insert with an individual tissue sample. Each kit also includes the materials needed to culture the cells, including stem cells and media. To populate the device, the stem cells are placed on the scaffold, and a few media changes are completed via pipetting. The tissue structure self-assembles into the lineages populating the intestinal epithelium.

Moving to the Next Level

Microbiome-based therapeutics is a rapidly growing field that would benefit greatly from access to better preclinical testing methods. It is clear that bacteria in the colon influence many aspects of human physiology, from mentation and satiety to metabolism and response to chemotherapeutics.

Education of the immune system takes place to some extent in the gut. Incorporating a functioning immune system into the RepliGut™ epithelium will in the future facilitate exploration of the mechanisms involved in immune education and the impact of different microbes.

The intestine also plays a significant role in the nervous system. One of the largest sites for the production of serotonin — a neurotransmitter — is in the gut. Combining aspects of the nervous system into the RepliGut™ system is an exciting avenue currently under investigation at Altis.

Leaky gut is a significant problem from the neonate to the adult. A robust and reliable human tissue model that enables the investigation of this phenomenon and drugs that can prevent or treat gut damage holds tremendous promise. 

Developing RepliGut™ models specifically designed to enable investigation of treatments for colon cancer presents yet another opportunity for Altis. Colon cancer is the third leading cause of cancer death in both men and women.1

In the near term, we at Altis have much interesting work ahead of us as we add other types of tissues and features to the RepliGut™ human tissue model for the large and small intestines. Our hope is to eventually supplant old tumor cell line (Caco-2) models with models like our RepliGut™ platform that more accurately reflect what occurs in the human body, enabling more efficient and accurate drug discovery and development.

Reference

  1. “Key Statistics for Colon Cancer.” American Cancer Society, 29 June 2020.

Originally published on PharmasAlmanac.com on September 29, 2020.

Accelerating Drug Development with AI-Enabled 3D Spatial Biology

As drug developers face increasing pressure to rapidly and efficiently develop new medicines for a wide range of unmet medical needs, pathology remains an essential activity in understanding the safety and efficacy of drugs, but it is inherently limited by manual methods, analog 2D analysis, and subjective interpretation. To overcome these limitations and transform pathology to better meet the needs of drug developers, Alpenglow Biosciences is pioneering 3D spatial biology, which combines the company’s unique imaging technology with cloud computing and AI analysis. In this Q&A, Alpenglow founder and CEO Nick Reder, M.D., discusses the history of pathology in drug development, the promise of 3D spatial biology, and the collaborations and other work the company is pursuing to realize its vision, with Pharma’s Almanac Editor in Chief David Alvaro, Ph.D.

David Alvaro (DA): To begin, can you walk me through the origins of Alpenglow Biosciences and the technology and vision on which the company was founded?

Nicholas Reder (NR): Alpenglow is a 3D spatial biology company. We are transforming the field of pathology from 2D analog glass slides to 3D direct-to-digital intact tissue imaging with AI and saving the tissue for downstream sequencing analyses, which truly represents a sea change in the way tissue is analyzed. The project began when I was a first-year pathology resident at the University of Washington. I approached a mechanical engineer who had just arrived on campus (Dr. Jonathan Liu) and asked if he could build a flatbed scanner for tissue that allowed us to put tissue directly on the scanner, scan it, and get a digital image instead of having to go through the long, arduous process of making a slide and then digitizing it.

We started there, working with Dr. Adam Glaser, a post-doc in the UW Department of Mechanical Engineering working under Dr. Liu. We were building a microscope and associated software that realized that vision and provided a new 3D view of tissue. We were very proud of that accomplishment, but we realized that if we didn’t commercialize it, there would just be one lab at the University of Washington talking about how cool the technology was. We’d probably generate a lot of great publications, but we wouldn’t really make an impact on the field or for patients.

In late 2018, we spun the company out under the name Lightspeed Microscopy. We originally selected an outside CEO, Steve Buckley, to run the company, but after about six months, we mutually concluded that the company would be better served by a scientific founder who knew the technology and the market very well, and so I took over as CEO. We raised some money from Hamamatsu and Washington Research Foundation Capital and started to work, generated some traction, and picked up some customers and some government grants.

In June 2021, when we were a three-person company, we raised a $4 million round of financing from Dynamk Capital, which really accelerated our business. From 2021 to 2022, we grew our revenue tenfold and expanded from a team of three to a team of 21. We recently formed a partnership with Mayo Clinic, and we’re starting to take our technology into clinical trials, which was always one of the dreams. The Dynamk money really catalyzed a lot of our success, and we are starting to see our technology be adopted around the world and in clinical settings, which is really exciting.

DA: Can you discuss the role of spatial biology in drug development to date and the limitations or pain points of using exclusively 2D imaging?

NR: Decades ago, looking at tissue under a microscope was almost a check-the-box exercise in drug development rather than a highly valued part of the process. Then the sequencing wave hit and was transformative, but it raised expectations that further techniques could learn even more from tissue. It became very clear that there is a wealth of potential information that could influence drug development beyond what can be gleaned by looking at 2D slides or by grinding up tissue for assays in which everything is dissociated.

The field of spatial biology really took form about 10 years ago when we started to multiplex and add more and more biomarkers to 2D slides. Over the past few years, it has really gained a lot of traction, and people have started to realize that the tissue is telling a story, but we have only been reading the first chapter of the book. We feel that our contribution to the field is allowing researchers to read the whole book. 3D allows a comprehensive view of the tissue and its story and reinforces how we have only scratched the surface to date.

DA: Had there been any previous attempts to transition from 2D to 3D imaging?

NR: Within our field, we use a specific type of microscope called a light-sheet fluorescence microscope to look at tissue in 3D, and we first use tissue clearing to make the tissue transparent. There is a cool history of these technologies and their combination that dates back to the 1920s. So, this isn’t a new thing — people have been looking at tissue in 3D for a century now, but there have been practical limitations of the technology. Our technology reflects a fortuitous confluence of advancements in camera optics and computing, moving from the level of visual observation to scanning a piece of tissue, reconstructing it digitally, and using algorithms to quantify different aspects of the tissue: that’s something that hasn’t been possible until the advent of GPUs (graphics processing units) and sCMOS (scientific complementary metal-oxide-semiconductor) cameras over the past decade.

This confluence of enabling technologies makes things possible on a much broader scale. Within academic environments, this combination of technologies — light-sheet imaging and computing — has been around since the mid-2000s and gained a lot of headway in developmental biology and neuroscience. But it hasn’t entered drug development, let alone clinical diagnostics. At Alpenglow, we believe that one of the main reasons for the lack of wider adoption is that all of these technologies were designed for one-off experiments. But when one needs to, for example, test a drug on 20 animals, examine all their livers, and reach an answer fast to make a decision on the viability of a program — that’s where our technology is really enabling.

DA: Can you expand a little bit about how AI and cloud computing come into play here and why they are critical enabling technologies in your work?

NR: The way I look at things is that AI is simply algorithms that help you with work that you would rather not do manually. And then performing those computations is a constant evolution — on-prem, then you go to the cloud, and the technology improves, then you go back on-prem — we’re going to keep alternating like that. We sometimes generate multiple terabytes of data per day from a single piece of tissue, which completely overwhelms current systems, and many tasks are impossible to execute manually. Consequently, we have to develop algorithms to process the images into a final, usable data set and to segment the different structures within the tissue — different types of cells, vessels, or fibrosis — which really can’t be done manually.

This hit me when I was analyzing a prostate biopsy during a study in the academic environment. I sat down with the images and circled all the cancer and then reconstructed the tissue. Ten hours later, I was done with my first data set, but I knew that there was no way that I would do another 1,199 in the same way. That’s when we started to develop AI methods.

DA: What do you envision as the pragmatic impact that the technology can have on drug discovery and development, in terms of time and cost but also data quality and so on?

NR: Our challenge has been that the range of possible applications is incredibly broad. We’re trying to focus on specific use cases where the value proposition is obvious and then build from there. One example of what we’ve done is in immuno-oncology, looking at structures called tertiary lymphoid structures (TLS), which are the result of the immune system recognizing a tumor and essentially camping out to create a factory to attack the tumor, and they’re associated with better prognosis. When you take only a 2-dimensional cut of a tumor, it can be hard to tell if something is just a cluster of immune cells or one of these TLS factories fighting the tumor. In 3D, you can view the interconnectedness, which has broad implications for predicting which patients would benefit from immune therapy and potentially for iterating on therapies and developing combination therapies.
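The connectivity argument here can be illustrated with a toy voxel example: two clusters that look separate in a single 2D section merge into one structure once the third dimension is considered. This is a hedged sketch with invented coordinates, not Alpenglow’s pipeline:

```python
from collections import deque

# Toy voxel mask: two "islands" in the top slice (z = 0) that are joined by
# a bridge one slice deeper (z = 1) -- separate in 2D, one structure in 3D.
voxels = {(0, 0, 0), (0, 0, 3),                          # z = 0: two isolated voxels
          (1, 0, 0), (1, 0, 1), (1, 0, 2), (1, 0, 3)}    # z = 1: connecting bridge

def components(points, neighbors):
    """Count connected components via breadth-first search."""
    unseen, count = set(points), 0
    while unseen:
        count += 1
        queue = deque([unseen.pop()])
        while queue:
            p = queue.popleft()
            for q in neighbors(p):
                if q in unseen:
                    unseen.remove(q)
                    queue.append(q)
    return count

def n6(p):   # 6-connectivity in 3D (face neighbors)
    z, y, x = p
    return [(z+1,y,x), (z-1,y,x), (z,y+1,x), (z,y-1,x), (z,y,x+1), (z,y,x-1)]

def n4(p):   # 4-connectivity within a single 2D slice
    z, y, x = p
    return [(z,y+1,x), (z,y-1,x), (z,y,x+1), (z,y,x-1)]

top_slice = {p for p in voxels if p[0] == 0}
print(components(top_slice, n4))  # 2 clusters in the 2D section alone
print(components(voxels, n6))     # 1 structure once the third dimension is added
```

The same logic is why a TLS can masquerade as scattered immune-cell clusters in a single section but reveal itself as one interconnected structure in 3D.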

Another use case that is more narrowly defined but has a very clear value proposition is in working with skin biopsies for atopic dermatitis. Normally, trials are conducted based on subjective clinical symptoms: Is the itch going away? But to really understand whether a drug is working, you want to be able to visualize the skin tissue actually changing in response to the drug. We’ve been working on looking at the peripheral nervous system in 3D, because it is essentially hidden in 2D, where the structures are so thin and complex. When we image those skin biopsies in 3D, you can see the intricate network of nerves innervating the skin, and you can see changes with disease and with treatment and actually quantify and support the efficacy of your drug.

DA: I understand the strong need to focus on some clear use cases for now, but I’m curious: how do you view the full scope of possible applications of your technology?

NR: The way we see it: anytime there is tissue, whether it’s from an animal or from a human, you’re probably going to want to use our technology. There is no reason not to use 3D in most cases. We’ve done a lot of work in animal models and with human tissue, and we see uses spanning the entire gamut. We place our devices around the world, in basic research settings as well as in clinical laboratories and clinical research groups.

When data flow through our microscope, we constantly feed our algorithms to improve both the data processing and the data analysis — the more tissue that comes in, the better those algorithms get. Eventually, we want to cover so much tissue that we really develop an excellent understanding of the spectrum of disease and how different drugs could treat different diseases and build predictive algorithms to examine a 3D structure of a biopsy and generate not only a diagnosis but a prognosis and treatment recommendation.

DA: Is there a translation barrier to take this new kind of information and align it with the more traditional way of interrogating these questions?

NR: I wouldn’t say there’s a translation barrier because we can still look at a 2D section of that data and either virtually or physically cut a section. Because we’re viewing the tissue in 3D, we can potentially create a new gold standard for many diseases or situations. So, then, the question becomes: what gold standard do we want to create?

In one example, we were looking at celiac disease biopsies. In celiac disease, there is destruction of the small intestine: instead of presenting countless villi, the lining becomes flattened, which leads to malabsorption and nutrient deficiencies.

We started to image these samples in 3D, and we found that an old scoring system was not only variable but misleading. Now we can measure the surface of that biopsy in 3D, but we have to make some choices about how to measure it — volume or surface area? What’s the best way to reassign this gold standard?
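The trade-off between the candidate metrics can be made concrete with a toy voxel model: a flattened slab and a villus-like column can enclose the same volume while exposing very different surface areas. The shapes and sizes below are invented for illustration and have no anatomical calibration:

```python
# Two candidate metrics from the same binary voxel mask: volume (voxel count)
# and surface area (count of voxel faces exposed to background). A flattened
# epithelium and a villous one can match in volume yet differ sharply in
# surface area, which is why the choice of metric matters.

FACES = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]

def volume(mask):
    return len(mask)

def surface_area(mask):
    # A face is exposed when the neighboring voxel is not part of the mask.
    return sum((x + dx, y + dy, z + dz) not in mask
               for (x, y, z) in mask for (dx, dy, dz) in FACES)

flat   = {(x, y, 0) for x in range(4) for y in range(4)}   # 4x4x1 slab
villus = {(0, 0, z) for z in range(16)}                    # 1x1x16 column

print(volume(flat), volume(villus))              # 16 16 -- identical volumes
print(surface_area(flat), surface_area(villus))  # 48 66 -- different surfaces
```

In this toy case, a volume-based score would call the two shapes identical, while a surface-area score separates them, mirroring the flattened-versus-villous distinction that matters in celiac biopsies.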

DA: Are there many other groups working on these same challenges, or is Alpenglow on its own for now?

NR: I would consider us in a bit of an open space. There are quite a few other tools to image tissue in 3D, but nothing that is particularly well-suited for drug development and clinical applications. And there aren’t really any companies that go end-to-end like we do, which is both an advantage and a disadvantage. The advantage is that we don’t have as much competition, but the disadvantage of being first in the field is overcoming skepticism of a novel approach. We’re introducing a completely new paradigm, so it’s more about convincing people to adopt the new paradigm than about choosing our product over someone else’s.

DA: Have some organizations been more receptive to the new paradigm than others?

NR: It’s very dependent on the use case and the need and the value that we provide. If we can provide an answer in a really high-impact area that accelerates a drug program, and it can’t be obtained any other way, there is great appetite for working with us. If we are talking more about an incremental improvement, then some of the valid risks of adopting a new technology are salient. We have to think wisely about how we apply our technology because it can be applied in so many situations, but we really want to focus on the highest-impact, biggest needs: places where you can work with Alpenglow to do things you can’t do otherwise.

DA: In terms of the technology itself, are there still some gaps in functionality that you are working to bridge, or is it more a case of optimizing what is already available?

NR: We are mostly focused on optimization: the microscope technology is pretty much set and performing the way we want. We have some image preprocessing algorithms that are working well, but we really want to accelerate them and make some further adjustments, so they work faster, better, and more seamlessly. One area where there’s still a lot of work to do is on the analysis of the data; there are so many different, interesting things you can do and structures to segment and different ways to quantify them. That’s kind of a never-ending journey, and we’re just doing the highest-impact analyses at present.

DA: Can you speak more about your collaboration with the Mayo Clinic and how it benefits both organizations, as well as any other partnerships or collaborations that are in place or in the works?

NR: In our partnership with the Mayo Clinic, we’re placing a device in Rochester, Minnesota, within a regulatory-certified CAP/CLIA lab and employing an Alpenglow employee to run the device. We are jointly working with pharma companies to take advantage of the Mayo Clinic’s unbelievable tissue and electronic medical records resources — 25 million tissue blocks from six million patients, all with a searchable electronic database with their clinical records. When a pharma company has a thesis that they’ve developed a drug that may work in a specific situation, we can query the database for that specific disease situation, obtain approval to study the patient tissue and image it in 3D, and then provide a really targeted answer. This is a partnership that we think will be really productive in the future.

Another exciting thing about partnering with the Mayo Clinic is that once we prove our capabilities in a research setting within one of the largest and most respected clinical reference laboratories, the research study and testing can be transitioned to the clinical phase with appropriate validation. There is a truly seamless transition from asking a high-priority research question, to answering it with unique technology and unique tissue resources, to disseminating the tests and making an impact.

We’ve also partnered with a lab in Puerto Rico — CorePlus. They’re in a low-resource setting, but they are cutting-edge adopters of new technology and a really impressive laboratory operation. They have a real area of excellence in prostate cancer, and they work in other areas as well, digitizing some of their samples. They follow a model similar to our Mayo Clinic partnership, with potential for clinical translation.

DA: You mentioned earlier that analytics is the area that needs the most work. Are you looking to partner with companies that focus on data to help to build out that adjacency, or is that something that you ultimately want to tackle yourselves?

NR: So far, we’ve been doing it ourselves, because our data is so unique, and other analysis companies are not accustomed to analyzing data in 3D. We’ve had to build a lot of custom software to handle the large data sets and unique shapes and properties of the data. In the future, as we reach a critical mass of users of our technology and distribution of our microscope, we envision that we’ll partner with high-quality AI companies to analyze the data.

DA: I guess that one of the downsides of being on the cutting edge is needing to allow potential partners to catch up to the point where there could be a productive partnership.

NR: One partner on the technical side that we’ve really been happy working with so far has been Nvidia. We are early adopters and heavy users of their GPU technology, and there’s a fantastic array of new products and new capabilities coming up. We’ve pushed the limits of their technology and have reaped the benefits of some of the breakthroughs in GPU computing, both on the edge and in the cloud.

DA: Can you speak about the broader vision and what you see Alpenglow ultimately becoming, as well as the magnitude of the impact you think that a 3D spatial biology approach can have on drug development down the road?

NR: I think that 3D spatial biology will have as big an impact as next-generation sequencing (NGS) did on genetic analysis — moving from looking at a few loci at a time to the entire genome. I think that there’s a nice analogy there for looking at a single 2D section to looking at the entire tissue. Look at how NGS transformed the industry — the testing menus are incredibly extensive now, allowing you to look at so many different things on the same tissue sample.

We think that Alpenglow will be able to distribute our microscope across the world to clinical studies, where tissue biopsies are imaged using our technology to provide a better diagnosis for the patient. After you’ve digitized that tissue — which we’ve shown in publications and in our work with pharma — we can predict prognosis and response to treatment. In the same way that you can sequence a genome and identify where mutations are present, we’ve shown that you can image an entire tissue sample, understand how a tumor is growing, and determine which therapy might be a good candidate for that patient. We anticipate distribution clinically, digitization of those pathology data, and the use of those data to predict which treatment the patient would respond to — a situation that actually benefits everyone in the ecosystem: patient, provider, payer, and pharma. Our ultimate goal is to get complete alignment with all those players in the ecosystem and unlock the full potential of tissue.

DA: Once that transformative effect that you envision manifests, do you see Alpenglow being the only 3D imaging spatial biology company or the leader in a more crowded field of such companies?

NR: I think there’s going to be a benefit to us taking the risk and being the first mover in the field because it’ll be more or less a winner-takes-all situation. We are focused on clinical translation down the road, and there are a lot of barriers to clinical adoption, but if we can figure that out, then we would become a trusted vendor and a trusted partner for hospitals and clinicians. I believe that we’re well-positioned to be the ultimate leader in 3D tissue analysis.

DA: Is there anything you’d like to share about the team you’ve assembled now and the different skills they bring to bear to drive this work?

NR: Yeah, we have a really cool team with a lot of diverse skill sets. We have a hardware engineering team building our microscope. We also have a wet lab team doing cutting-edge 3D thick-tissue labeling and tissue clearing, developing new methods, and validating and making protocols robust. Our data science and software team is a big operation for us because we generate huge amounts of data, and they’re analyzing these terabyte-size data sets and developing new tools that really no one’s seen before, which is another exciting part of our organization. We recently added our commercial team, including veteran hires in 2D spatial biology who are really excited by the ability to look in 3D and have this huge impact. They’re the evangelists, led by our commercial lead Steve Pemberton, and they’ve been incredibly effective at telling that story, spreading the word, and getting people excited to work with us.

DA: Can you speak to the cost of your technology and the considerations in making it accessible to all of the diverse potential users?

NR: At heart, this is a very sophisticated microscope: a piece of capital equipment that is expensive. Most of our users are high-end laboratories in one form or another. But the partnership I mentioned with CorePlus in Puerto Rico is exploring some of those other questions, including how our technology performs and where it needs to improve to be useful to a high-throughput, resource-constrained laboratory. We’ve been working closely with them to explore how to achieve wider adoption of the technology. The price could come down, but the more important thing is how streamlined the process becomes. If you can make the laboratory more efficient, the economics follow.

We see a pathway toward ultimately increasing reimbursement for high-value tissue analysis, and that’s a way to make the currently unaffordable affordable, because if you’re adding value to the healthcare system by providing a better diagnosis and better prediction of which therapies will work, you can justify a higher reimbursement. The expense of the equipment has to be viewed in the context of what you’re providing to the healthcare system. MRIs are incredibly expensive pieces of equipment that dwarf our costs, but they’re sold widely because they’re so essential for so many different diseases. I think the same thing can happen with 3D pathology.

DA: As part of that larger vision of transforming pathology, is there anything else on your wish list — additional technologies or innovations outside the scope of what Alpenglow can work on that might also synergize with what you’re doing and help you drive your vision forward?

NR: There are a lot of platforms in the 2D spatial market that perform multiplexing with many biomarkers on the same piece of tissue or the same slide. There are inherent trade-offs in that work: the more markers you want, the more time it takes. That’s been an active area of innovation, and companies like Akoya, NanoString, and 10x Genomics have made great progress. It would be great to take a slide, stain it with a hundred proteins or a thousand RNA markers, image it in a reasonable amount of time, and digest all that data down. That’s the area that I’m monitoring most closely, because there are so many different technologies with different trade-offs, and people are starting to bend those curves — imaging more and more markers faster and faster — delivering new capabilities monthly rather than yearly.

DA: Since AI seems to come up in nearly every interview lately, I’m hoping you can share a bit about how you see the potential of the technology and the role it will play going forward.

NR: At Alpenglow, we’re firm believers in a few things around AI. First and foremost, it has to actually solve a problem. There are plenty of ways that you can apply AI, but we are focused on the ones that really add a lot of value. Another thing that we try to do is to have explainable AI as much as possible; in digital pathology, when you can’t explain what an algorithm is doing, there are just so many more risks. We use AI to automate the segmentation of the tissue and compute the features of the tissue. In our final prediction, we use more old-school machine learning, saying: We observe these features, and that’s why we’re predicting that they’ll respond to a specific therapy.

We think that’s beneficial on a few levels. One is that it engenders trust, because you can actually figure out what’s going on and confirm that the results aren’t spurious; the medical literature offers plenty of cautionary examples. Another is robustness: if you update your software or the camera, it won’t cause completely different results.

The other reason that we think it’s particularly valuable is that you can truly learn things, rather than just having a black box that spits out an answer. For example, you might observe that, in a given disease, more vessels are appearing close to the tumor, which can perhaps help with designing a new drug development program or a combination therapy. We try to be pretty thoughtful about how we use AI rather than just doing it because it’s available.

Originally published on PharmasAlmanac.com on July 11, 2023.

Collaboration is Key to the Development of Effective Bioprocess Analysis Solutions to Address Complex Data Needs

The increasing prevalence of next-generation modalities — combined with the unceasing drive to increase throughput, maximize resource use, and reduce cost — has created an evolving need for advanced in-line, on-line, and at-line bioprocess analytics solutions that provide comprehensive data on product quality attributes and critical process parameters without requiring extensive specialized operator training and skill sets. Waters Corporation collaborates with customers and biotechnology providers to develop such solutions and is currently working to bring mass spectrometry fully into both upstream and downstream process development to support their needs.

Across the biopharmaceutical industry, companies are striving to be more continuous, more modular, and more flexible during both development and manufacturing, increasing throughput and maximizing resource use to reduce the time and cost required. This desire is reflected in the new, often closed, and increasingly automated single-use solutions being adopted for clinical and commercial production.

Effective process optimization can have a tremendous impact on the manufacturing requirements in terms of number of batches, plant footprint, process time, and many other factors. For instance, high-yielding processes that generate purer products require less downstream purification and can reduce the number of batches that must be produced. There is also reduced risk of batch failure, which can have a significant impact on both scheduling and the costs associated with drug development and production. Equipment, raw materials, personnel, and time must be secured to enable completion of an additional production run. If that is not possible — in the worst-case scenario — a drug shortage could result, leading to negative impacts for the manufacturer in terms of both loss of revenue and brand credibility and, even more importantly, potentially serious consequences for patients relying on supply of their lifesaving medications.

One key to successful implementation of continuous, modular, flexible development and manufacturing solutions is access to in-line/on-line process analytical technologies (PATs) for the gathering of real-time in-process data and new rapid analytical methods and data analysis technologies that provide accurate and detailed information to developers faster and in many cases earlier in the development cycle.

Advances in analytics have come at a slower pace, however. There are certainly more in-line sensors and integrated analytics today that provide basic information and can support limited real-time decision-making, but for many modalities the available technologies remain insufficient for monitoring complex process parameters. Developers relying solely on this limited information cannot gain a full understanding of their processes, despite the benefit of having the data in real time.

One of the challenges to greater development and use of in-line/on-line analytics is the conservative nature of the biopharmaceutical industry, which tends to rely on technologies that have been previously validated and used. That is changing, however, with the U.S. FDA and other regulatory authorities encouraging drug makers to pursue more modern manufacturing practices and leverage innovative technologies that can increase efficiency, productivity, and quality while reducing cost.

However, new solutions must be easy to use and implement. PAT solutions in particular must be designed for use by operators and other personnel without the need for any extensive analytical training. They must be easy to put in place, not create additional burdens for users, and provide readily interpretable results in a rapid fashion. 

There are both obvious and less obvious benefits of having access to real-time process data that can readily inform decision-making. First, getting answers more quickly helps to accelerate development, reducing the time it takes to bring molecules to market. Gaining a richer understanding of complex processes early on in development using high-throughput lab-scale multi-parallel mini-bioreactor systems combined with advanced PAT solutions helps reduce the number of larger-scale runs required, saving time and money. Better process understanding also derisks later development and commercial-scale manufacturing operations and leads to the development of safer drug products.

Implementation of on-line versions of more complex analytical techniques, such as mass spectrometry and liquid chromatography, has not been fully achieved at this point. PAT solutions must be practical for use in the production environment, which means easy integration with bioreactors in the upstream and with tangential-flow filtration, chromatography, and other purification systems in the downstream. In addition, rapid processing of the data generated by these systems, with results linked back to the processes to provide process control, is essential.

Ultimately, the goal will be to link process parameters to critical quality attributes. Ideally, integrating analytics from sample collection to analysis and then feeding them back into the process will also be possible in the form of validated plug-and-play solutions. Such an approach would eliminate the need for operators and process engineers to understand the analytical techniques involved.
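The feedback concept described above, where an on-line measurement of a quality attribute is fed back to adjust a process parameter, can be illustrated with a minimal proportional-control sketch. All names, targets, and numbers here are hypothetical; a real PAT integration would run through validated instrument and control-system interfaces rather than ad hoc code.

```python
def adjust_feed_rate(current_rate, measured_attribute,
                     target=0.70, gain=0.5,
                     min_rate=0.1, max_rate=2.0):
    """Proportional correction: nudge a process parameter (here, a feed
    rate) toward the value that keeps a quality attribute on target."""
    error = target - measured_attribute
    new_rate = current_rate + gain * error
    # Clamp to a safe operating range before applying to the process
    return max(min_rate, min(max_rate, new_rate))

# Simulated control steps: the measured attribute starts below target,
# so the controller incrementally raises the feed rate
rate = 1.0
for measured in (0.62, 0.66, 0.69):
    rate = adjust_feed_rate(rate, measured)
    print(f"measured={measured:.2f} -> feed rate={rate:.2f}")
```

The sketch shows only the control arithmetic; the hard part in practice, as the article notes, is the plug-and-play integration of sampling, analysis, and data processing so that this loop can close without operator intervention.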

The first steps in this direction are being taken by analytical instrument companies. The BioAccord™ LC-MS System and the easy-to-use workflows from Waters Corporation represent a significant move in the right direction. The BioAccord System is the first biopharma solution supported by SmartMS™, a comprehensive and intuitive set of features that makes high-quality, sophisticated MS techniques accessible to a broader base of scientists and technicians, enhancing uptime and productivity and accelerating decision-making while reducing training needs. Further improvements in integration and ease of use will continue, and Waters is working closely with customers, particularly those involved in early-phase development work, to customize solutions that meet their needs. Eventually, we expect these solutions to transition from process development to process monitoring for GMP manufacturing as validated analytical technologies.

Further focused collaborations among solutions providers will also accelerate progress toward the goal of achieving fully on-line implementation of complex analytical techniques. Waters recently announced a new collaboration with Sartorius to develop integrated analytical solutions for downstream biomanufacturing, building on a previous joint agreement focused on upstream analytics. Establishing comprehensive software and hardware integrations between Waters’ PATROL™ UltraPerformance Liquid Chromatography (UPLC™) Process Analysis System and Sartorius’ Resolute® BioSMB™ multi-column chromatography platform grants bioprocess engineers access to detailed downstream manufacturing data, which can improve yields while reducing waste and costs.


Mass spectrometry is an attractive analytical technique for process monitoring because it can be used to measure a wide range of product quality attributes (PQAs) and critical process parameters (CPPs). The ability to monitor both types of data makes it possible to understand the product outputs and the process inputs and outputs simultaneously.

Currently, in-line/on-line analytical systems measure only specific aspects of a process, such as dissolved oxygen content or amino acid or glucose concentration. Some of these solutions are well designed and work effectively. The Rebel Analyzer from 908 Devices is a good example: a simple-to-use kit that quickly provides data about amino acid content. Wyatt’s (now Waters) multi-angle light scattering (MALS) solution for downstream aggregate analysis is another. Raman probes and affinity probes have also demonstrated real value in a range of applications, providing specific information quickly and accurately.

Liquid chromatography–mass spectrometry (LC-MS), however, is far more comprehensive. While LC-MS technology can be complex, current commercially available PAT systems, like the BioAccord System, are easy to deploy in bioprocess environments and are simpler and much less expensive than traditional laboratory LC-MS setups. As a result, they enable faster decision-making than can be achieved today using off-line LC-MS methods and represent a far more accessible and usable solution.

Careful thought must be invested in establishing a strategy for incorporating on-line/at-line analytics and advanced off-line techniques, such as LC-MS, into overall workflows to ensure both efficiency and access to all the data necessary for making informed decisions during process development, scale-up, and GMP manufacturing.

Early in development, rapid, high-throughput screening of large numbers of samples is needed. During late-stage development and GMP manufacture, robust, reliable validated methods are crucial. The earlier in development, the more opportunities there are to make changes and adjustments to analytical techniques. Once processes are locked down and analytics have been validated, making changes becomes very difficult.

There is generally a trade-off between the amount of information desired and speed. At the research stage, the goal is to collect as much data as possible, and time is therefore more available for this activity. Analytical techniques used in a GMP production environment must be compliant, robust, easy to use, and as rapid as possible yet provide the data necessary to assess CPPs and PQAs and prevent batch failures. Ideally, they also help increase the sustainability and reduce the cost of bioprocesses.

There are also different needs for upstream and downstream analytics. Upstream, complex mixtures must be analyzed, typically without any cleanup. With MS, these mixtures can lead to ion suppression and interference from the sample matrix. The key, therefore, is to determine what quality of data is needed to allow operators to make informed decisions, rather than looking for the best data quality that is possible.

It is also worth noting that LC-MS solutions may entail higher upfront investment but can replace large numbers of different sensors and in the long-term lead to an overall reduction in cost. Furthermore, rapid analytics today are not sufficiently robust to be used for batch release. On the other hand, the robustness, precision, and accuracy of MS instruments designed for use in bioprocess environments are continuously increasing. And unlike ELISA assays, which provide indirect measurements, MS directly measures the target molecule of interest, which affords both considerably higher accuracy and a higher degree of trust in the results.

As ever more complex analytical techniques are successfully designed for use in the bioprocess environment, they are creating the next challenge to truly leveraging rapid methods: data management. The quantity of data generated today is massive, and each instrument maker and end user manages data in slightly different ways. Key decisions must be made regarding which data are valuable and who will process those data.

For instance, approximately 200 different components in cell media can be monitored in a bioreactor. In any one bioprocess run, 80–100 of those components are typically tracked, with many samples collected daily over multiple weeks. Fully analyzing those data alone could take six months. It is essential to determine what data are needed versus what is merely nice to have — and to focus on the data that pertain to CQAs and CPPs.
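The triage described above, deciding which of the roughly 200 measurable media components merit full analysis, amounts to filtering on relevance to critical quality attributes and critical process parameters. The component names and the relevance mapping below are purely hypothetical, for illustration.

```python
# Hypothetical mapping from media components to the process or quality
# attributes they are known (or suspected) to influence
component_relevance = {
    "glucose": ["cell_growth", "lactate_formation"],
    "glutamine": ["ammonia_formation"],
    "galactose": ["glycosylation_profile"],
    "trace_metal_X": [],   # measurable, but merely nice to have
    "vitamin_Y": [],       # measurable, but merely nice to have
}

# Hypothetical set of attributes designated as critical for this process
critical_attributes = {"glycosylation_profile", "cell_growth"}

def select_for_full_analysis(relevance, critical):
    """Return only the components whose measurements inform a CQA or CPP."""
    return sorted(
        comp for comp, attrs in relevance.items()
        if critical.intersection(attrs)
    )

selected = select_for_full_analysis(component_relevance, critical_attributes)
print(selected)  # only the CQA/CPP-relevant subset gets in-depth analysis
```

However simple, this kind of explicit relevance mapping forces the key decision the article calls for: agreeing up front on which data justify the analysis effort, rather than attempting to process everything the instruments can produce.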

This issue is further complicated by the bottleneck created by the difficulty of finding skilled R&D and manufacturing personnel. There is a shortage of skilled and trained personnel across the biopharmaceutical industry. Increased automation will help address this problem to some degree, but even for highly automated manufacturing plants, skilled operators and engineers with a deep understanding of the processes involved, including analytical experts, are essential to oversee those operations. Thus, the drive is to develop automated analytical LC-MS tools that can be used in a “walk-up” fashion, where the operator does not need to be an expert in mass spectrometry but can still collect advanced and comprehensive MS data rapidly and in a standardized manner.

Waters Corporation recognizes that, for the company to be successful, our customers must be successful. Consequently, the main focus is on providing customer support. That begins with innovating in the analytical space to develop instruments, data processing and management software, and workflows that address their unmet needs, which we determine in close collaboration with our customers, and continues with installation, education, and training assistance to ensure that they maximize the benefits of Waters technologies.

Our bioprocess collaborative program is a prime example. In such collaborations, scientists can gain access to Waters technologies before they buy them to determine if they fit their needs. After a customer purchases a Waters instrument, such as a mass spectrometer or chromatography system, we provide ongoing support to ensure that they are positioned to use those solutions to their full potential. We help them get started using the instrument and are available to provide assistance down the road if needs change. Overall, Waters’ customer success programs add real value beyond the performance of the technologies themselves.

Other collaborative partnerships are focused on identifying new ways in which customers can use Waters technologies to resolve issues they are facing — for instance, looking to see if there is value in using mass spectrometry to solve a specific analytical problem. If there is, then we work closely with the experts dealing with the issue to understand how we can best provide a solution for that challenge.

This approach also helps to address the hesitancy in the biopharmaceutical industry to adopt new technologies, as education is key to increasing awareness and understanding. Closely collaborating with customers facilitates knowledge transfer and helps lead to more user-friendly interfaces that are not intimidating, which reduces the fear and wariness that many engineers and operators not trained in advanced analytical technologies have about using mass spectrometry and other techniques that can, if properly deployed, dramatically improve process performance.

Mass spectrometry is becoming ever more important in the biopharmaceutical industry as the number of cell, gene, mRNA, and other next-generation therapies in the clinic continues to rapidly climb. While there is a lot of knowledge and experience surrounding analytics for recombinant proteins and monoclonal antibodies, those solutions do not necessarily transfer readily to these new modalities. Characterizing large viral vectors and differentiating between full and empty capsids for gene therapies or analyzing very small sample volumes for personalized treatments all require new analytical technologies.

Waters Corporation and other analytical instrument companies are evolving promising new technologies to meet this need. At the same time, Waters continues to focus on making existing technologies, particularly mass spectrometry given the invaluable data it can offer, easier to use for non-experts so that they have more relevance in the GMP production environment and not just in R&D labs. That includes automated sampling systems that maintain an aseptic environment, more user-friendly software, simpler sample preparation protocols, and overall smoother workflows. The goal is to offer comprehensive, effective, single-vendor bioprocess analytics solutions that solve customer problems while also increasing convenience and ease of use.

Originally published on PharmasAlmanac.com on July 10, 2023.