Advancing technologies unlock solutions for biopharmaceutical development. By Cheryl Scott, Senior Technical Editor at BioProcess International
Quality may have been in the name of only half of a single track this year, but judging by the most-watched presentations at the first virtual BPI Conference and Exhibition, it was at the top of nearly everyone’s mind. In particular, attendees were interested in how modern information and analytical technologies can help their companies deliver on the quality–safety–efficacy promise that good manufacturing practice (GMP) regulations require. Cheryl Scott, cofounder and senior technical editor of BioProcess International magazine, explores some of the takeaways from key sessions.
That focus was perhaps most evident in well-received keynote addresses by Anthony Mire-Sluis (head of global quality at AstraZeneca), Jeff Baker (deputy director of the Office of Biotechnology Products in the US FDA’s Center for Drug Evaluation and Research), and Nadine Ritter (president and analytical advisor at Global Biotech Experts, LLC).
Mire-Sluis gave a visionary presentation. In “Digitally Uplifting Quality to Advance Product Supply and Create Competitive Advantage,” he described a biopharmaceutical development “laboratory of the future” with advanced analytics and mixed-reality technology, fully digital laboratory execution and documentation systems, standardized automation and “internet of things” connections, and even simple handheld near-infrared (NIR) devices for identifying raw materials and final drug products. He envisioned the use of artificial intelligence in data mining and trending as well as standard operating procedures (SOPs) continually updated in real time and accessed by tablet rather than printed and circulated on paper.
Mire-Sluis highlighted some specific evolutions in quality/analytics, as companies have transitioned from classic approaches to faster, more accurate methods that qualify as process analytical technologies (PAT). For example, wet chemistry assays once were the norm for raw-material identification/qualification. Now companies have the option of handheld Raman spectroscopy for nearly instant results. Similarly, rapid and accurate microbial detection and identification are possible using matrix-assisted laser-desorption ionization mass spectrometry (MALDI-MS) instead of the more basic fluorescence, turbidity, and colorimetric results offered by first-generation optical systems. And slope spectroscopy instruments familiar to readers of BioProcess International (1–3) have supplanted the traditional cuvette-based means of protein-concentration measurement.
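For readers curious about the arithmetic behind that last technique: slope spectroscopy exploits the Beer–Lambert law (A = εlc) by measuring absorbance at several pathlengths, so the slope of a linear fit equals ε × c and concentration follows without dilution. Below is a minimal sketch of that calculation; the pathlengths, absorbance readings, and extinction coefficient are illustrative values only, not data from any particular instrument.

```python
# Minimal sketch of the slope-spectroscopy calculation: fit absorbance
# measured at several pathlengths and convert the slope to concentration
# via the Beer-Lambert law (A = epsilon * l * c, so slope = epsilon * c).
# All values below are illustrative, not data from any cited instrument.
import numpy as np

pathlengths_cm = np.array([0.005, 0.010, 0.015, 0.020])  # variable pathlengths (cm)
absorbances = np.array([0.071, 0.142, 0.214, 0.285])     # A280 readings (AU)

slope, intercept = np.polyfit(pathlengths_cm, absorbances, 1)  # AU per cm

extinction = 1.4  # (mg/mL)^-1 cm^-1, a typical A280 value for an IgG
concentration = slope / extinction  # mg/mL

print(f"Slope: {slope:.2f} AU/cm -> concentration ~{concentration:.1f} mg/mL")
```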
Finally, Mire-Sluis quoted the 19th-century English art critic, philosopher, and philanthropist John Ruskin: “Quality is never an accident. It is always the result of intelligent effort.” Modern information technology can enable biopharmaceutical companies to apply that tenet to the realization of their drug products and manufacturing processes.
That philosophy also was reflected in both of Baker’s well-attended presentations: “Discuss, Deploy, or Defer: New Technologies in Real-World Biopharmaceutical Manufacturing” and “Specifying the Spokes and Handicapping the Hub: Management of Residual Uncertainty in Biopharmaceutical Development and Manufacturing.” BPI Conference attendees are often eager to find out what FDA speakers have to say, and Baker made it clear early on that he wasn’t there simply to read aloud passages from regulatory documents anyone can find online. He wanted to talk about risk management and how advanced technologies can help companies do it.
“FDA has encouraged deployment of advanced manufacturing tech for 20 years,” he reminded us, highlighting a number of programs and partnerships involving organizations such as the National Institute for Innovation in Manufacturing Biopharmaceuticals (NIIMBL). It’s all part of the agency’s 21st-century focus on management of residual uncertainty through probabilized outcomes and totality of evidence in drug development.
There’s a difference, Baker pointed out, between avoiding uncertainty and managing it. We must accept that uncertainty is neither good nor bad, but simply an attribute of biological systems to be understood as best we can. He encouraged attendees to differentiate between mere data analysis and real knowledge management, between mere statistical calculation and statistical thinking. “Make clinical relevance central to risk assessment. Optimize value rather than minimizing cost. Prioritize value understanding over specification.”
When companies consider a new technology and whether it should be discussed, deployed, or deferred, Baker said, “it depends on what risks you’re managing.”
As to what regulators are looking for in their inspections and reviews, he emphasized the totality of evidence approach. “There is no simple checklist. It’s not simply red–yellow–green, score these things, and if they add up to less than 27 then you’re good.” Managing residual uncertainty requires true risk assessment. Developers must identify relevant risks, assess their likelihood of occurring, and identify necessary activities for mitigation — all based on multiple sources of information. “You need to make a sound scientific argument based on effective engineering practices that (a) you know what you’re doing and (b) the outcomes you want are very likely.”
Baker brought up fingerprint analysis, often imagined in TV crime dramas as an exact overlay of identical images. In reality, however, forensic analysts compare a number of unique points on different marks, and if all those align, then the prints are deemed to be from the same person. “This is what we do in biotech,” he explained. Specifications, stability testing, qualification, failure modes and effects analysis (FMEA), comparability, and continuous improvement all measure a number of aspects and compare them using risk management to determine whether or not results match. “Fingerprint-like biosimilarity” is better for saying something is not the same than for saying it is so, Baker emphasized. He hammered his points home using quotes from sources as far-ranging as Bruce Springsteen and Lao Tzu — all in support of the holistic approach to both risk and quality based on solid science.
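FMEA deserves a brief aside for readers who haven’t run one. In its conventional form, each failure mode is scored for severity, occurrence, and detectability, and the product of those scores is a risk priority number (RPN) used to rank where mitigation effort should go. The sketch below illustrates that arithmetic with hypothetical failure modes and scores; it is worth repeating Baker’s caution that such numbers are a ranking aid for focusing scientific assessment, not a pass/fail checklist.

```python
# A minimal sketch of conventional FMEA arithmetic: each failure mode is
# scored 1-10 for severity, occurrence, and detectability, and the product
# is a risk priority number (RPN) used to rank mitigation priorities.
# The failure modes and scores here are hypothetical illustrations, and,
# per Baker's point, an RPN is a ranking aid, not a pass/fail checklist.
failure_modes = [
    # (description,              severity, occurrence, detectability)
    ("Filter integrity breach",  9,        2,          3),
    ("Buffer pH excursion",      5,        4,          2),
    ("Sensor calibration drift", 6,        3,          7),
]

for name, sev, occ, det in sorted(
    failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3], reverse=True
):
    rpn = sev * occ * det
    print(f"{name:28s} RPN = {sev} x {occ} x {det} = {rpn}")
```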
BPI editorial advisor Nadine Ritter got into the nuts and bolts of it in “CMC Data Integrity and Quality Practices in Process and Analytical R&D Lab: Risk and Benefits.” She explained how data system design and controls should enable easy detection of errors, omissions, and aberrant results throughout a product lifecycle. Ritter spoke of the “sliding scale” of phase-appropriate GMP compliance — and what kinds of studies can and can’t be performed in non-GMP laboratories during drug development. And she helped attendees differentiate between GMPs and good laboratory practices (GLPs): far from being merely a “subset of GMP,” GLPs apply specifically to all nonclinical studies related to product safety, whereas GMPs govern the manufacturing, control, and testing of materials that will be used in humans. Meanwhile, 21 CFR Part 11 applies to data generated in all laboratories supporting drug development.
Ritter highlighted three pillars of “good R&D quality practices” for information technology: quality (ensuring that data generated are technically appropriate and scientifically meaningful), integrity (ensuring that documentation is clear and complete and that it adequately represents the activities that were conducted), and safety (ensuring that all documentation is traceable to its source, retrievable on demand, and secure against loss or damage).
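What might a control supporting that third pillar look like in practice? One common, generic approach is to compute a cryptographic checksum for each raw data file and record it with a timestamp, so that later alteration or loss becomes detectable. The sketch below illustrates the idea; it is a simplified illustration, not a description of any validated 21 CFR Part 11 system.

```python
# A minimal sketch of one common data-safety control: computing a SHA-256
# checksum for each raw data file and recording it with a timestamp, so
# that later alteration or loss is detectable. This is a generic
# illustration, not a description of any specific 21 CFR Part 11 system.
import hashlib
from datetime import datetime, timezone
from pathlib import Path

def log_checksum(path: Path, logfile: Path) -> str:
    """Hash a data file and append (timestamp, name, digest) to an audit log."""
    digest = hashlib.sha256(path.read_bytes()).hexdigest()
    stamp = datetime.now(timezone.utc).isoformat()
    with logfile.open("a") as log:
        log.write(f"{stamp}\t{path.name}\t{digest}\n")
    return digest

# Example: register a (hypothetical) chromatogram export in the audit log.
# log_checksum(Path("chromatogram_20xx.csv"), Path("integrity_log.tsv"))
```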
Many other speakers presented results from their own experiences with new and emerging technologies. Here, too, product quality was often the focus. For example, Yelena Ilin (Sanofi) detailed a “Mechanistic Investigation and Characterization of Product Quality Variability in a Cell Culture Process,” in which her team used a cell-free system to investigate the formation of low–molecular-weight product variants in a production culture. Using that tool to mimic those aberrant results, Sanofi researchers could study the specific factors involved without the confounding influence of cell growth or productivity.
In “High-Resolution Optical and Dielectric Methods for Monitoring Cells in Bioprocesses,” Michael Butler (principal investigator in cell technology at NIBRT in Ireland) highlighted new PAT technologies for upstream process monitoring and control. “Optical and capacitance methods are robust, inline measurements of cell viability and growth,” he said. Cell deviations of different types occur during loss of viability, from early apoptosis often detected by an enzyme-linked immunosorbent assay (ELISA) to late stages involving membrane damage identified by trypan blue dye exclusion. Upstream process engineers know that early detection is always best, and the new instruments measure early-stage apoptosis. Finally, Butler touted dielectrophoresis (DEP) cytometry as a markerless, electronic single-cell detector that can identify subpopulations of cells during apoptosis events.
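The physics behind those capacitance probes rewards a quick look. Intact cell membranes polarize in an alternating field, and under textbook assumptions (spherical cells of uniform radius and membrane capacitance), the permittivity increment scales as Δε ≈ 3πr⁴NCm, so viable cell density N can be estimated from the probe signal; cells with compromised membranes drop out of the measurement, which is why the signal tracks viability. The sketch below works through that conversion with typical assumed values; real instruments are calibrated against offline cell counts.

```python
# Minimal sketch of the first-order dielectric relation behind capacitance
# probes: the permittivity increment of a suspension of intact spherical
# cells is roughly delta_eps = 3 * pi * r^4 * N * Cm, so viable cell
# density N can be back-calculated from the probe signal. The radius and
# membrane capacitance below are typical assumed values, not measured ones.
import math

delta_eps = 10e-12 / 1e-2   # probe reading: 10 pF/cm converted to F/m
radius = 7.5e-6             # assumed mean cell radius (m), ~15 um diameter
cm = 1e-2                   # assumed membrane capacitance: 1 uF/cm^2 in F/m^2

cells_per_m3 = delta_eps / (3 * math.pi * radius**4 * cm)
cells_per_ml = cells_per_m3 * 1e-6

print(f"Estimated viable cell density: {cells_per_ml:.2e} cells/mL")
```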
Stepping back from that microscopic view, David Lee (upstream cell culture process research associate at Seattle Genetics) described an “Evolving Cell Culture Platform to Address Amino-Acid Misincorporations for Fed-Batch Processes.” His team observed amino-acid misincorporation in data from new products made using an established platform. The group discovered that the problem was attributable to increases in growth and product titer, which caused cells to “starve” as nutrients depleted too quickly in culture. So SeaGen modified its platform process to maintain acceptable product quality. Development of a platform based on chemically defined media provided a long-term solution: “We are using continuous improvement of the cell culture process to ensure that the platform is robust for different phenotypes.”
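How does such misincorporation show up in the data? Typically as small, characteristic mass shifts in LC–MS peptide maps: each substitution changes a peptide’s mass by the difference between the two monoisotopic residue masses, and the roughly −27.011-Da asparagine-to-serine shift is a well-documented signature of asparagine depletion. The sketch below illustrates that matching logic; the observed shift and tolerance are hypothetical, although the residue masses are standard values.

```python
# Minimal sketch of how amino-acid misincorporations show up in LC-MS
# peptide mapping: each substitution produces a characteristic mass shift
# equal to the difference in monoisotopic residue masses. The Asn->Ser
# shift of about -27.011 Da is a documented signature of asparagine
# depletion; the observed delta and tolerance below are hypothetical.
RESIDUE_MASS = {  # standard monoisotopic residue masses (Da)
    "G": 57.02146, "A": 71.03711, "S": 87.03203, "N": 114.04293,
    "D": 115.02694, "Q": 128.05858, "E": 129.04259, "K": 128.09496,
}

def substitution_delta(original: str, substitute: str) -> float:
    """Mass shift (Da) when one residue is misincorporated for another."""
    return RESIDUE_MASS[substitute] - RESIDUE_MASS[original]

observed_shift = -27.010  # hypothetical delta from a peptide map (Da)
tolerance = 0.01          # matching tolerance (Da)

delta = substitution_delta("N", "S")
if abs(observed_shift - delta) <= tolerance:
    print(f"Shift of {observed_shift} Da matches Asn->Ser ({delta:+.5f} Da)")
```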
On the downstream side, Brian Murray (biologics purification development scientist at Sanofi) presented a “Fully Automated Platform Approach to First-in-Human Purification Development: MAbs and Beyond.” His company used automation to intensify its platform approach to purification development, cutting hands-on time to less than a day. The group plans to work on creating flexible tools for platform experiment execution with integrated analytics to measure, for example, host-cell proteins by surface plasmon resonance and product variants by size-exclusion chromatography.
David Kahn (vice president of biopharmaceutical development at MacroGenics, Inc.) encouraged attendees to consider following the example of the “Use of Developability Assessments at a Midsize Biotechnology Company.” Although “R&D” often is spoken of in one breath, he described this product-focused work as “a critical activity occurring at the interface of research and development.” “These assessments minimize the risks of poor performance in the clinic and increased costs and complexity in the production of viable clinical/commercial drugs,” Kahn explained. “MacroGenics has prioritized those assessments of highest value and continues to consider adding new technologies/capabilities to our toolbox.”
Such assessments are important when it comes to processes as well as products, as Andrew Sinclair (managing director of Biopharm Services) showed in “Maximizing the Impact of Process Intensification: Assessing Implications of Scale and Technologies.” To illustrate his message, he focused on a continuous downstream processing operation. Using proprietary analysis software, he showed how reducing the number of batches run per week can lower cost of goods (CoG). Those benefits increase with scale: maximizing chromatography-resin capacity while minimizing consumables use (and the number of changeovers), and reducing process complexity by simplifying and integrating technologies. But he cautioned viewers that there is “no single simple solution.” Continuous processes must be optimized like any other operations.
“Economic analysis provides insights to help you find the best technology options and configurations and rationale in support of decisions you make,” Sinclair concluded. “This helps you focus limited resources to maximize value by setting baselines and providing a way to measure progress.”
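Sinclair’s batch-count argument is easy to see in a back-of-envelope model: per-batch fixed costs such as changeovers and lot release are amortized over more product when fewer, larger batches are run. The sketch below illustrates that relationship with made-up figures; it is not an output of, nor a substitute for, the kind of proprietary analysis he described.

```python
# A back-of-envelope sketch of the scale argument: per-batch fixed costs
# (changeover, QC release) are amortized over more product when fewer,
# larger batches are run, lowering cost of goods. All figures below are
# made-up illustrations, not outputs of any proprietary analysis tool.
def cost_per_gram(batches_per_week: int, weekly_output_g: float,
                  fixed_cost_per_batch: float, variable_cost_per_g: float) -> float:
    fixed = batches_per_week * fixed_cost_per_batch
    return (fixed + variable_cost_per_g * weekly_output_g) / weekly_output_g

for n_batches in (10, 5, 2):
    cog = cost_per_gram(n_batches, weekly_output_g=2000.0,
                        fixed_cost_per_batch=50_000.0, variable_cost_per_g=120.0)
    print(f"{n_batches:2d} batches/week -> CoG ~${cog:,.0f}/g")
```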
The pandemic has forced people in many industries to jump quickly into using new technologies. With travel restricted, virtual meetings and events have become the new norm around the world. Many of us have discovered the value of on-demand review capabilities after events such as the BPI Conference for brushing up on topics and checking details we may have missed. And even though we all miss the fun of live events, we are making the most of what modern information technology offers to help us do our work — not to mention visit with family and friends we otherwise might never have seen at all this year!
Meanwhile, emerging options in PAT, automation, and data management are changing how work is done in bioprocess development and biotherapeutics manufacturing. Interest in information technologies and other advances ran high among those in attendance this year, with the increasing focus on speeding products through development providing the impetus — whether those candidates are intended to address the pandemic or not. We know what regulators want: quality, safety, and efficacy of all biopharmaceuticals for market. The details of how best to achieve those goals as quickly as possible — and without compromise — are increasingly digital. But biological principles still reign.