Production inefficiencies, faulty products, and costly machine maintenance are just a few of the many issues that are hard to manage with a legacy approach to pharmaceutical manufacturing. A combination of human know-how with artificial intelligence (AI), big data technologies, and advanced digital infrastructures (e.g., cloud systems and analytics-as-a-service), better known as Pharma 4.0, is transforming the entire Life Sciences sector. Pharma 4.0 is the new manufacturing era, focused on delivering therapeutics in a digital-first environment.
To better understand what technologies and processes are driving the transition to next-generation manufacturing in pharma and biotech, we sat down with Rajiv Anand, founder and CEO of Quartic.ai (www.quartic.ai), to discuss the past, present, and a bit of the future of Pharma 4.0.
Quartic.ai is a developer of an enterprise-scale AI platform – the Quartic Platform™ – that makes autonomous manufacturing a reality by enabling comprehensive real-time insights into the entire manufacturing process.
Andrii: Can you share your journey into industrial life sciences?
Rajiv: I have been engaged in the control and automation of process manufacturing, including life sciences manufacturing, for over 35 years. I have seen the industry evolve from basic instrumentation and control to sophisticated batch execution, advanced process controls, MES systems, SPC, informatics, and manual data analytics.
While there are still many instances of manual workflows and paper-based records, the industry has become “data-rich” over the past two decades. This provides an excellent opportunity to advance informatics and improve every stage and aspect of industrial life sciences for the benefit of the patient. But the analysis of this data to create meaningful, valuable intelligence has largely remained human: even when sophisticated and powerful analysis tools are used, the analysis is still performed by people. Analysis workflows are often repeated for the same problems, and the analysis usually has a narrow perspective, such as a single process unit at a time. Almost all analysis is also retrospective: we learn things after the fact and then look for corrections and improvements in the future. Informatics and analysis are, therefore, highly inefficient and of little use in real time.
With the increasing move towards biologics and biomanufacturing, and as molecules become more complex, so does the complexity of the multivariate relationships in the data related to these molecules. Sophisticated analytical instrumentation, such as spectroscopy and near-infrared analyzers, is far more applicable and starting to become pervasive.
Creating intelligence from this increasingly complex data with a “zoomed-out” view of the entire process, in a real-time and predictive manner, so that the process is auto-correcting all the time, is what’s required to move forward. But legacy technology has hit its limits in accomplishing this.
Overcoming these limitations was my motivation behind starting the Quartic.ai journey, and building systems that overcome these limitations is the most foundational aspect of “Pharmaceutical Industry 4.0.” The journey has been exciting, but the future is exponentially more exciting.
Andrii: They call it “pharmaceutical industry 4.0” or, more broadly, Industry 4.0. In simple terms, it is about automating industrial lines and processes, primarily using innovative technologies like IoT and artificial intelligence. But can you share a bit more about what is under the hood? What is Pharma 4.0, and where are we now?
Rajiv: We have been automating industrial lines and processes for over five decades using industrial control systems like PLCs and Distributed Control Systems (DCS), also referred to as Operational Technology or OT. You could call that Industry 3.0. Industry 4.0, or Pharma 4.0, is more about achieving autonomous manufacturing. And while Industry 3.0 (automation) was about automating manual tasks, Industry 4.0 is about using AI and machine learning to automate cognitive tasks that are still mainly performed by humans today.
Automation with OT serves a particular purpose and has allowed us to produce better drug substances more efficiently.
IoT is receiving much attention and hype. From an industrial automation perspective, I don’t consider it the Industry 4.0 driver, even though it is an enabler. From an Industry 4.0 perspective, IoT is really the OT system that has existed all along. Of course, we can now add many more sensors for additional measurements, because IoT innovation in other sectors is making sensing more affordable.
What is under the hood is machine learning, a subset of AI. Machine learning has become the key tool for automating cognitive tasks for Industry 4.0. In simple terms, with machine learning we build models, or intelligent agents, that perform the multivariate analysis automatically and provide the results in real time so that actions can be taken from them. We then take a further step by predicting outcomes with these models. If we know that the predicted outcome is undesirable in terms of yield or quality, and we know which factors are causing that, we can take corrective action proactively to avoid a less-than-ideal outcome.
We then take the ultimate step. Rather than predicting outcomes and taking proactive action, we optimize the outcome directly. In the process of continually trying to optimize the outcome, we are back-calculating (to use that term a bit loosely) the factors associated with an optimal outcome. That is the autonomous system that defines Industry 4.0. The word autonomous still generates a lot of apprehension, but it really should not.
Alternatively, we can call it a continual optimization system!
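The predict-then-correct loop Rajiv describes can be sketched with a deliberately simple toy model. This is an illustration only, not Quartic’s actual algorithms: the linear yield model, its coefficients, and the variable names are all made up for the example. The point is the pattern, which is to predict the outcome, and if it falls short of the target, back-calculate the input change that would fix it.

```python
# Illustrative sketch (hypothetical model, not Quartic's): a toy linear
# "soft sensor" predicts batch yield from two process variables, and a
# proactive step back-calculates the temperature needed for a target yield.

def predict_yield(temp_c: float, ph: float) -> float:
    """Hypothetical fitted model: yield (%) as a linear function of temp and pH."""
    return 40.0 + 0.8 * (temp_c - 30.0) - 5.0 * abs(ph - 7.0)

def corrective_temp(ph: float, target_yield: float) -> float:
    """Invert the linear model above: the temperature that brings the
    predicted yield to target, holding pH fixed."""
    # target = 40 + 0.8*(T - 30) - 5*|ph - 7|  =>  solve for T
    return 30.0 + (target_yield - 40.0 + 5.0 * abs(ph - 7.0)) / 0.8

predicted = predict_yield(temp_c=31.0, ph=6.8)
if predicted < 45.0:                          # predicted outcome is undesirable
    new_temp = corrective_temp(ph=6.8, target_yield=45.0)
    # act proactively: raise the setpoint before the batch ends off-spec
```

Running the optimization “ultimate step” continuously amounts to repeating this back-calculation every cycle, rather than only when a prediction looks bad.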
In essence, we are just building a next-generation OT system – it combines elements of legacy OT and IT and adds AI. I prefer to call it an Operations Control System. We have had Process Control Systems (OT) and Manufacturing Operations Management Systems (MoM – using Batch, MES systems, informatics, and analytics). When we use OT systems and add cognitive automation to operations management, we end up with an Operations Control System.
In simple terms: Process Control Systems (OT) control process variables; an Operations Control System with automated cognitive functions controls operations outcomes. That is Industry 4.0 or Pharma 4.0 in a nutshell.
Of course, it takes a lot more under the hood to build such a system so that it functions securely, reliably, and deterministically. While AI and machine learning provide the intelligence, an entirely new operational data system must be built. This data system must be capable of “feeding” machine learning algorithms instead of the human eyeballs that have traditionally been the consumers of data for analytics. It must also be able to consume the output of machine learning models and distribute the intelligence to functions and applications on the manufacturing floor and in labs (on-premises) as well as in the cloud. And for regulated industries like life sciences, it must also meet GxP, CSV (Computer System Validation), and the electronic record management requirements of 21 CFR Part 11.
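The data-system pattern described here, where readings are enriched with context, fed to a model rather than to human eyes, and the resulting intelligence fanned out to consumers, can be sketched minimally. All names here (tags, products, batch IDs) are invented for illustration; a real system would source this context from MES/batch systems.

```python
# Minimal sketch (assumed names, stdlib only) of the operational data
# pattern above: raw sensor readings are enriched with equipment/product
# context before reaching a model, and the model's output is distributed
# to any number of on-premises or cloud consumers.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Reading:
    tag: str          # OT tag, e.g. "BR-101.TEMP"
    value: float

# Context an MES/batch system would supply (hypothetical values)
CONTEXT = {"BR-101": {"product": "mAb-X", "recipe": "R-42", "batch": "B1003"}}

def contextualize(r: Reading) -> dict:
    """Attach equipment/product/recipe/batch context to a raw reading."""
    equipment = r.tag.split(".")[0]
    return {"tag": r.tag, "value": r.value, **CONTEXT.get(equipment, {})}

def pipeline(readings, model: Callable[[dict], dict], consumers):
    for r in readings:
        event = contextualize(r)      # data and context flow together
        result = model(event)         # feed the ML model, not eyeballs
        for consume in consumers:     # distribute the intelligence
            consume(result)
```

A usage example: `pipeline([Reading("BR-101.TEMP", 37.2)], model=lambda e: {**e, "alarm": e["value"] > 36.0}, consumers=[alerts.append])` would deliver a contextualized, model-scored event to every subscribed consumer.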
Andrii: What does Quartic.ai do? What is your unique niche in the market?
Rajiv: We have built the Operations Control System I described and are deploying it in life sciences manufacturing with GxP and 21 CFR Part 11 compliance. Of course, a system or a platform alone does not deliver value by itself, so we have built very specific applications for the life sciences industry for every step: product development, tech transfer, scale-up, manufacturing, quality, and predictive maintenance/reliability. This allows a very rapid transition to Industry 4.0/Pharma 4.0, creating a flywheel of value and ROI.
Our uniqueness can be described in three key areas:
First, we have built the most cohesive and coherent platform, one that works reliably from day one and scales from a single process unit to multiple sites of an enterprise. There is a plethora of single-point applications and DIY systems in the market, but they neither work nor scale cohesively together, which has kept adoption expensive and slow.
Second, most machine learning is founded on big data, and making those algorithms work effectively in manufacturing applications is often impractical, even impossible. That big data must also have sufficient variance in it for training algorithms. In life sciences in particular, big data from hundreds of thousands of batches of a particular product, with the needed variance, is seldom available. For earlier stages like product development (PD), where the opportunity for value creation is high, data is even scarcer. We have developed specifically sample-efficient algorithms to overcome this challenge. We have also made these models highly explainable, to increase trust and address the validation requirements of GxP.
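As a generic illustration of the small-data discipline this implies (not Quartic’s proprietary algorithms), a closed-form linear model with leave-one-out validation is both sample-efficient and fully explainable: it can be fit on a handful of batches, its single coefficient states exactly how the input drives the outcome, and leave-one-out residuals give an honest error estimate despite the tiny dataset.

```python
# Generic small-data illustration: closed-form one-feature regression
# plus leave-one-out validation, usable with only a handful of batches.

def fit_line(xs, ys):
    """Ordinary least squares for one feature, in closed form.
    Returns (slope, intercept) -- directly interpretable/explainable."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def loo_errors(xs, ys):
    """Leave-one-out residuals: refit with each batch held out in turn,
    an honest error estimate when batches are scarce."""
    errs = []
    for i in range(len(xs)):
        s, b = fit_line(xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:])
        errs.append(ys[i] - (s * xs[i] + b))
    return errs
```

Real sample-efficient methods (Bayesian models, transfer from related products, mechanistic hybrids) are more sophisticated, but the fit-small, validate-honestly, explain-every-coefficient pattern is the same.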
Third, we provide a comprehensive set of ready-to-use applications for the life sciences industry, covering each stage (PD to final release) and aspect (development, quality, yield, equipment reliability) to accelerate deployment. When users start their digital transformation/Pharma 4.0 journeys with traditional hyperscale services or “platforms,” they often find themselves asking, after considerable effort and expense, “What are the use cases?” This is the wrong way to go about it, and it is what leads to frustration and “pilot purgatory.” With our approach, use cases don’t need to be imagined. We know how life sciences products are developed and manufactured, the steps and workflows required, and the problems and bottlenecks at each step. You simply deploy one of the applications for the step or workflow you want to improve and build toward a complete Pharma 4.0 enterprise.
Of course, not everyone is in a position to implement the complete Pharma 4.0 Operations Control System I have described, particularly with legacy manufacturing in the portfolio. We allow customers to digitalize specific applications, processes, or workflows. With the big picture of Pharma 4.0 built into our system, customers not only achieve accelerated ROI but also ensure that the investments they make become the building blocks of a complete Operations Control System for the future.
Andrii: It seems like AI is the central technology in your company’s product offering, and in the whole of pharma 4.0, for that matter. Can you outline the significant advances in the tech field that made the digitalization of pharmaceutical manufacturing come of age?
Rajiv: Machine learning is indeed the central technology, but two other components are the backbone of building and consuming the applications that make Pharma 4.0 possible. The first is the operational streaming data system I described earlier: data from OT, MES, LIMS, and ERP systems must be prepared in the context of the equipment, product, recipe, and materials, and the data and its context must flow together in the pipeline that feeds the machine learning algorithms (DataOps) and consumes their output. The second is event-driven distributed computing, which combines machine learning models (MLOps), complex event processing (CEP), and microservices. You must then integrate all of this to work cohesively and reliably at scale, which requires advanced DevOps. So, when you look at a complete Pharma 4.0 system, AI and machine learning are a small percentage of the technology stack.
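The complex event processing (CEP) component mentioned here can be illustrated with a toy example (the event names and thresholds are invented): instead of reacting to single readings, a CEP engine watches the stream for a pattern, here three consecutive rising temperatures, and emits a higher-level event when the pattern matches.

```python
# Toy CEP sketch (illustrative names): raise a higher-level event when a
# pattern -- three consecutive rising readings -- appears in the stream.
from collections import deque

def detect_rising(stream, window=3):
    """Yield an alert event whenever `window` consecutive readings rise."""
    recent = deque(maxlen=window)
    for value in stream:
        recent.append(value)
        vals = list(recent)
        if len(vals) == window and all(a < b for a, b in zip(vals, vals[1:])):
            yield {"event": "TEMP_RISING", "values": vals}

alerts = list(detect_rising([36.0, 36.2, 36.1, 36.4, 36.9]))
# only the final window [36.1, 36.4, 36.9] is strictly rising
```

Production CEP engines add time windows, joins across streams, and backpressure handling, but the idea of pattern over events rather than value over threshold is the same.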
In terms of technological advances that have made this digitalization come of age, they are distributed computing, MLOps, microservices, and DevOps. Because these functions are required in almost all AI applications at scale, these technologies are advancing exponentially. This is of great benefit for pharma 4.0 because manufacturing and pharma manufacturing traditionally lag in technology adoption.
Andrii: What is the future of intelligent pharmaceutical manufacturing? Let’s say, the next three years? Next ten years? What will, most likely, emerge in the field that is not available today?
Rajiv: In the near term, three to five years, we should expect product and process development workflows to be automated and digitalized. Tech transfers will also become much more digital. This is driven partly by the greater number of substances being developed, the need for speed to market, and the increasing shift to contract manufacturing. In these areas, necessity is driving the speed of adoption. Most multivariate data analytics (MVDA) systems currently in use are long in the tooth. The growing need for MVDA and the lack of resources are increasing the pressure to automate these analytics with AI. We should see this automation and its adoption become pervasive in the next three to five years.
On a 10-year horizon, we should expect some impactful developments in the use of autonomous systems. With the combination of PAT, small-data AI, and the increased robustness of protocols like OPC UA, we are ready today for closed-loop control. This will unleash widespread use of autonomous bioreactors and many downstream processes as well. The other much-needed technology that has been “stranded” for several years is continuous manufacturing, particularly continuous biomanufacturing. I believe AI and Pharma 4.0 systems will make it not only possible but ubiquitous.
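Closed-loop control, at its simplest, is the read-compare-act cycle an autonomous bioreactor would run. The sketch below is a deliberately simplified proportional feedback loop, not a validated control design and not tied to any real OPC UA stack; the gain, cycle count, and setpoint are arbitrary illustration values.

```python
# Simplified closed-loop sketch (illustrative, not a validated design):
# each cycle reads the process value, compares it to a setpoint chosen
# by a model, and nudges the process toward it -- the read-predict-act
# loop at the heart of autonomous control.

def closed_loop(pv: float, setpoint: float, gain: float = 0.5, cycles: int = 50) -> float:
    """Proportional feedback: repeatedly move the process value toward setpoint."""
    for _ in range(cycles):
        error = setpoint - pv     # read and compare
        pv += gain * error        # act: adjust the manipulated variable
    return pv

final = closed_loop(pv=30.0, setpoint=37.0)   # converges toward 37.0
```

In a full autonomous system the fixed setpoint would itself be recomputed continually by the optimization models described earlier, closing the loop around the outcome rather than a single variable.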
Given how rapidly technology is evolving, it isn’t easy to predict what will emerge that is not available today. Still, causality-based algorithms are one key area that I believe will be a game-changer.