
Charles E (Chuck) Miller:

Process Analytical Technology (PAT) in Pharma

In a series of blogs, I will provide a high-level retrospective look at Process Analytics, and the wide array of value propositions that it provides for various manufacturing industries. I will offer some experiences from various on-line applications that span several industries and in some cases have been in service for several decades. It is safe to say that some of these value propositions were expected, while several of them were unexpected. And finally, realizing that Process Analytics is not without cost or risk, these same experiences will be used to offer some “Best Practices” for Process Analytics – covering seven distinct categories.

In my previous blog, I covered the history of PAT and its key enabling technology of multivariate analysis (MVA). Today’s discussion will focus on the introduction of PAT to the pharmaceutical industry, and some of the challenges that PAT faces in that industry.

In 2003, a review of the state of manufacturing in the pharmaceutical industry published in the Wall Street Journal [1] came out swinging with the very first sentence:

“The pharmaceutical industry has a little secret: Even as it invents futuristic new drugs, its manufacturing techniques lag far behind those of potato-chip and laundry-soap makers.”

Ouch!

Needless to say, this report laid bare all the various ways that medicine manufacturing lagged behind other industries, citing the rising number of medicine recalls, inefficient supply chain management, and reluctance to utilize new manufacturing and quality assurance technologies. Not coincidentally, in the following year, the US FDA issued its PAT Guidance [2], as well as its Risk-Based Approach to cGMPs report [3], which in essence laid the regulatory groundwork for encouraging increased usage of PAT in the pharmaceutical industry. Since then, it could be said that PAT development and usage in the non-pharma and pharma worlds followed quite different trajectories:

  • For most non-pharma industries, PAT development and practice continued along the existing trajectory, towards more mature technologies for all key PAT attributes, including robust and representative sampling interfaces, robust instrument systems, calibration model development and management, and data handling and management systems.
  • Meanwhile, in the pharma industry, PAT started on a much more gradual and deliberate journey from an obscure “niche” interest to some limited applications – in some cases borrowing and adapting advances that were driven by the other industries.

Several reasons for the slow acceptance of PAT in pharma have been offered, with stricter regulatory requirements and oversight often cited. It is true that PAT was essentially a “new” technology for the pharma industry in 2004, and it is fair to expect any “new” technology to encounter skepticism and caution. However, I think that it is disingenuous and unproductive to assign regulatory risk as the only reason, or even the main reason, for PAT’s lack of acceptance in pharma, due to the obvious irony: It was, after all, a regulatory body (US FDA) that had coined the acronym “PAT” and encouraged its use in the first place! Instead, it is my belief that a combination of factors contributed to this situation, many of which are fully controllable by the pharmaceutical companies themselves. The factors I would like to cite are provided below:

  • Expected Challenges:
    • Organizational Inertia: Years of operating without the use of process analytics likely produced well-established organizations with deeply entrenched workflows. In this scenario, PAT is often perceived as a disruptive force.
    • Misconceptions: To newcomers, the math and science of PAT and MVA can be somewhat challenging to understand. To many, this prompts the desire for more understanding, but to some it prompts suspicion and avoidance.
    • The “Disagreement Narrative”: In most new PAT applications, each of the PAT outputs has at least one legacy sensor or off-line measurement that generates the same nominal output. Although the addition of a PAT method presents an opportunity for improved process and product characterization through redundant measurements, many in the pharma industry actually view this as a risk factor, citing potential disagreement of the PAT method results with the legacy methods as a reason to forgo PAT altogether.
    • Operational Concerns: If, during GMP manufacture, the PAT system encounters utility issues and needs to be taken out of service for some time, one must have a contingency plan to avoid supply disruption during these times. In the worst case, if PAT utility and performance issues become chronic over time, what are the regulatory and operational consequences of taking the system out of service for repair?
    • Organizational Challenges: From an organizational perspective, PAT is a rather unique function, in that it requires significant collaboration with many other functions. These include R&D, Analytical, Engineering, Operations, IT, Automation, and Quality Assurance. Where does the PAT function belong in the organization? In R&D? In Manufacturing? In Analytical? In Engineering? In Central Services? This might seem trivial, but not having a strong “organizational home” can greatly hinder any function’s mission and influence in a company – and PAT is no exception to this rule.
  • “Elephants in the Room”:
    • Technical Chauvinism: From my personal experience, I’ve seen this reflected in many forms, including the following:
      • Over-confidence in engineering controls and mechanistic process understanding, combined with under-confidence in PAT capabilities,
      • Proposed higher validation burden for PAT methods, compared to legacy laboratory analytical methods, given the same method impact,
      • Proposed lower validation burden for engineering-based mechanistic models compared to more empirical PAT calibration models,
      • Proposed higher scrutiny of PAT data handling/analytics systems versus legacy process historians and automation systems
    • Regulatory Fear: To a pharmaceutical company, having a New Drug Application (NDA) dossier rejected is a serious business setback, for many reasons. This, when combined with a concern of regulatory overreach (whether fair or unfair), often leads to a very conservative approach in preparing NDAs. In this context, some see PAT as a risky addition to the dossier, in that it is a relatively “new” technology that might draw extra attention and scrutiny.

For those of you who think I am being too harsh in painting the above picture, I offer the following: In 2009, the US FDA formally advocated Juran’s principles of Quality by Design (QbD) [4] for pharmaceutical development [5], in which PAT was prominently cited as a key component of the QbD approach to process and product development. Ten years later, the general consensus is that PAT has only seen limited use in development, and many in the industry still argue that it should not even be considered for use in manufacturing. This position is, of course, counter to longstanding practice in other industries, as described previously. There certainly are risks and costs associated with PAT when used in commercial manufacturing, and these will be discussed in detail later. However, these other industries have also known that PAT’s value does not need to end at product launch, and in fact it has the unique ability to “lock in” the enhanced process understanding that had been obtained during development [6].

Despite the challenges mentioned above, there have been several valuable PAT applications in pharma development, and there are several notable PAT applications in pharmaceutical manufacturing, including a Real Time Release Testing (RTRT) method that has been in service since 2006 [7]. Furthermore, the more recent emergence of Continuous Manufacturing (CM) in the pharmaceutical industry is expected to drive many more valuable PAT use cases in the coming years, as is evidenced by the first successful filing of a CM-manufactured product in July 2015 [8].

Finally, to complete this post on a more optimistic note, I would like to counter the list of PAT challenges above with the following list of mitigating factors.

Training: PAT and MVA training courses are a very effective means of improving working knowledge, familiarity and comfort level with these technologies. Considering the diverse organizational landscape of PAT cited above, it is important to note that the objective of such training need not be to become another PAT or MVA expert: it can be just as effective to gain sufficient familiarity with the technologies in order to improve interactions within your own project teams.

Equivalence Testing: The potential disagreement of two analytical methods is hardly a novel concept, and it certainly shouldn’t be grounds for PAT rejection. There are long-standing statistical tools that can be easily applied to test for method equivalence – both during initial method validation and during ongoing method verification.
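
As a rough illustration of such a tool, the sketch below applies a two one-sided tests (TOST) equivalence check to paired PAT and legacy results in Python. The data, the 2% equivalence margin, and the function name are hypothetical placeholders chosen for illustration, not part of any particular validation protocol.

```python
import numpy as np
from scipy import stats

def tost_paired(pat, ref, delta):
    """Two one-sided tests (TOST) for equivalence of paired method results.

    pat, ref : paired measurements from the PAT and legacy methods
    delta    : pre-defined equivalence margin (same units as the results)
    Returns the larger of the two one-sided p-values; equivalence is
    concluded at level alpha when this value falls below alpha.
    """
    d = np.asarray(pat) - np.asarray(ref)
    n = d.size
    mean, se = d.mean(), d.std(ddof=1) / np.sqrt(n)
    p_lower = 1 - stats.t.cdf((mean + delta) / se, df=n - 1)  # H0: bias <= -delta
    p_upper = stats.t.cdf((mean - delta) / se, df=n - 1)      # H0: bias >= +delta
    return max(p_lower, p_upper)

# Hypothetical paired assay results (% label claim) and a 2% margin
pat = np.array([99.1, 100.4, 98.7, 101.2, 99.8, 100.1])
ref = np.array([99.5, 100.0, 99.0, 100.8, 99.6, 100.3])
print(f"TOST p-value: {tost_paired(pat, ref, delta=2.0):.4f}")
```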

Theory of Sampling (TOS): The “disagreement narrative” mentioned above presumes that the disagreeing methods analyze the identical physical samples, which, of course, is never the case. This fact also underscores the importance of fully understanding the sampling distinctions between analytical methods, and their potential impact on measurement results. The use of well-established Theory of Sampling (TOS) principles [9] in the pharma industry [10] can lead to more representative and reliable sampling, thus reducing the likelihood of such disagreements.
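
For readers curious what variographic analysis looks like in practice, here is a minimal sketch of an empirical variogram computed on a simulated, equidistant process stream; the data, function name and lag range are assumptions made purely for illustration. The variogram value extrapolated to lag zero (the "nugget") indicates the smallest sampling-plus-analysis error achievable for that stream.

```python
import numpy as np

def empirical_variogram(series, max_lag):
    """Empirical variogram of a 1-D process stream (Theory of Sampling).

    series  : equidistant measurements along the process/time axis
    max_lag : largest lag j (in sampling intervals) to evaluate
    Returns V(j) = sum_i (x[i+j] - x[i])^2 / (2*(N - j)) for j = 1..max_lag.
    """
    x = np.asarray(series, dtype=float)
    n = x.size
    lags = np.arange(1, max_lag + 1)
    v = np.array([np.sum((x[j:] - x[:-j]) ** 2) / (2.0 * (n - j)) for j in lags])
    return lags, v

# Hypothetical blend-uniformity stream: slow sinusoidal trend plus random noise
rng = np.random.default_rng(0)
stream = 100 + 1.5 * np.sin(np.arange(120) / 8.0) + rng.normal(0, 0.5, 120)
lags, v = empirical_variogram(stream, max_lag=30)
print(v[:5])  # values at small lags approximate the minimum possible sampling error
```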

Analyzer Reliability: PAT instrument utility is indeed a valid concern, but it would be very unfair to the instrument manufacturers to ignore the fact that instrument robustness and reliability have improved greatly over the past 10-20 years. In the non-pharma industries, the well-established field of Reliability Engineering [11] has had a positive impact on system utility rates – and it can have a similar impact on PAT systems in pharma. For critical PAT systems, special service plans can be arranged with the vendor(s), and newer-model instrument systems allow effective remote support on short notice, with sufficient security measures in place.

Unique QA For PAT: The use of multivariate calibration in many PAT applications actually provides unique additional QA opportunities for such methods: namely, the use of outlier diagnostics (ODs) to assess the degree of abnormality of every single PAT measurement profile (spectrum) in real-time [12]. This would be like a traditional laboratory analytical method that provides both the analytical result (e.g., API concentration) and an assessment of the quality of that result, for every analysis cycle. How many other analytical methods can claim this type of QA?
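
To make this concrete, the sketch below computes the two most common outlier diagnostics for a PCA-based PAT model, Hotelling's T² and the Q (residual) statistic, using scikit-learn on simulated "spectra". The data, the number of components, and the function name are illustrative assumptions, not a validated implementation.

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical calibration set: 50 spectra x 200 wavelengths
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 200))

pca = PCA(n_components=3).fit(X)                  # PCA mean-centers internally
score_var = pca.transform(X).var(axis=0, ddof=1)  # score variances from the calibration set

def outlier_diagnostics(x_new):
    """Hotelling's T^2 and Q (sum of squared residuals) for one new spectrum."""
    t = pca.transform(x_new.reshape(1, -1))             # projection onto the model
    t2 = float(np.sum(t ** 2 / score_var))              # leverage within the model space
    resid = (x_new - pca.mean_) - t @ pca.components_   # part the model cannot explain
    q = float(np.sum(resid ** 2))
    return t2, q

t2, q = outlier_diagnostics(X[0])
print(f"T2 = {t2:.2f}, Q = {q:.2f}")  # compare against limits derived from the calibration set
```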

Value of Redundancy: Redundancy has been widely embraced in other industries as a means to improve system control, utility, and safety – and there is no reason why it can’t be just as valuable in pharmaceutical manufacturing. There is indeed a cost for redundancy, in that one must prepare contingency plans for cases of disagreement between redundant methods, or unexpected downtime of one of the redundant measurement systems. However, to me it is no coincidence that measurement redundancy and generous use of PAT were clearly embraced for the first successful FDA filing of a continuously manufactured product [7].

“All models are wrong….”: “…but some models are useful.” The fact that this heavily cited quote of George E.P. Box was made in 1976 [13], well before the “PC Revolution” made MVA accessible to the masses, is a testament to its enduring relevance. I like to use it as a retort to those who attempt to differentiate the virtues of models based on their category: for example, first principles models vs. mechanistic models vs. empirical models vs. hybrid models. All of these model categories share a common characteristic: they are not perfect – and therefore I argue that the model category should have no impact on the recommended validation, testing and QA burden of models in a regulated environment. Instead, I recommend that said burden be based on the criticality of the intended use of the model, along with the risks associated with the assumptions behind the model, regardless of model category. This leads me to my final mitigating factor:

Transparency: A common criticism of highly empirical MVA models is that it is difficult to understand at a practical level “what they are” and “what they do”. In fact, this was a concern of mine when first learning about these methods years ago, and it was often exacerbated by some software suppliers, who treated MVA model development as a “black box” exercise. When Unscrambler first came out in 1985, special care was taken to provide visibility into a vast set of PCA and PLS model parameters, allowing the user to “look under the hood”. However, it was just as important that this visibility was accompanied by a strong philosophy that domain knowledge should always be utilized in MVA [14], and it was further supported by various workflows for interpreting MVA model parameters [15] to improve transparency.
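
As a small illustration of what "looking under the hood" can mean in practice, the sketch below fits a PLS model with scikit-learn (used here simply as a stand-in, not as Unscrambler's implementation) on simulated data and prints the loadings, scores and regression coefficients that make such a model interpretable; all data and variable names are hypothetical.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Hypothetical NIR-style data: 40 spectra x 150 wavelengths, one reference value (y)
rng = np.random.default_rng(2)
X = rng.normal(size=(40, 150))
y = 0.8 * X[:, 40] + 0.3 * X[:, 90] + rng.normal(0, 0.05, 40)

pls = PLSRegression(n_components=3).fit(X, y)

# Model parameters that support interpretation rather than "black box" use
print("X-loadings:", pls.x_loadings_.shape)  # spectral shape of each latent factor
print("X-scores:  ", pls.x_scores_.shape)    # position of each sample in factor space
coefs = pls.coef_.ravel()                    # which wavelengths drive the prediction
print("Most influential wavelength index:", int(np.argmax(np.abs(coefs))))
```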

Camo Analytics continues to be at the forefront of providing tools and services to support many of the above mitigating factors for MVA and PAT, including MVA and PAT training, DOE and statistical analyses via StatEase’s DesignExpert, and strong visibility into MVA models generated in Unscrambler, to support both model QA and model transparency. Furthermore, the seamless transference of these features into the on-line analytics domain is now greatly facilitated through the Process Pulse on-line platform.

My next blog entry will cover the use of PAT in industries other than pharma, as well as some of the expected – and unexpected – benefits of using PAT across industries. See you then!

[1] “New Prescription For Drug Makers: Update the Plants – After Years of Neglect, Industry Focuses On Manufacturing; FDA Acts as a Catalyst”, Leila Abboud and Scott Hensley, Wall Street Journal, September 2003.
[2] “Guidance for Industry, PAT — A Framework for Innovative Pharmaceutical Development, Manufacturing, and Quality Assurance”, U.S. Department of Health and Human Services, Food and Drug Administration, Center for Drug Evaluation and Research (CDER), Center for Veterinary Medicine (CVM), Office of Regulatory Affairs (ORA), Pharmaceutical CGMPs, September 2004.
[3] “Pharmaceutical CGMPs for the 21st Century — A Risk-Based Approach”, Final Report, Department of Health and Human Services, U.S. Food and Drug Administration, September 2004.
[4] Juran, J.M. (1992). Juran on Quality by Design: The New Steps for Planning Quality into Goods and Services. Free Press.
[5] “Guidance for Industry, Q8(R2) Pharmaceutical Development”, U.S. Department of Health and Human Services, Food and Drug Administration, Center for Drug Evaluation and Research (CDER), Center for Biologics Evaluation and Research (CBER), November 2009.
[6] The “C”, or “Control” phase of the Six Sigma process is commonly defined as the phase where any gains obtained from the previous Six Sigma phases (Define, Measure, Analyze, Implement) are “locked in”, thus avoiding loss of benefit from these earlier phases.
[7] Manoharan Ramasamy, Nathan Pixley, Bruce Thompson, Chuck Miller, Louis Obando, John Higgins, Mark Eickhoff, “Realizing Value Through PAT Implementation”, IFPAC 2015.
[8] “Pharma’s slow embrace of continuous manufacturing”, https://www.biopharmadive.com/news/pharmas-slow-embrace-of-continuous-manufacturing/532811/
[9] Gy, P. (2004), Chemometrics and Intelligent Laboratory Systems, 74, 61–70.
[10] Kim H. Esbensen, Andrés D. Román-Ospino, Adriluz Sanchez, Rodolfo J. Romañach, Adequacy and verifiability of pharmaceutical mixtures and dose units by variographic analysis (Theory of Sampling) – A call for a regulatory paradigm shift, International Journal of Pharmaceutics 499 (2016) 156–174.
[11] O’Connor, Patrick D. T. (2002), Practical Reliability Engineering (Fourth Ed.), John Wiley & Sons, New York. ISBN 978-0-4708-4462-5.
[12] C.E. Miller, “The Use of Chemometric Techniques in Process Analytical Method Development and Operation”, Chemometrics Intell. Lab Syst., 30(1), pp. 11-22 (1995).
[13] Box, G. E. P. (1976), “Science and statistics” (PDF), Journal of the American Statistical Association, 71 (356): 791–799, doi:10.1080/01621459.1976.10480949.
[14] H. Martens and T. Næs, Multivariate Calibration, Wiley, Chichester, 1989.
[15] C.E. Miller, “Use of Near-Infrared Spectroscopy to Determine the Composition of High-Density/Low-Density Polyethylene Blend Films”, Appl. Spectrosc., 47(2), 222 (1993).
Charles E (Chuck) Miller:

Career-long journey in Process Analytical Technology

Chuck Miller has over 30 years of experience in chemometrics, near-infrared spectroscopy, and Process Analytical Technologies, and in applying these technologies to industrial challenges in R&D and manufacturing.

His career spans 13 years at DuPont in the Process Analytical group and 10 years in the Process Analytical Technologies group within the Manufacturing Division of Merck Sharp and Dohme. In between, he spent 4 years at Eigenvector Research in consulting, training and software development.

Chuck obtained his Ph.D. in Analytical Chemistry from the University of Washington, and did his post-doctoral research at the Max-Planck Institute for Polymer Research in Mainz, Germany and The Norwegian Food Research Institute in Ås, Norway.

