
Charles E (Chuck) Miller:

A historical introduction to Process Analytical Technology (PAT)

In this series of blog posts, I will provide a high-level retrospective look at Process Analytics and the wide array of value propositions it offers to various manufacturing industries. I will share experiences from on-line applications that span several industries and, in some cases, have been in service for decades. It is safe to say that some of these value propositions were expected, while others were not. Finally, recognizing that Process Analytics is not without cost or risk, I will use these same experiences to offer some “Best Practices” for Process Analytics, covering seven distinct categories.

Introduction

Process Analytics can be defined as “rapid or highly relevant analytical measurements associated with an industrial process”. For those in the life sciences and pharmaceutical industries, the term Process Analytical Technology (PAT) was formally launched with the publication of the US FDA’s PAT Guidance in 2004 [1]. For those in other industries, Process Analytical Chemistry was already a well-established field, arguably with roots going back to the 1940s and earlier. For convenience, I will use the term “Process Analytics” and the acronym “PAT” to describe this field across all manufacturing industries.

Over the past few decades, the development of key enabling technologies, including computing hardware, multivariate analysis, and process automation, has provided ever more opportunities for value-adding PAT applications across a wide range of industrial verticals. Perhaps not surprisingly, the evolutionary paths of practical PAT application have been distinctly different in the different industries that used it. Even so, it is clear today that all industries have made key contributions to process analytics, positioning it as a very competitive tool for realizing continuous improvement, process control, and quality assurance across all stages of a product’s lifecycle.

Background (Process Analytics Scope)

Process Analytics might mean different things to different people. As a result, I will attempt to define the scope of process analytics for the purposes of this review, through operational definitions of the two words:

Process:

  • In Scope: Any industrial manufacturing process, including:
    • Manufacturing of chemicals, materials, food and consumer products
    • Agricultural processes
    • Parts manufacturing and assembly operations
    • Material handling and packaging operations
  • Out of Scope: Processes that are not manufacturing in nature, including transactional, bureaucratic, and financial processes

Analytics:

  • In Scope: Rapid or near-real-time analytical chemistry or data analysis applied to the process, including:
    • Analytical measurements that are “close” to the process; these can include off-line methods used to assess process behavior or in-process material condition
    • Real-time or automated multivariate analysis (MVA) of process data
  • Out of Scope: Off-line laboratory analytical chemistry, often done in QA or research laboratories

Although this scope might at first appear too wide, most practical process analytics applications today fit neatly into one of only two categories: 1) on-line/at-line process measurements using analytical instruments, or 2) rapid data-analytical comparison of real-time process data to a reference state, aka “Process Chemometrics” [2,3].

History

Considering the scope defined above, it can be argued that the first process analyzers were developed and implemented in the 1930s and 1940s, when precursors to non-dispersive infrared (NDIR) analyzers were used to analyze hazardous gas mixtures in-line for the booming chemical manufacturing industry.

One of the more compelling stories from this time involves the infrared filter instrument of Lehrer and Luft, named the Ultrarotabsorptionsschreiber, or “URAS”, which was used to support chemical manufacturing in Germany before and during World War II [4,5]. Unlike the spectrometers of the period, which were deemed too complex for field use, the URAS was a relatively simple option that could be customized for specific gas-detection objectives. It utilized the concept of “negative filtering”, in which a chamber of pure target gas is analyzed in parallel with the process gas. I like to use this instrument as an early example of PAT because it used existing technology to achieve two process-analytical capabilities that would prove critical decades later: 1) automatic signal referencing, to correct for instrument drift, and 2) interference rejection, in this case using optics rather than math. Several hundred of these instruments were put into service at various manufacturing operations at BASF and IG Farben [6], and they were likely critical for maintaining process control and process safety.

As a historical note, it must be said that any technical accomplishments associated with the URAS are very much overshadowed by the likelihood that some of the products it supported were used for destructive and ethically reprehensible purposes during the war. Although subsequently developed technologies built from these early systems have supported more ethical endeavors, this reflection reminds us of the persistent importance of ethics in science and engineering.

In the post-war decades, various process sensor and analyzer technologies were developed, mainly supporting the chemical, materials, and defense industries. Like the URAS, most of these technologies were developed within the end-user company’s own organization. The very challenging environments of these applications imposed stringent ruggedness and reliability requirements, which ultimately limited the possible physics, optics, and engineering of these PAT systems. In response to these demands, improvements on NDIR technology, as well as newer solid-state filter and dispersive systems, were developed and implemented. A good example is the DuPont Model 400 instrument, commercialized in 1962, of which over 5000 units were manufactured [7]. Although this instrument used a very simple method by today’s standards (a spectral intensity ratio based on IR absorbance measurements at two fixed filter wavelengths), the ruggedness of its sampling interface and electronics, together with a highly dedicated and experienced support staff, made it extremely popular for a wide range of process analytics applications.
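To make the two-filter idea concrete, here is a minimal sketch of the general ratio-photometer calculation. The function names, numbers, and single-constant Beer-Lambert calibration are my own illustrative assumptions, not a description of the Model 400’s actual implementation:

```python
import math

def ratio_absorbance(i_analyte: float, i_reference: float) -> float:
    """Apparent absorbance from the ratio of intensities measured through
    an analyte-sensitive filter and a nearby non-absorbing reference filter.
    Source drift and window fouling affect both channels similarly,
    so they largely cancel in the ratio."""
    return -math.log10(i_analyte / i_reference)

def concentration(i_analyte: float, i_reference: float, k: float) -> float:
    """Single-constant Beer-Lambert calibration, c = A / k, where k
    (absorptivity times path length) is determined beforehand from a
    standard of known concentration."""
    return ratio_absorbance(i_analyte, i_reference) / k

# Example: if a standard of concentration 2.0 gave A = 0.30, then k = 0.15.
print(concentration(i_analyte=0.42, i_reference=0.80, k=0.15))
```

The appeal of this scheme is exactly what made such instruments rugged: no moving dispersive elements, and a built-in correction for common-mode optical disturbances.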

The 1970s and 1980s saw two distinct but equally important trends in PAT:

  1. Further ruggedization of analytical technology for in-line process environments, and
  2. Development of at-line near-infrared (NIR) diffuse reflectance technology in the Food and Agriculture industries.

Regarding the former, ruggedized versions of the analytically powerful FTIR and dispersive spectrometer modalities were starting to appear in both proprietary and public applications, further pushing the PAT envelope. In the public sphere, enabling technologies including diode arrays, optical fibers, the HeNe laser, semiconductor detectors, and various dispersive modalities helped add variety and power to the PAT repertoire. The latter trend, NIR diffuse reflectance analysis, is important for two reasons: a) it demonstrated the efficacy of a very convenient process sampling modality for bulk solid materials, namely diffuse reflectance spectroscopy, and b) it introduced the concept of using empirical modeling math to impart analytical specificity to very non-specific NIR diffuse reflectance spectra. A nice survey of the technology of this period [8] also pointed out the critical importance of reliable and representative sampling in these early NIR applications.
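To make the “empirical modeling math” idea concrete, here is a minimal sketch of a multivariate calibration on synthetic stand-in spectra. I use PLS regression, a common modern chemometric choice; the early agricultural NIR work typically used multiple linear regression on a few selected wavelengths, so treat this as an illustration of the general concept rather than of the historical method:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in for NIR diffuse reflectance data:
# rows = samples, columns = log(1/R) values at each wavelength.
rng = np.random.default_rng(0)
n_samples, n_wavelengths = 120, 200
spectra = rng.normal(size=(n_samples, n_wavelengths)).cumsum(axis=1)
protein = 0.5 * spectra[:, 80] + rng.normal(scale=0.1, size=n_samples)

X_train, X_test, y_train, y_test = train_test_split(spectra, protein, random_state=0)

# A handful of latent variables is the "empirical modeling math": the model
# learns which spectral patterns covary with the property of interest, even
# though no single wavelength is specific to it.
model = PLSRegression(n_components=5)
model.fit(X_train, y_train)
print("R^2 on held-out samples:", round(model.score(X_test, y_test), 3))
```

The key point is that specificity comes from the calibration model, not from the spectroscopy itself, which is why representative calibration samples matter so much.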

The 1990s onward were a transformational period for PAT, due to three key developments:

  1. the improved availability and capability of personal computing,
  2. the continued development and application of multivariate analysis math (aka Chemometrics) to support method specificity, method transfer, and other key method lifecycle workflows, and
  3. the business strategy of using collaborative cross-company agreements for PAT technology development.

Regarding developments 1) and 2), it is safe to say that chemometrics, combined with more economical and accessible computing platforms, opened the way for many complex PAT applications that would not otherwise have been possible. Ironically, however, the application of chemometrics also exposed several weaknesses in the PAT technology of the day, including long-term instability of the intensity and wavelength calibrations of some spectrometers, and retention-time-axis instability of chromatographic systems in general.

Development 3) was driven partly by the changing business strategies of most end-user companies at that time, where increased focus on core competencies and direct company offerings led to reduced internal resources and support for analyzer system development. Good examples of such collaborative PAT development efforts are those that several energy companies conducted with PAT instrument and sampling hardware suppliers in the late 1980s and early 1990s, with the goal of commercializing specialized NIR analyzer systems for on-line measurement of octane number in gasoline. In one specific example, the “PIONIR1024” instrument was commercialized by Perkin-Elmer in 1992 after several years of collaborative development with Amoco (later BP) [9,10]. A key differentiating element of this system was the presence of internal spectroscopic standards and highly automated workflows for both intensity and wavelength calibration. Although the general concept of internal instrument standards was certainly not new to PAT at that time (see the URAS above), these systems substantially extended the concept, greatly simplifying the instrument standardization workflow and improving long-term system performance.
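As a simple illustration of what an automated wavelength-calibration workflow can involve, here is a sketch of a constant-offset axis correction against a single internal-standard peak. This is my own minimal example under stated assumptions, not the PIONIR1024’s actual algorithm, which was considerably more sophisticated:

```python
import numpy as np

def correct_wavelength_axis(wavelengths: np.ndarray,
                            spectrum: np.ndarray,
                            std_peak_measured: float,
                            std_peak_true: float) -> np.ndarray:
    """Shift the wavelength axis so the internal-standard peak lands at its
    known true position, then re-interpolate the spectrum back onto the
    original grid. A constant-offset correction; a real instrument might
    instead fit a polynomial across several reference peaks."""
    shift = std_peak_true - std_peak_measured
    # The measured spectrum actually corresponds to (wavelengths + shift):
    return np.interp(wavelengths, wavelengths + shift, spectrum)
```

Automating steps like this is what kept multivariate models valid over months and years of instrument drift.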

During this same period, a separate type of PAT emerged when developments 1) and 2) above were combined with the continued development of process automation systems: “Process Chemometrics”, or “Multivariate Statistical Process Control (MSPC)” [2,3]. This type of PAT takes advantage of existing process measurement systems, which can include simple sensors (temperature, pressure, level, and flow) as well as the more novel outputs of the newer process analyzers of the time, such as composition, density, and particle size. A multivariate model is developed from historical process data that represent normal operation, and this model is then applied in real time to live data while the process is running, enabling immediate detection and diagnosis of new or impending process deviations. For legacy manufacturing systems, this type of PAT was quite appealing because it could use existing process measurements, without the additional capital expenditure and maintenance costs of a new process analyzer system. It still had some key requirements, however, including a sufficient IT and automation infrastructure at the manufacturing site, and sufficient maintenance and reliability of the sensors that generate the data.
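For readers unfamiliar with this type of PAT, here is a minimal sketch of one common flavour: a PCA model of normal operation monitored with Hotelling’s T² statistic. The data, the number of components, and the control limit are all placeholder assumptions; real MSPC deployments also monitor the model residuals (the Q statistic) and derive control limits from the training data:

```python
import numpy as np
from sklearn.decomposition import PCA

# --- Model building: historical data that represent normal operation ---
# rows = time points, columns = process sensors (temperatures, pressures, flows, ...)
rng = np.random.default_rng(1)
normal_data = rng.normal(size=(500, 8))

mean, std = normal_data.mean(axis=0), normal_data.std(axis=0)
pca = PCA(n_components=3).fit((normal_data - mean) / std)

def hotelling_t2(sample: np.ndarray) -> float:
    """Hotelling's T^2 for one new observation: the variance-scaled distance
    of its projection from the centre of the normal-operation model."""
    scores = pca.transform(((sample - mean) / std).reshape(1, -1))[0]
    return float(np.sum(scores ** 2 / pca.explained_variance_))

# --- Real-time monitoring: score live observations against the model ---
# A control limit would normally be derived from an F-distribution at a
# chosen confidence level; 15.0 here is just a placeholder.
limit = 15.0
ok_sample = rng.normal(size=8)                      # behaves like the training data
bad_sample = rng.normal(size=8) + 8 * np.eye(8)[2]  # sensor 3 has drifted badly
for name, s in [("normal", ok_sample), ("drifted", bad_sample)]:
    t2 = hotelling_t2(s)
    print(f"{name}: T^2 = {t2:.1f}", "(deviation!)" if t2 > limit else "")
```

The appeal for legacy plants is visible in the sketch: everything runs on data the process already produces.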

This concludes my historical PAT overview. Next, I’ll describe how PAT is used in Pharma. Join me then! And if the history of PAT has inspired you to find out where its future is headed, look to Camo Analytics, whose solutions enable exacting PAT, with production process and quality monitoring at a level normally reserved for R&D.

[1] Pharmaceutical CGMPs for the 21st Century - A Risk-Based Approach: Final Report, U.S. Department of Health and Human Services, Food and Drug Administration, September 2004.
[2] M.J. Piovoso, K.A. Kosanovich and J.P. Yuk, “Process Data Chemometrics”, IEEE Transactions on Instrumentation and Measurement, Vol. 41, No. 2, April 1992.
[3] P. Nomikos and J.F. MacGregor, “Multivariate SPC Charts for Monitoring Batch Processes”, Technometrics, Vol. 37, No. 1, pp. 41-59, 1995.
[4] G. Lehrer and K. Luft, German Patent 730,478; applied for March 9, 1938; issued December 14, 1942.
[5] K.F. Luft, Zeitschrift für Technische Physik, Vol. 24, p. 97, 1943.
[6] B. Worthington, “60 Years of Continuous Improvement in NDIR Gas Analyzers”, in Proceedings of the 50th Annual ISA Analysis Division Symposium: 50 Years of Analytical Solutions, Houston, TX, USA, April 10-14, 2005, pp. 95-107.
[7] Dr. John C. Steichen, private communication, April 2014.
[8] P. Williams and K. Norris, Near-Infrared Technology in the Agricultural and Food Industries, American Association of Cereal Chemists, 1987.
[9] Larry McDermott, private communication, April 2014.
[10] Website: http://www.aitanalyzers.com/process-analyzer-product.php?id=10
Charles E (Chuck) Miller:

Career-long journey in Process Analytical Technology

Chuck Miller has over 30 years of experience in chemometrics, near-infrared spectroscopy, and Process Analytical Technologies, and in applying these technologies to industrial challenges in R&D and manufacturing.

His career spans 13 years at DuPont in the Process Analytical group and 10 years in the Process Analytical Technologies group within the Manufacturing Division of Merck Sharp & Dohme. In between, he spent four years at Eigenvector Research, working in consulting, training, and software development.

Chuck obtained his Ph.D. in Analytical Chemistry from the University of Washington, and did his post-doctoral research at the Max Planck Institute for Polymer Research in Mainz, Germany, and the Norwegian Food Research Institute in Ås, Norway.
