
Enabling real-time release, process understanding, and continuous quality verification: a process development and manufacturing intelligence platform helps achieve PAT and QbD goals.

Although the business case for real-time release is still maturing, it is certainly drawing interest among life sciences manufacturers. It is important to look closely at the road map that leads us to that end--keeping in mind that it could involve a decade-long journey in some cases. The technology that takes us down the right path is a Process Development and Manufacturing Intelligence software platform that enables easy use of the increasingly vast amount of disparate data gathered from the design stage and throughout the product life cycle.


The goal of Process Analytical Technology (PAT) and Quality by Design (QbD) is to obtain sufficient process understanding to build a cause-and-effect model that improves process control and moves us towards Continuous Quality Verification (CQV) and real-time release of an end product without requiring final quality testing. The best way to achieve these goals is to implement systems that enable three important groups to collaborate effectively as a single team starting in Process Development (i.e., the design stage). These groups include: 1) Process Development (PD), 2) Manufacturing (MFG) and 3) Quality Assurance (QA).

The environment created by today's manual data access methods and disconnected analytics (a.k.a. spreadsheet madness) is problematic, error prone, and inconsistent with achieving the goals of PAT and QbD. Self-service data access and contextualization, together with self-service investigational analytics, are required for successful collaboration between the three groups that compose the extended design stage "dream team."


How do we reach the long-term goal of real-time release?

We build a process model that's consistent with the principles of QbD. The FDA's PAT guidance--now more than five years old--included four considerations for an effective PAT strategy:

1) Process understanding

2) Risk-based approach

3) Regulatory strategy to support innovation

4) Real-time release

In the early days, companies thought they had to purchase new PAT instruments. What was often forgotten is that PAT measurement devices, which rely on nondestructive measurements, are a means to an end. An instrument only creates a stream of numbers; to achieve process understanding, that information must be placed in context and interpreted. Software systems are therefore required to collect and provide access to all data types for the contextualization and analysis that enable improved process understanding and control on the path toward CQV and real-time release.

The FDA defines QbD as: Doing the science to support process performance characteristics that are designed to meet specific objectives, not merely empirically derived from the performance of test batches. QbD results from:

* A combination of prior knowledge (from our own education, textbooks, seminars, etc.) and experimental investigation of processes at small scale and full scale for similar drugs and molecules, which improves process understanding.

* Relevant-time process measurement and control. We talk about real time, but what also matters is relevant time. A software engineer defines real time as pressing "enter" on the keyboard and seeing the screen change in microseconds. To manufacturing process engineers, relevant time means information is available in time to affect the outcome--to drive a good outcome or prevent a bad one--which might be several hours or several days, depending on process dynamics.

* A sufficient cause-and-effect model that links Critical Process Parameters (CPPs) and Critical Quality Attributes (CQAs).

The operational definition from the FDA is that a process is well understood when 1) all critical sources of variability are identified and explained, 2) variability is managed by the process (not by humans outside of the process), and 3) product quality attributes can be accurately and reliably predicted. Accurate and reliable predictions reflect process understanding, and process understanding is inversely proportional to risk. We're ultimately striving to build a process model for process understanding to reduce patients' risks and our own business risks.

Defining CPPs can be a continuous learning process. We know they are in-process parameters, supported by scientific data, that are most responsible for driving variability in the CQAs. Using risk analysis to identify likelihood and severity, we can show that CPPs are strongly correlated with CQAs, and that some act in combination. Multivariate analysis techniques become very important in this ongoing learning process, so a company can institutionalize, manage, and learn from the relevant knowledge.
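As a minimal illustration of this idea (not the platform described in this article), candidate CPPs can be ranked by the strength of their association with a quality attribute across batches. All parameter names and the batch data below are hypothetical and simulated; real QbD work would apply richer multivariate methods (e.g., PLS) to real development and manufacturing data.

```python
# Illustrative sketch: rank candidate CPPs by absolute Pearson correlation
# with a CQA across a set of batches. Names and data are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n_batches = 40

# Hypothetical in-process parameters measured once per batch
cpps = {
    "feed_rate":   rng.normal(1.0, 0.1, n_batches),
    "pH":          rng.normal(7.0, 0.05, n_batches),
    "temperature": rng.normal(37.0, 0.3, n_batches),
}

# Hypothetical CQA: driven mainly by temperature, weakly by feed rate
cqa_titer = (2.0 * cpps["temperature"] + 0.5 * cpps["feed_rate"]
             + rng.normal(0.0, 0.2, n_batches))

def rank_cpps_by_correlation(cpps, cqa):
    """Return (name, |r|) pairs sorted by absolute Pearson correlation."""
    scores = {name: abs(np.corrcoef(values, cqa)[0, 1])
              for name, values in cpps.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

ranking = rank_cpps_by_correlation(cpps, cqa_titer)
for name, r in ranking:
    print(f"{name:12s} |r| = {r:.2f}")
```

In this simulated data set, temperature ranks first because it was constructed to drive most of the variability; on real data, such a ranking is only a starting point for the risk analysis and designed experiments described above.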

Process validation is defined as the collection and evaluation of data, from the process design stage throughout production, which establishes scientific evidence that a process is capable of consistently delivering quality products. We can extract from the FDA's guidance published in late 2008 that process validation and CQV are actually a lifestyle, not an event. You are always validating a process--collecting data to support the fact that it is under control. This leads to a process model based on ongoing experiences from which we continue to learn.


For every process there is a "knowledge space," which includes everything we know that is related to that particular process. A control space is a subset of the knowledge space, but the goal is to get approval for a design space, which is a superset of the control space and a subset of the knowledge space.

When process upsets occur, we can change the control space, ideally automatically, without the need for re-approval of the process, provided we stay within the design space.

Imagine a complex bioprocess that is operating in a state of control until an adverse trend or process upset occurs (e.g., an unexpected raw materials change). The control system needs to access a process model to make corrections that re-establish control of the process within the design space. This is what we mean by Real Time Adaptive Control (RTAC). The right Process Development and Manufacturing Intelligence platform allows us to collaborate to build the process model that the control system "talks to" by defining and understanding the relationships between upstream CPPs and downstream CQAs.
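The containment check implied by RTAC can be sketched very simply: a proposed corrective move to the control space is accepted automatically only if every adjusted parameter stays inside the approved design space; otherwise it must be escalated for re-approval. The parameter names and ranges below are hypothetical.

```python
# Minimal sketch of a design-space containment check (hypothetical CPPs
# and approved ranges), as implied by Real Time Adaptive Control.

DESIGN_SPACE = {           # approved (low, high) range per CPP
    "pH":          (6.8, 7.4),
    "temperature": (35.0, 39.0),
    "feed_rate":   (0.8, 1.4),
}

def within_design_space(setpoints, design_space=DESIGN_SPACE):
    """True if every proposed setpoint lies inside its approved range."""
    return all(lo <= setpoints[p] <= hi
               for p, (lo, hi) in design_space.items())

# Corrective move after a raw-material upset: still inside the design space,
# so the control space can change without re-approval of the process.
proposed = {"pH": 7.1, "temperature": 38.5, "feed_rate": 1.2}
print(within_design_space(proposed))   # True -> apply automatically

# A move that leaves the design space must not be applied automatically.
too_far = {"pH": 7.1, "temperature": 40.0, "feed_rate": 1.2}
print(within_design_space(too_far))    # False -> escalate for review
```

A real control system would of course work with a multivariate design space (interacting parameters), not independent per-parameter ranges; the sketch only shows where the model-backed check sits in the decision.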

From a user's perspective in this example, self-service, on-demand data access and contextualization that automatically accounts for batch genealogy in upstream-to-downstream correlations is a critical success factor: it supports the process understanding needed to build the RTAC process model linking CPPs and CQAs. This enables us to cope with processes that split and pool the process stream as it progresses from raw materials to final product. Such contextualization takes into account the proportionality of each upstream-to-downstream step (e.g., 50% from one raw material lot and 50% from another) to calculate the fractional contributions of upstream inputs to downstream outcomes. Without this capability, we must resort to time-consuming and error-prone manual methods.
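The proportionality calculation itself can be sketched as a weighted genealogy graph: each edge records what fraction of a downstream lot's material came from a given upstream lot, and the fractional contribution of a raw-material lot to a final batch is the sum, over all paths, of the products of those fractions. All lot and batch names below are hypothetical; a real platform derives these edges automatically from batch records.

```python
# Hedged sketch of genealogy-aware contextualization: fractional
# contribution of an upstream lot to a downstream batch through
# splitting and pooling steps. Names and fractions are hypothetical.
from collections import defaultdict

# (parent, child, fraction of child's material coming from parent)
genealogy = [
    ("rawlot_A", "pool_1",  0.5),   # pooling: 50% from each raw lot
    ("rawlot_B", "pool_1",  0.5),
    ("pool_1",   "batch_X", 1.0),   # split: batch_X derives wholly from pool_1
    ("pool_1",   "batch_Y", 1.0),
    ("rawlot_C", "pool_2",  0.3),   # re-pooling further downstream
    ("batch_Y",  "pool_2",  0.7),
]

parents = defaultdict(list)
for parent, child, frac in genealogy:
    parents[child].append((parent, frac))

def contribution(upstream, downstream):
    """Fraction of `downstream` attributable to `upstream`, over all paths."""
    if upstream == downstream:
        return 1.0
    return sum(frac * contribution(upstream, parent)
               for parent, frac in parents[downstream])

print(contribution("rawlot_A", "pool_2"))  # 0.7 * 1.0 * 0.5 = 0.35
```

With this in place, an upstream CPP measured on `rawlot_A` can be weighted by its 35% contribution when correlating against a CQA measured on `pool_2`, which is exactly the kind of bookkeeping that becomes intractable by hand once splits and recombinations multiply.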

In summary, process understanding is a core element leading to CQV and RTAC, which enable real-time release. At the intersection of our experience, education, and tools lies the opportunity for better process understanding. To build a process model, we need relevant-time measurements through a Process Development and Manufacturing Intelligence platform that:

* Provides self-service, on-demand data access by PD, MFG, and QA users to all the data (discrete, continuous, replicate, event, keyword, text) in a meaningful context to complete investigations in minutes, not months, and understand relationships between CPPs and CQAs.

* Delivers practical analytics that include descriptive (what happened?) as well as investigational (why did it happen?) capabilities to the PD, MFG, and QA team.

* Includes all types of electronic data, as well as paper-based data, to make meaningful analysis possible.

* Allows non-programmers and non-statisticians to complete tasks quickly and effectively as a collaborative team.

* Accounts for batch genealogy for processes with multiple splits and recombinations.

When manufacturers can easily access and analyze data in the right context for better process understanding, PAT and QbD goals are within reach.

About the Author:

Justin Neway, Ph.D., is founder, vice president and chief science officer at Aegis Analytical Corp., 1380 Forest Park Circle, Suite 200, Lafayette, CO 80026. Tel. 303-625-2102.

By Justin O. Neway, Ph.D., Aegis Analytical
COPYRIGHT 2010 Advantage Business Media

Article Details
Title Annotation:QUALITY BY DESIGN
Author:Neway, Justin O.
Publication:Pharmaceutical Processing
Date:Apr 1, 2010
