
Aligning data to achieve process understanding: to truly understand and predict how your process will operate, ALL available data must be acquired and analyzed.

Life Sciences companies are taking a variety of approaches to their PAT solutions and to how they acquire and analyze their data. Some companies rely heavily on spectral data from instruments monitoring manufacturing process steps such as drying or blending. Others look toward traditional process variables and simpler calculations. Still others are trying innovative analysis techniques. Whatever the approach, all share the same goal: to arrive at a deeper understanding of their manufacturing processes and the ability to accurately and reliably predict the quality attributes of the finished product.

Much of the focus of PAT is rightly centered on the manufacturing process itself and its relevant data. But in order to have a truly holistic view of the entire process and to make accurate predictions about the quality attributes of the end product, data beyond the manufacturing data itself must be gathered and analyzed.

A secondary pharmaceutical manufacturing plant has a variety of systems that contain valuable information about the finished product. In order to ultimately have a complete and accurate view of the entire manufacturing process from start to finish, it's important to have access to all of that information and to construct prediction models from it.

Product raw material data exists in the laboratory information management system (LIMS). Process variable information resides in supervisory control and data acquisition (SCADA) systems and plant historians. Alarm and event data resides in relational databases (RDBs). Manufacturing execution systems (MES) compile data for the electronic batch record. Corrective and preventive action (CAPA) systems own exception and deviation information.
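To make this landscape concrete, the sketch below registers each source system along with the kind of data it owns, where that data lives, and how often it updates. It is illustrative only; the system names, contents, and rates are assumptions drawn from the description above, not from any specific plant.

```python
# Illustrative registry of plant data sources. All names and rates are
# assumed for this example; a real plant would populate it from its own systems.
from dataclasses import dataclass

@dataclass
class DataSource:
    system: str        # e.g. "LIMS", "SCADA/historian", "MES", "CAPA"
    content: str       # what kind of information the system owns
    storage: str       # where the data physically resides
    typical_rate: str  # how often new records appear

PLANT_SOURCES = [
    DataSource("LIMS", "raw material test results", "LIMS database", "per lot"),
    DataSource("SCADA/historian",
               "process variables (temperature, airflow, pressure)",
               "plant historian", "every second"),
    DataSource("MES", "electronic batch record, alarms and events",
               "relational database", "every few minutes"),
    DataSource("CAPA", "exceptions and deviations",
               "relational database", "hourly"),
]

for src in PLANT_SOURCES:
    print(f"{src.system}: {src.content} -> {src.storage} ({src.typical_rate})")
```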

When attempting to analyze all of the disparate data in plant systems and to begin building prediction models that span the entire process, a challenge arises. Each enterprise system handles and stores its data uniquely; the frequency, format, and location of the data differ, making a holistic view of the process difficult to assemble. This challenge is neither new nor unique to PAT. However, the PAT initiative is driving immense change in the industry, and the goal of real-time product release requires that a significant amount of data from multiple systems be analyzed.

An example is the frequency at which data is stored. A SCADA system may send process data to a plant historian every second, so variables like temperature, airflow, and pressure are stored every second in the historian. The MES may receive alarm information about the process every few minutes and store it in a relational database, while the CAPA system manages deviations hourly, also storing its data in a relational database.

When all of this data is put into tabular format, it is easy to see that significant gaps exist. This makes it difficult to assemble the data into a complete view of the process from beginning to end. However, systems are beginning to emerge that can manage all of this data and help make sense of it.
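A minimal sketch of the frequency problem, using pandas purely for illustration (the tag names, timestamps, and values are invented): second-level historian data is joined with a minute-level MES alarm record, the gaps appear as missing values, and an as-of merge carries the slower data forward so every historian row has the context that was in effect at that moment.

```python
# Minimal sketch of aligning data stored at different frequencies.
# All tag names, timestamps, and values are invented for illustration.
import pandas as pd

# Historian: process variables arriving every second.
historian = pd.DataFrame({
    "time": pd.date_range("2007-09-01 08:00:00", periods=6, freq="1s"),
    "temperature_C": [60.1, 60.2, 60.2, 60.4, 60.5, 60.6],
})

# MES alarms: one record every few minutes (here, a single known record).
alarms = pd.DataFrame({
    "time": pd.to_datetime(["2007-09-01 08:00:02"]),
    "alarm": ["DRYER_HIGH_TEMP"],
})

# A plain left join on time leaves gaps (NaN) wherever the slower
# system has no record for that second.
combined = historian.merge(alarms, on="time", how="left")

# merge_asof carries the most recent alarm forward, giving every
# historian row the alarm context that was active at that moment.
aligned = pd.merge_asof(historian.sort_values("time"),
                        alarms.sort_values("time"),
                        on="time")

print(combined)
print(aligned)
```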

Meeting The Challenges

Organizations have responded to the PAT data challenges faced by the Life Sciences industry by investing in manufacturing middleware solutions. A comprehensive manufacturing middleware solution built on standards like XML, S88 and S95 is a viable approach to access, align and analyze the data that exists in disparate enterprise systems. These solutions allow Life Sciences manufacturers to leverage their diverse installed base, turn existing data into process intelligence, and develop standard pharmaceutical manufacturing process and equipment components once for reuse.
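As one illustration of the "develop once and reuse" idea, the sketch below models a generic S88-style unit-operation template in plain Python. The class, phase, and parameter names are assumptions made for this example, not part of any vendor's middleware.

```python
# Sketch of an S88-style reusable unit-operation template.
# Class, phase, and parameter names are assumptions made for this example.
from dataclasses import dataclass, field

@dataclass
class PhaseTemplate:
    name: str                        # e.g. "Heat", "Dry", "Cool"
    parameters: dict = field(default_factory=dict)

@dataclass
class UnitOperationTemplate:
    name: str                        # e.g. "Fluid Bed Drying"
    phases: list = field(default_factory=list)

    def instantiate(self, unit_id, overrides=None):
        """Create a concrete recipe fragment for a specific unit."""
        overrides = overrides or {}
        return {
            "unit": unit_id,
            "operation": self.name,
            "phases": [
                {"phase": p.name,
                 "parameters": {**p.parameters, **overrides.get(p.name, {})}}
                for p in self.phases
            ],
        }

# Define the drying operation once...
drying = UnitOperationTemplate(
    "Fluid Bed Drying",
    phases=[PhaseTemplate("Heat", {"target_temp_C": 60}),
            PhaseTemplate("Dry", {"target_LOD_pct": 2.0}),
            PhaseTemplate("Cool", {"target_temp_C": 30})],
)

# ...and reuse it on different dryers, overriding only what differs.
print(drying.instantiate("DRYER-01"))
print(drying.instantiate("DRYER-02", {"Heat": {"target_temp_C": 55}}))
```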

The manufacturing middleware approach acts as the glue between manufacturing and enterprise applications, allowing data to be accessed and correlated across systems. Standards-based data models provide context to otherwise meaningless and unrelated data. The approach provides an integrated environment combining modeling tools for design and analysis, a process analyzer interface, process control and optimization, robust storage capability, workflow management, batch execution, operator work instructions, and knowledge management tools.
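A small sketch of what "providing context" can look like in practice, with every name (tag, batch, unit, phase) invented for the example: a bare historian sample is meaningless on its own, but becomes useful once the data model attaches the product, batch, unit, and phase it belongs to.

```python
# Sketch of attaching S95-style context to a raw historian sample.
# Tag, batch, unit, and phase names are invented for this example.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class RawSample:
    tag: str
    timestamp: datetime
    value: float

@dataclass
class ContextualizedSample:
    raw: RawSample
    product: str
    batch_id: str
    unit: str
    phase: str

def add_context(sample, batch_context):
    """Look up which product/batch/unit/phase was active when the sample was taken."""
    ctx = batch_context(sample.tag, sample.timestamp)
    return ContextualizedSample(sample, **ctx)

# Stand-in for the middleware's data model lookup.
def demo_batch_context(tag, timestamp):
    return {"product": "Product A", "batch_id": "B-1234",
            "unit": "DRYER-01", "phase": "Dry"}

raw = RawSample("TT-101.PV", datetime(2007, 9, 1, 8, 0, 2), 60.2)
print(add_context(raw, demo_batch_context))
```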

Critical to the success of any PAT deployment is access to enterprise data of all types, together with its context, and to the analysis tools that transform this data into information used to make decisions about product and process quality. A critical element of this approach is a standards-based, hardware- and software-independent, configurable, common PAT software platform.

Functionality required for PAT runs as application services integrated with the manufacturing middleware, including data model and user interface integration, providing a solution that can be tightly coupled with product, process and regulatory data while supplying the framework necessary for continuous improvement.

Consider, for example, the manufacturing middleware application accessing data from several enterprise systems. The data resident in each system remains in its native system, while a standard medium like OPC or XML is used to present it. An off-line modelling tool is used for data analysis activities ranging from simple qualitative analysis to multivariate spectral calibration, to modelling of batch unit operations, to mapping of complete production lines. The modelling tool is used to build the calibration model that runs on-line for real-time prediction, and it can also be used to develop models that monitor batch performance for a single unit operation or a complete equipment train. A PAT executive application interfaces with and sequences all the other capabilities in the PAT solution, including closed-loop feedback control, and provides configuration and connectivity to instruments from multiple vendors. Both plant historian and RDB capability are provided for storage of the new predicted values, and an S95 data model provides context to both the raw data and the predicted data.
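As an illustrative stand-in for the multivariate spectral calibration step described above (the data is synthetic, and scikit-learn's PLSRegression is simply one common modelling choice, not the product described in the article), a model is fit off-line on reference spectra with lab-measured values and then applied to a new spectrum to predict a quality attribute on-line.

```python
# Illustrative multivariate spectral calibration (PLS) on synthetic data.
# scikit-learn is used here only as a stand-in for an off-line modelling tool.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)

# Off-line: reference spectra (rows) with lab-measured moisture content.
n_samples, n_wavelengths = 40, 200
true_loading = rng.normal(size=n_wavelengths)
moisture = rng.uniform(1.0, 5.0, size=n_samples)   # % LOD from the lab
spectra = np.outer(moisture, true_loading) + rng.normal(
    scale=0.1, size=(n_samples, n_wavelengths))

calibration = PLSRegression(n_components=3)
calibration.fit(spectra, moisture)

# On-line: a new spectrum arrives from the process analyzer and the model
# predicts the quality attribute, which would then be stored with its
# batch context in the historian/RDB.
new_spectrum = 2.5 * true_loading + rng.normal(scale=0.1, size=n_wavelengths)
predicted_moisture = calibration.predict(new_spectrum.reshape(1, -1))
print(f"Predicted moisture: {predicted_moisture.ravel()[0]:.2f} % LOD")
```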

By Bart Reitter, Life Sciences Management, GE Fanuc Automation

Title Annotation: Pharma IT
Author: Reitter, Bart
Publication: Pharmaceutical Processing
Date: Sep 1, 2007
