
Data integrity in a nutshell: industry must take bold steps to assure the data used for drug quality decisions is trustworthy.

Regulatory inspectors have started digging much deeper into data, no longer accepting batch release data and supportive testing at face value. Worse, this effort is justified: they have cited a number of firms for violations of data integrity (1-4), the most fundamental bond of trust between manufacturers and the regulators who inspect them. Industry must take bold steps to assure the data used for drug quality decisions is trustworthy, from its creation to the present. This requires an integrity mindset.


Adopting a data integrity mindset is not difficult, but it requires looking with a different focus. As scientists, we instinctively focus on methods to assure their scientific validity. This is important, but it neglects the product of our efforts--data. At the end of the day, it is all about the data. Neglect data integrity and brilliant science loses its worth: what value do you place on an analytical result that you do not trust?

A data integrity mindset is simple to adopt in your business. Just follow the data: how it is created, modified, combined, calculated, reported and retained--the lifecycle of data (see Chart 1 for an example). Identify the people (by role) who perform each important transaction, and the particular places where someone could make unapproved changes. Think critically: if there were invalid data, created by "fat fingers" or even fraud, how would you detect it? Assess risks, then allocate resources to reduce high-risk actions to acceptable levels.
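The "follow the data" exercise above can be sketched in code. This is a hypothetical illustration only--the step names, roles and scoring weights are assumptions for the example, not part of any published method--but it shows the core idea: score each lifecycle transaction higher when it touches high-impact data without an audit trail, then work the riskiest steps first.

```python
# Hypothetical risk-ranking sketch for a data lifecycle assessment.
# Step names, roles, and the scoring weights are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class LifecycleStep:
    name: str            # e.g. "create", "calculate", "report"
    role: str            # who performs the transaction
    audit_trailed: bool  # is there verifiable evidence of changes?
    impact: int          # 1 (low) .. 5 (high) impact on reported results

def risk_score(step: LifecycleStep) -> int:
    # Undetectable changes to high-impact data carry the highest risk,
    # so steps without an audit trail are weighted more heavily.
    return step.impact * (3 if not step.audit_trailed else 1)

steps = [
    LifecycleStep("record raw observation", "analyst", False, 5),
    LifecycleStep("calculate result", "analyst", True, 5),
    LifecycleStep("archive report", "records", True, 2),
]

# Allocate resources to the riskiest transactions first.
prioritized = sorted(steps, key=risk_score, reverse=True)
for s in prioritized:
    print(f"{s.name} ({s.role}): risk {risk_score(s)}")
```

Here the manual raw observation outranks everything else--which is exactly the point the next paragraph makes.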


Your biggest integrity risks are your manual processes. Why? Because there is no security enforcement and no audit trail. People can create a value, record it, and leave no verifiable evidence of falsification. Consider real-time verification of manual, important data observations; even better, automate processes so data generation is not controlled by the user. Automation improves integrity: it can enforce access through security controls, and may be configured to create audit trails that provide accountability for changes in important data.
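A minimal sketch of what automation buys you: the system, not the user, writes the audit record for every change, attributing it to an authenticated user with a timestamp. The class and field names below are illustrative assumptions, not the interface of any real LIMS or chromatography system.

```python
# Illustrative sketch: an audited value whose change history is written
# by the system itself, so the user cannot alter data without a trace.
import datetime

class AuditedValue:
    def __init__(self, name, value, user):
        self.name = name
        self.value = value
        # The creation event is itself the first audit trail entry.
        self.trail = [(datetime.datetime.now(datetime.timezone.utc),
                       user, "created", value)]

    def change(self, new_value, user, reason):
        # The audit record is appended before the value changes;
        # there is no code path that changes the value silently.
        self.trail.append((datetime.datetime.now(datetime.timezone.utc),
                           user, reason, new_value))
        self.value = new_value

assay = AuditedValue("assay_pct", 98.7, user="jdoe")
assay.change(99.1, user="jdoe", reason="reintegration approved by supervisor")
for when, who, why, what in assay.trail:
    print(who, why, what)
```

In a real system the username would come from the security layer, not a function argument--the point is that accountability is structural, not voluntary.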


For automated systems, integrity is about metadata--those fields that prove data was created and managed properly. In other words, it is mostly about your important audit trails. Not all audit trails are created equal: a login/logout history is not as important as a history of changes to the instrument configuration, or changes made to a result value after its creation. You have to understand your system and identify the audit trails that directly or indirectly support data quality. Remember, these records are often not called 'audit trails' in applications. Don't be fooled: look at the function, not the title.
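One way to act on that distinction is to classify audit trail event types by their impact on data quality, and surface only the important ones for review. The event type names below are hypothetical--real systems use their own vocabularies, which is the article's point about looking at function rather than title.

```python
# Sketch: separating data-quality-relevant audit events from routine
# system housekeeping. Event type names are illustrative assumptions.
DATA_QUALITY_EVENTS = {
    "result_modified",
    "instrument_config_changed",
    "integration_parameters_changed",
}
ROUTINE_EVENTS = {"login", "logout", "report_printed"}

def events_for_review(trail):
    """Keep only events that directly or indirectly affect data quality."""
    return [e for e in trail if e["type"] in DATA_QUALITY_EVENTS]

trail = [
    {"type": "login", "user": "jdoe"},
    {"type": "result_modified", "user": "jdoe", "detail": "98.7 -> 99.1"},
    {"type": "logout", "user": "jdoe"},
]
print(events_for_review(trail))  # only the result_modified event survives
```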


Once you know the important audit trails, you must do everything in your power to make them secure. If someone manipulates the computer clock that provides the timestamps, the history is meaningless. Control the clock, and have a process to check and correct it at a defined frequency. You must know who performs important actions like result calculations. Shared accounts destroy accountability and must be replaced with individual accounts. Don't neglect human behaviors: people will share passwords, post them on sticky notes, and circumvent secure systems unless they are taught to behave responsibly.
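One well-known technique for making a history tamper-evident is hash chaining: each record's hash covers the previous record's hash, so editing or deleting any entry breaks every hash after it. The sketch below is a hypothetical illustration of the idea, not a substitute for the system-level security controls the paragraph above describes.

```python
# Sketch of a tamper-evident audit trail using a SHA-256 hash chain.
# A purely illustrative example; field names are assumptions.
import hashlib
import json

def append_record(chain, record):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    payload = json.dumps(record, sort_keys=True)
    digest = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
    chain.append({"record": record, "hash": digest})

def verify(chain):
    """Recompute every hash; any edit or deletion breaks the chain."""
    prev_hash = "0" * 64
    for entry in chain:
        payload = json.dumps(entry["record"], sort_keys=True)
        expected = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
        if entry["hash"] != expected:
            return False
        prev_hash = entry["hash"]
    return True

chain = []
append_record(chain, {"user": "jdoe", "action": "result_calculated", "value": 99.1})
append_record(chain, {"user": "asmith", "action": "result_approved"})
assert verify(chain)
chain[0]["record"]["value"] = 90.0   # tampering with an earlier record...
assert not verify(chain)             # ...is now detectable
```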


Systems must be validated for your intended use (configuration), so think carefully about your business practices and data when defining requirements--all other validation activities flow from them. What data needs a history file? What events directly impact data quality? Be sure to test those important history records so you know the system works as you intended. Read up on current validation approaches, such as GAMP (5), to learn from the wisdom--and mistakes--of other practitioners.
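Testing a history requirement can be as direct as asserting that the record appears. The sketch below tests a hypothetical requirement--"editing a stored result must create an audit record naming user and reason"--against a stand-in class, since the real system under test depends on your application.

```python
# Sketch of one validation-style test for a history requirement.
# ResultStore is a hypothetical stand-in for the system under test.
class ResultStore:
    def __init__(self):
        self.results = {}
        self.audit = []

    def set_result(self, sample, value, user, reason="initial entry"):
        if sample in self.results:
            # An edit of an existing result must leave an audit record.
            self.audit.append({"sample": sample, "user": user,
                               "reason": reason,
                               "old": self.results[sample], "new": value})
        self.results[sample] = value

def test_edit_creates_audit_record():
    store = ResultStore()
    store.set_result("S-001", 98.7, user="jdoe")
    store.set_result("S-001", 99.1, user="jdoe",
                     reason="approved reintegration")
    assert len(store.audit) == 1
    assert store.audit[0]["user"] == "jdoe"
    assert store.audit[0]["reason"] == "approved reintegration"

test_edit_creates_audit_record()
```

The value of writing the requirement as an executable test is that it fails loudly if a configuration change later disables the history record.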


Configuration directly impacts data and is a critical part of validation. Configuration can control where files are stored, who accesses them, and even the events that create audit trail records. It can be a steep learning curve, but you must understand the impact of each configuration option and select them carefully. Justify each option in writing. Written justification will:

1. allow recovery from disasters

2. permit informed changes, if needed

3. explain why this configuration was chosen over the alternatives

Incorrect assumptions about configuration can expose files to unauthorized change or deletion.
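One practical way to keep each option and its written justification together is to record them side by side in a machine-readable form. The option names, values and justifications below are invented for illustration--real configuration keys are system-specific.

```python
# Sketch: configuration options stored alongside their written
# justifications. Option names and paths are hypothetical examples.
CONFIGURATION = {
    "audit_trail.result_changes": {
        "value": "enabled",
        "justification": ("Changes to result values directly impact "
                          "reported quality data."),
    },
    "file_storage.location": {
        "value": r"\\secure-server\chromatography",
        "justification": ("Server share is backed up nightly and "
                          "write-restricted to the acquisition account."),
    },
}

def effective_settings(config):
    """Strip justifications to get the settings actually applied."""
    return {key: opt["value"] for key, opt in config.items()}

for key, opt in CONFIGURATION.items():
    print(f"{key} = {opt['value']}  # {opt['justification']}")
```

Because the justification travels with the value, a disaster-recovery rebuild or a proposed change starts from the documented reasoning, not from memory.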


Retention of electronic raw data is more than securing the result file(s) over their useful life. Those important audit trails supporting data integrity must also be managed like result files--they are equally important! Every move, every transformation puts data quality at risk, so secure important records quickly and move them only when the risk of leaving them in place exceeds the risk of moving them.
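When a move is unavoidable, a checksum comparison before and after gives evidence the move itself did not corrupt the record. A minimal sketch, assuming simple file-system storage; the function names and paths are illustrative.

```python
# Sketch: move an archived record only after verifying its checksum,
# and delete the original only after the copy is proven identical.
import hashlib
import os
import shutil
import tempfile

def sha256_of(path):
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(8192), b""):
            digest.update(block)
    return digest.hexdigest()

def verified_move(src, dst):
    before = sha256_of(src)
    shutil.copy2(src, dst)          # copy2 preserves file timestamps
    if sha256_of(dst) != before:
        os.remove(dst)
        raise IOError("checksum mismatch; original retained at %s" % src)
    os.remove(src)                  # remove original only after verification
    return before

# Demonstration with a temporary file standing in for a raw data file.
workdir = tempfile.mkdtemp()
src = os.path.join(workdir, "result.dat")
dst = os.path.join(workdir, "archive.dat")
with open(src, "wb") as f:
    f.write(b"chromatogram data")
verified_move(src, dst)
```

Retaining the checksum itself alongside the archived file extends the same evidence across the whole retention period.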


Raw data must be reviewed prior to release, whether paper or electronic. For electronic systems, leave the paper behind and review within the system itself. Review those important audit trails (the ones you identified during validation) along with the test results. Make your review efficient: when purchasing, look for systems that tell you when important audit trail entries exist (e.g., indicator flags).


Once the data integrity mindset begins to show improvements in data quality, it must become more than a phase in the minds of a few--it must become part of the culture. Every change, every incident, every new process must be scrutinized for its risks to data integrity. New people must be trained to adopt this mindset. Keep the data integrity light burning in your organization: stimulate discussion in team meetings with a recent regulatory citation, a case study, or a recent situation in another workgroup.


Based on current inspection practices, industry should now expect regulators to dive into their data and challenge its validity. This should not be cause for alarm, but a reminder that we must be diligent in protecting our "other" product: data. Data integrity assures product quality. Product quality assures patient safety. It starts with data integrity.


(1.) Wockhardt Limited (18-JUL-2013) Warning Letter http://www.fda.gov/ICECI/EnforcementActions/WarningLetters/2013/ucm361928.htm

(2.) Posh Chemicals Private Limited (02-AUG-2013) Warning Letter http://www.fda.gov/ICECI/EnforcementActions/WarningLetters/2013/ucm365427.htm

(3.) RPG Life Sciences Limited (28-MAY-2013) Warning Letter http://www.fda.gov/ICECI/EnforcementActions/WarningLetters/2013/ucm355294.htm

(4.) Fresenius Kabi Oncology Ltd (01-JUL-2013) Warning Letter http://www.fda.gov/ICECI/EnforcementActions/WarningLetters/2013/ucm361553.htm

(5.) GAMP Good Practice Guide: A Risk-Based Approach to GxP-Compliant Laboratory Computerized Systems, 2nd ed. ISPE Publications, 2012

Mark Newton is a Consultant-QA, Global Quality Laboratories at Eli Lilly and Company.
COPYRIGHT 2013 Advantage Business Media

Title Annotation: INFORMATICS
Author: Newton, Mark E.
Publication: Scientific Computing
Geographic Code: 1USA
Date: Dec 1, 2013
