Quality control in the new environment.
The reliability of laboratory information has increased dramatically in the last three decades. Manufacturers of instruments and reagents have improved the precision and accuracy of their analytic systems; there are now programs that accredit and license most conventional laboratories; and laboratory professionals have developed quality assurance guidelines for preventive maintenance, record keeping, proficiency testing, and day-to-day quality control.
All of these efforts have markedly improved the precision and accuracy of information produced by contemporary laboratories. It is difficult and perhaps unnecessary to dissect the contribution of each component to the overall outcome. In the aggregate, however, these improvements required, and continue to require, a commitment of considerable financial resources.
A significant portion of most laboratories' operating budgets is committed to assuring the reliability of reported results. Laboratory professionals have accepted the importance of these quality assurance programs and continue to invest in these activities. At the same time, hospital and laboratory administrators are asking us to look for opportunities to reduce operating costs, especially for activities that do not generate revenue. They commonly ask: "How much quality assurance is enough?"
Today's analytic systems are, in general, inherently more precise and accurate than previous technology, with more stable electronics, better optical systems, better specimen and reagent metering systems, and improved computer control of the procedures. Reagent systems have also evolved and provide much more accurate and precise information than was previously available.
In spite of these improvements, we know there are still "good" laboratories and "poor" laboratories. There are also some analytic systems with an increased tendency to produce unreliable results.
How can we minimize quality assurance expenditures by "good" laboratories while still giving them early warning of impending problems before patient care decisions are compromised? How can we develop a system that helps poorly performing laboratories recognize when they have problems and need to ask for help? Can we identify particular analytic systems or groups of laboratories that are more prone to error, and that therefore require an increased investment to assure the reliability of their results? Today's health system managers face the challenge of reconsidering all expenditures. They must try to realize economic savings without jeopardizing the quality of patient care.
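The "early warning" idea above is usually implemented with statistical control limits: a laboratory establishes a mean and standard deviation from repeated runs of a control material, then flags any later control result that falls outside the expected range. The sketch below is our illustration of that general approach, not a protocol from this series; the control values and the 2-SD warning limit are hypothetical choices for the example.

```python
from statistics import mean, stdev

def control_limits(baseline, k=2.0):
    """Compute (low, high) warning limits from baseline control runs,
    set at the mean plus/minus k standard deviations."""
    m = mean(baseline)
    s = stdev(baseline)
    return m - k * s, m + k * s

def flag_result(value, limits):
    """Return True when a control result falls outside the warning
    limits, signaling that the run should be reviewed."""
    low, high = limits
    return not (low <= value <= high)

# Hypothetical example: 20 baseline runs of a glucose control (mg/dL).
baseline = [98, 101, 99, 100, 102, 97, 100, 99, 101, 100,
            98, 102, 99, 100, 101, 99, 100, 98, 101, 100]
limits = control_limits(baseline)

print(flag_result(110, limits))  # a clearly aberrant run is flagged
print(flag_result(100, limits))  # a result near the mean is accepted
```

In practice, laboratories layer additional rules on top of a single-limit check (trends, shifts, rules spanning multiple control levels), which is part of what makes the "how much is enough" question a genuine cost-benefit decision.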
This series of articles will discuss quality control protocols and procedures, and their relevance in today's cost-conscious health care environment. The various authors will be laboratorians who have extensive experience with quality control protocols, both as innovators and as practitioners.
We begin the series with the following article on quality control testing in the chemistry laboratory. Other articles will deal with specific applications in hematology, microbiology, ligand assay, and therapeutic drug monitoring. Assuring the reliability of testing performed close to the patient in hospitals and physicians' offices will also be covered. A discussion of quality control materials and statistics will provide a contemporary update of these fundamental QC components. The last article in the series will pull together threads from the other articles and consider how the management of quality must evolve as clinical laboratory technology changes.
We hope this series will help you rethink your approach to assuring the reliability of test results used in patient care while minimizing the resources required to achieve that level of reliability.
Authors: Belsey, Richard E.; Baer, Daniel M.
Publication: Medical Laboratory Observer
Date: Sep 1, 1986