
A microcomputer network for interlaboratory QC.

A microcomputer network has significantly improved quality assurance at our 1,000-bed medical center by knitting together several widely dispersed laboratories that perform similar tests.

The network enables these labs to quickly report and compare results on patient specimens shared for monitoring purposes. Modems and telephone lines link the microcomputers.

Complete summaries of results can be displayed within two hours of the time specimens went into distribution. This kind of speed is important in laboratories that rely on specimen exchanges and comparative data to check instrument calibration, as our hematology labs do. Before we turned to microcomputers, the process of sampling, testing, reporting by phone, manually calculating means and other statistics, and circulating summary sheets took several more hours on the day shift, and longer at night.

Any capsule description of our system raises preliminary questions. Why are a number of labs performing similar tests? What makes hematology so dependent on patient specimens for quality control? Let's look at the background.

Our medical center consists of a new 616-bed acute care hospital about one block north of an older facility that contains 353 inpatient beds, a 39-bed eye center, and a substantial multiclinic outpatient service. A large decentralized laboratory organization serves both inpatients and outpatients.

For rapid turnaround on Stat requests as well as on some frequently ordered routine tests, three small 24-hour general purpose labs are located close to the patient areas. Each serves several hundred beds.

Large main or core labs, one for each discipline, are located farther away. They offer extensive menus of more complex tests, many of which are reported the next day. In addition, there are departmental labs (run by the departments of medicine, surgery, and pediatrics, for example), research labs, and special clinic labs that perform some specialized procedures.

So the hospital laboratory service is spread out and to some extent duplicative. Six labs perform blood cell counts; five, electrolytes; four, urinalyses; and three, blood gas analyses. If a patient is admitted through a clinic, undergoes surgery, and then spends a few days recovering, the chart will carry lab studies from at least three separate locations. This could conceivably confuse a clinician trying to interpret results.

To monitor the various laboratories and promote data consistency, the system of interlaboratory specimen exchanges was developed. Multiple samples are distributed daily for hematology and blood gas analysis and weekly for chemistry. We will concentrate here on the hematology exchange system and microcomputer network; the others are similar in concept and application.

Granted, commercial control materials are widely used in hematology labs to monitor instrument calibration and performance. A general problem is that manufacturer-assigned values on commercial controls are blindly accepted in many labs without prior analysis of each new lot on a properly functioning "in control" instrument. Values recovered from such an analysis should closely match the manufacturer's published values for the particular instrument used.

A significant variation from insert values suggests either an instrument problem or a product problem. If reference methods are readily available for measuring hemoglobin, red and white blood cell counts, packed cell volume, and platelet count, one can quickly determine whether the instrument or the commercial material is at fault.

Two specific problems related to the use of commercial controls became apparent when Ortho ELT-8 multichannel hematology instruments were introduced in our labs in 1979. First, the commercial control materials developed for Coulter S electronic impedance instrumentation were not "seen" in the same way by Ortho's laser light scatter counters: the results differed. This was especially evident in the hematocrit and mean corpuscular volume values recovered on the same lot of stabilized blood cells.

Figure I, based on the 1981 CAP hematology survey, documents the variability of hematocrit and MCV determinations. Since that time, the variability has decreased, quite possibly due to improvements in the manufacture of stabilized red cell preparations. We should note that the interlaboratory precision of hemoglobin and red cell measurements has consistently been very good.

The other specific problem was that many earlier commercial controls developed for the laser instruments did not have data for possible use with electronic impedance instruments. There were no cross correlations between controls for the two systems.

We needed a control material that would work satisfactorily on both the Coulter S and the new Ortho ELT-8. Data from the two instruments had to be comparable since a patient's blood might be tested on either or both during the course of a hospital stay.

Fresh blood specimens filled the bill. We decided to run six different patient specimens on each instrument (three specimens on the day shift and three on the evening shift), trying to cover the range of expected counts from low abnormal through normal to high abnormal levels. In the specimen exchange, we established acceptable limits around mean values for leukocyte counts, hemoglobins, hematocrits, MCVs, and platelet counts (Figure II). Leukocyte limits were ±10 per cent, for example. The CAP participant summary indicates quite similar interlaboratory variability for these tests, and we have by and large retained our early limits up to the present time.
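
For readers who want the arithmetic spelled out, the sketch below shows how an exchange result can be checked against the acceptable band around the group mean. It is written in modern Python purely for illustration (our own system, described later, runs on dBase II); the ±10 per cent leukocyte limit is the one cited above, while the other limits and all sample values are hypothetical.

```python
# Illustrative sketch, not the actual dBase II system: flag labs whose exchange
# results fall outside the acceptable band around the group mean.
ACCEPTABLE_LIMITS = {   # allowed deviation as a fraction of the group mean
    "wbc": 0.10,        # leukocyte count: +/-10 per cent (from the text)
    "hgb": 0.03,        # hemoglobin: assumed limit
    "hct": 0.04,        # hematocrit: assumed limit
    "mcv": 0.04,        # MCV: assumed limit
    "plt": 0.15,        # platelet count: assumed limit
}

def out_of_limits(results, analyte):
    """results: {lab: value} for one specimen and one analyte.
    Returns the labs whose values fall outside the acceptable band."""
    mean = sum(results.values()) / len(results)
    tolerance = ACCEPTABLE_LIMITS[analyte] * mean
    return {lab: v for lab, v in results.items() if abs(v - mean) > tolerance}

# Hypothetical leukocyte counts (x 10^9/L) on one shared specimen
wbc_results = {"core": 7.2, "GP-1": 7.4, "GP-2": 7.1, "GP-3": 6.2}
print(out_of_limits(wbc_results, "wbc"))   # {'GP-3': 6.2} -- more than 10% below the mean
```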

Our hematology labs also use moving average data and within-day repeat testing of specimens to monitor instrument calibration. All of this spells decreased emphasis on use of commercial controls.
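
The moving-average idea can be sketched in a few lines. The fragment below (again an illustrative modern Python sketch, not part of our system) tracks the running mean of MCV over a sliding window of recent patient results and flags drift away from a target; the window size, target, and alert limit shown are assumptions, not our actual settings.

```python
# Minimal sketch of moving-average monitoring of patient MCVs (assumed settings).
from collections import deque

def moving_average_alerts(mcv_values, window=20, target=90.0, limit=3.0):
    """Yield (index, moving_mean, in_control) once the window is full.
    window, target, and limit are illustrative values only."""
    recent = deque(maxlen=window)
    for i, v in enumerate(mcv_values):
        recent.append(v)
        if len(recent) == window:
            mean = sum(recent) / window
            yield i, mean, abs(mean - target) <= limit
```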

Figure III illustrates the current hematology instrument configuration in the various laboratories taking part in the specimen exchange. Early each morning, the core hematology laboratory distributes aliquots of three randomly selected patient specimens. Because the volume of specimen is limited, we transport the aliquots from one lab to the next.

After testing, results are transmitted by the general purpose labs and some of the others via microcomputer to the core lab; the rest, lacking microcomputers, call in their findings. The microcomputer's fast display of means and variances gives participating labs a real-time comparison. As for labs that are not in the network, they are immediately phoned if any of their results appear to be out of line and told that they may have an instrument problem.

A laboratory with results beyond acceptable limits may choose to remix and reassay the specimens in question or request additional specimens for comparison. Extreme outliers, exceeding predefined parameters, are edited out of the data base because they unduly skew the means.
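
The calculation behind the displayed means and variances, including the outlier editing, can be sketched as follows. This is a modern Python rendering rather than our actual dBase II command files, and the 25 per cent cutoff from the preliminary mean is an assumed stand-in for the predefined parameters mentioned above.

```python
# Sketch of the per-specimen summary step: trim extreme outliers, then report
# the mean and variance of the retained results (cutoff value is an assumption).
import statistics

def summarize_specimen(results, outlier_fraction=0.25):
    """results: {lab: value} for one specimen and one analyte."""
    preliminary_mean = statistics.mean(results.values())
    cutoff = outlier_fraction * preliminary_mean
    kept = {lab: v for lab, v in results.items()
            if abs(v - preliminary_mean) <= cutoff}
    values = list(kept.values())
    return {
        "mean": statistics.mean(values),
        "variance": statistics.variance(values) if len(values) > 1 else 0.0,
        "excluded": sorted(set(results) - set(kept)),
    }

# Hypothetical leukocyte counts; the 'night' value is edited out as an extreme outlier.
daily = {"core": 12.1, "GP-1": 11.8, "GP-2": 12.4, "night": 17.5}
print(summarize_specimen(daily))
```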

In addition to the computer display of means and variances, a printed summary sheet of the day shift exchange is available by mid-afternoon. This summary is distributed with the three specimens that are shared by labs operating during the second and third shifts. Conversely, the day shift receives a summary of the previous night's results. In this way, we can check for drifts over a period of time.

Part of our microcomputer network is illustrated on the opening pages of this article. It consists of IBM Personal Computers and Radio Shack Color Computers. The core hematology laboratory has the IBM PC XT. Other labs with microcomputers are connected by phone lines to the XT and can operate it, thanks to the XT's Remote Access program. They thus can not only transmit their data but also review the entries of other labs for each specimen.

The data management software used to build and manipulate the files is dBase II. A separate series of dBase command files, not accessible remotely, lets the coordinator of the interlaboratory specimen exchange system edit the data base, perform appropriate calculations, and print daily and monthly summaries.
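
To give a concrete, though hypothetical, picture of the data involved, the sketch below shows the kind of record each laboratory enters for an exchange specimen and how one row of the printed summary might be laid out. The field names, units, and column formatting are assumptions for illustration (again in modern Python), not the actual dBase II file structure.

```python
# Hypothetical exchange-result record and one formatted summary row.
from dataclasses import dataclass

@dataclass
class ExchangeResult:
    specimen_id: str   # which of the day's three shared specimens
    lab: str           # reporting laboratory
    shift: str         # "day", "evening", or "night"
    wbc: float         # leukocyte count, x 10^9/L
    hgb: float         # hemoglobin, g/dL
    hct: float         # hematocrit, per cent
    mcv: float         # mean corpuscular volume, fL
    plt: float         # platelet count, x 10^9/L

def summary_line(r: ExchangeResult) -> str:
    """One lab's row in the daily summary printout (assumed layout)."""
    return (f"{r.specimen_id:<8}{r.lab:<10}{r.shift:<8}"
            f"{r.wbc:6.1f}{r.hgb:6.1f}{r.hct:6.1f}{r.mcv:6.1f}{r.plt:7.0f}")

print(summary_line(ExchangeResult("A", "GP-1", "day", 7.4, 13.9, 41.2, 88.5, 265)))
```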

The dBase management system is very user-friendly. None of us has had much programming experience, yet we were able to construct all the necessary command and data files.

Because we are not centralized, many of our laboratories perform the same tests, and this duplication originally posed a major problem, making it harder to attain consistently good quality in laboratory testing. With the help of personal computers and commercially available software, we turned this problem into an asset: outstanding quality assurance on a real-time basis.
COPYRIGHT 1984 Nelson Publishing
No portion of this article can be reproduced without the express written permission from the copyright holder.

Article Details
Author: Stewart, Charles E.; Dotson, Mary Ann; Koepke, John A.
Publication: Medical Laboratory Observer
Date: May 1, 1984