
A QA plan to monitor charted lab results.

At a roundtable discussion among administrators from various laboratories in our region last year, most participants were surprised to learn that we had established a review process to assure the quality of charted results. We feel strongly that output control should be of concern to all lab areas that generate computerized information.

The JCAHO requires laboratories to evaluate the appropriateness of their services and to correct the problems they identify. We have interpreted this rule to include review of the charted report made available to the clinician. Quality control of laboratory tests is only part of the broader scheme of quality assurance that the clinical laboratory is expected to perform. In this article we will share the method we devised to monitor the quality of charted results issued by our lab.

Our clinical hematology department is located within a large medical center affiliated with the University of Cincinnati. The monitor we instituted was designed to review the technical aspects of reports independent of other monitoring activities within the hospital, such as utilization review, physician committees, and medical records. All results generated by our department are entered into the computer either with computer-interfaced instrumentation or manually. Major aspects of the monitor, therefore, are concerned with the mechanics of data handling in the laboratory computer.

* The report. We are fortunate to have a staff dedicated solely to the maintenance of our computer. A representative from the technical side of the lab section works with these professionals to test all newly implemented and newly interfaced equipment before it is put to use.

For all tests that will be maintained in computer files, formats must be established for answer types (whether results are to be reported in digital form, in preestablished phrases, or in free text), placement of reference intervals in the report, test life (the length of time the test will remain on line, permitting new entries), and many other complex determinations. After programming the test in the computer, we review a sample copy of the report to make sure it will be clear.

Figure I
Test reports chosen for one-month QA review

Type of test                    Number of reports   Hematology   Charting
                                studied             area(1)      method(2)
CBC                                    28               H            I
APTT                                   16               C/H          I
PT                                     15               C/H          I
Differential                           14               H            M
Fibrinogen                              6               C/H          M
Reticulocyte count                      4               H            M
Body fluid cell count
  and differential                      3               SH           M
ESR                                     1               H            M
FDP                                     1               C/H          M
Thrombin time                           1               C/H          M
Hemoglobin electrophoresis              1               SH           M
Serum iron                              1               SH           M
Fecal leukocyte                         1               SH           M
Malarial smear                          1               SH           M
Myoglobin                               1               SH           M
Total                                  94

(1) H: hematology; C/H: coagulation/hemostasis; SH: special hematology
(2) I: instrument interface; M: manual entry

Ease of use by physicians and nurses as well as by laboratorians calls for consistency in the design of report formats. The information provided should include no less than the standard JCAHO requirements: patient demographics, date and time of specimen collection, test results, reference intervals, interpretive comments, time the test was completed, and identification of the laboratory performing the testing. A uniform format encompassing terminology, abbreviations, symbols, and units as well as constant sequencing of test results facilitates review at a glance.
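For illustration only, a small Python sketch of such a record and format follows. The field names and the render function are hypothetical stand-ins for the elements listed above, not a description of our laboratory computer system.

# Hypothetical sketch (not the actual laboratory system): a report record
# carrying the required elements and rendered in a constant sequence.
from dataclasses import dataclass, field
from typing import List

@dataclass
class TestResult:
    name: str                  # e.g., "WBC"
    value: str                 # digital value, preestablished phrase, or free text
    units: str                 # e.g., "x10^9/L"
    reference_interval: str    # e.g., "4.0-11.0"
    comments: List[str] = field(default_factory=list)

@dataclass
class ChartedReport:
    patient_name: str
    patient_id: str
    collected: str             # date and time of specimen collection
    completed: str             # time the test was completed
    performing_lab: str        # identification of the laboratory performing the testing
    results: List[TestResult] = field(default_factory=list)

def render(report: ChartedReport) -> str:
    """Render the report with uniform terminology, units, and result order."""
    lines = [
        f"Patient: {report.patient_name}   ID: {report.patient_id}",
        f"Collected: {report.collected}   Completed: {report.completed}",
        f"Performing laboratory: {report.performing_lab}",
    ]
    for r in report.results:   # constant sequencing of test results
        lines.append(f"{r.name:<14}{r.value:>8} {r.units:<9} ({r.reference_interval})")
        lines.extend(f"    * {c}" for c in r.comments)
    return "\n".join(lines)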

* Quality control. Internal mechanisms are in place that help capture erroneous information before it can reach the chart. For example, once a day the laboratory computer center generates a list of all defined abnormals. By reading this list we can review all abnormal results for a particular test and evaluate them as a whole.
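The logic behind the defined-abnormals list can be illustrated with a short Python sketch. The data layout shown is an assumption made for illustration, not the computer center's actual program.

# Hypothetical sketch: screening a day's results against reference intervals
# to produce the defined-abnormals list.
def defined_abnormals(days_results):
    """days_results: iterable of dicts with keys
    'patient_id', 'test', 'value', 'low', and 'high'."""
    flagged = [r for r in days_results
               if r["value"] < r["low"] or r["value"] > r["high"]]
    # Sort by test so all abnormal results for a particular test
    # can be reviewed and evaluated as a whole.
    return sorted(flagged, key=lambda r: r["test"])

day = [
    {"patient_id": "A1", "test": "WBC", "value": 2.1, "low": 4.0, "high": 11.0},
    {"patient_id": "B2", "test": "WBC", "value": 7.5, "low": 4.0, "high": 11.0},
]
print(defined_abnormals(day))   # only the 2.1 WBC is listed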

Before a test result is verified, it is compared with the test result most recently entered for that patient. Although this check takes time, we anticipate that comparison review will eliminate most sources of error, including specimens collected from the wrong patient, unsatisfactory specimens, and instrument malfunctions. In addition, all quality control records and log books of manual calculations are reviewed carefully.
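This comparison is, in effect, a delta check, and its core logic can be sketched in a few lines of Python. The relative-change limit shown is hypothetical; in practice limits are set individually for each test.

# Hypothetical sketch of a comparison (delta) check; the 50% relative-change
# limit is illustrative only.
def needs_review(previous, current, max_relative_change=0.5):
    """Hold the new result for review if it differs from the most recent
    result for the patient by more than the allowed fraction."""
    if previous is None:        # no prior result on file to compare against
        return False
    if previous == 0:
        return current != 0
    return abs(current - previous) / abs(previous) > max_relative_change

# A hemoglobin falling from 14.2 to 6.8 g/dL is held for review; it may point
# to a wrong-patient draw, an unsatisfactory specimen, or an instrument problem.
print(needs_review(14.2, 6.8))    # True
print(needs_review(14.2, 13.9))   # False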

* Process. Initially, we established a simple monthly monitor (Figure I) based on the following criteria: accuracy of manually entered data, appropriate appending of comments (Figure II), legibility, and error correction reports. We review charted results from various sources. Some of the testing we perform for outside clients involves a great deal of free text reporting, an area that we have found to require the most diligent review.

When an error is discovered, a correct result is called in to the caregiver. A laboratory error report is then filed that explains the nature of the error, documents the call to the caregiver, appends the appropriate comments to the incorrect results, and requests that the incorrect results be overstruck. Once results have been verified, they cannot be deleted or changed, since the clinician may have acted upon them.
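The principle can be illustrated with a brief Python sketch: verified results are annotated and overstruck, never deleted. The record structure and function name are hypothetical, not those of our error report system.

# Hypothetical sketch of the correction principle: a verified result is never
# deleted or overwritten; the correction, comments, and the documented call
# to the caregiver are appended, and the original is flagged for overstrike.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class VerifiedResult:
    test: str
    value: float
    overstruck: bool = False
    corrected_value: Optional[float] = None
    comments: List[str] = field(default_factory=list)

def file_error_report(result: VerifiedResult, correct_value: float,
                      nature_of_error: str, caregiver: str) -> None:
    """Correct a verified result without destroying the original entry."""
    result.overstruck = True    # original stays on the chart, struck through
    result.corrected_value = correct_value
    result.comments.append(f"Corrected result: {correct_value} ({nature_of_error})")
    result.comments.append(f"Corrected result called to {caregiver}")

pt = VerifiedResult(test="PT", value=25.0)
file_error_report(pt, 12.8, "clot found after verification", "ICU charge nurse")
print(pt)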

Lab error reports may also be required after verification of completed test results. We may discover problems related to the specimen, such as clots found after verification, or receive a call informing us that the specimen was obtained from the wrong patient. After inserting the appropriate comments in the file, we make sure all steps of the correction process are completed. Comments that we routinely review include tube number used for the cell count in body fluids, specimen source, recheck comments (per established protocol), reference to manual differential (the automated diff is not reportable), critical value called, free text answers, and comments referring to corrections.

Whenever we found deficiencies in the appending of comments, we gave reminders and in-services to the staff. After some research, however, we discovered that a number of variables could account for the absence of comments in the appropriate location on the charted results. One example is the placing of comments on a battery member (component) of a test. We found that with our system, if more than one comment had to be appended to a specific battery member, such as "critical value called" and "WBC corrected for n-RBCs," the second comment overrode the first. To make sure both comments appeared, we instructed the staff to append the comment about the critical value to the test header (in this case, CBC) and the comment about the WBC to the WBC battery member.
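The placement logic can be illustrated with a hypothetical Python helper that keeps the first comment on the battery member and moves any additional comment to the test header. In practice this placement was handled by the staff following our instructions; the sketch only demonstrates the idea.

# Hypothetical helper showing the placement logic only: the first comment
# stays on the battery member; any further comment is moved to the test
# header so that neither comment is lost.
def append_comment(report, test, member, comment):
    """report: {'header_comments': {test: [comments]},
                'member_comments': {(test, member): one comment or None}}"""
    key = (test, member)
    if report["member_comments"].get(key) is None:
        report["member_comments"][key] = comment              # first comment fits here
    else:
        report["header_comments"].setdefault(test, []).append(comment)

cbc = {"header_comments": {}, "member_comments": {}}
append_comment(cbc, "CBC", "WBC", "WBC corrected for n-RBCs")
append_comment(cbc, "CBC", "WBC", "Critical value called")    # moved to the CBC header
print(cbc)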

Our latest finding concerns comments shared by two or more tests. In one case we reported a reticulocyte count and a hemoglobin electrophoresis interpretation for the same patient. We found that the comment "suggest further studies" appeared not only with the electrophoresis results, where it belonged, but also with the reticulocyte value.

Our computer system maintains an enormous database. That we have found only rare quirks we attribute to the dedication of our computer staff and to the clear guidelines we established. Even rare problems, however, must be identified.

* Continued review. We are pleased that the quality of charted test results generated by our hematology section reflects well on the professional staff. We began with a monitor intended to help us review charted reports for legibility and accuracy. We reviewed a wide menu of the various tests done in our section and believe our sampling was sufficient to make data analysis meaningful. We have been able to demonstrate that the process we instituted to correct errors is fairly efficient and workable.

With the information gathered, we divided the monitor into components (Figure III). Review of aspects found to be acceptable will be reduced to a quarterly or semiannual schedule. Aspects that fail to meet our criteria will be reviewed as often as necessary until they succeed consistently. We are optimistic about establishing more involved QA monitors as computer-generated data increase.


