
Computerization: Key to a Successful QA Program

When the Joint Commission on Accreditation of Healthcare Organizations mandated in 1987 that health care professionals monitor and evaluate the quality and appropriateness of patient care, (1) the clinical laboratory at our institution, 450-bed El Camino Hospital, was ready.

Six years earlier, shortly after arriving at the hospital, coauthor Elevitch had created the position of quality assurance manager. The QA manager would plan, implement, and maintain a departmental QA program to be integrated hospitalwide. Coauthor Lu has filled that position from the beginning. Under her direction, the clinical laboratory became the model for the hospital QA program that was implemented two years later.

Our objective in 1981 was ahead of its time. We wanted to develop quantitative indicators that would help us assess the level of quality in each clinical laboratory section relative to the services we provide the entire hospital. The computer would form the communication and information management infrastructure of our quality assurance program. We believed computerization would assure the program's success. It would:

* Standardize procedures, terminology, and data collection forms;

* Simplify clerical functions;

* Conveniently store and analyze large amounts of data; and

* Clarify our reporting formats.

The hospital information system (HIS) was programmed to track incident reports by electronic mail. Our task was to link this HIS function with the hospital case-mix system (CMS) and laboratory information system (LIS) in a microcomputer network. From this simple beginning, our quality assurance program evolved as JCAHO requirements became more systematized and microcomputer technology grew increasingly sophisticated. By 1989 we had computerized much of the JCAHO's 10-step QA process (2,3) -- just in time for that year's successful inspection. Our JCAHO inspectors admitted they were "astounded" by our system.

* QA steps. The computer assists us in achieving six of the JCAHO's 10 quality assurance steps. The remaining, more subjective steps did not require computerization.

The first step, to assign responsibility for the QA plan, was achieved with the creation of our QA manager position. For steps two and three--define the scope of patient care and identify important aspects of care--we evaluated the most important factors affecting patient care in our own department and worked with the hospital administration and committees from other hospital departments.

Steps four through seven were computerized and will be explained in this article: define indicators, define thresholds for evaluation, collect and organize data, and evaluate data. Step eight, develop a corrective action plan, was done through verbal communication at meetings and by other means as we formulated policy and procedural changes. Step nine, assess actions and document improvement, and step 10, communicate relevant information, were also part of the computer QA plan and will be discussed below.

* Define indicators. The lab's QA manager worked with key individuals from each department section and with committees from administration, nursing, and the medical staff. We delineated the scope of laboratory services as extending from bedside to benchtop to bedside within each departmental domain, including the clinical laboratory, surgical pathology, and cytopathology. Our next task was to define the essential aspects of our services within the classic QA categories of structure, process, and outcome. (4)

Structure. We viewed structure as the availability of resources used for accurate testing. It occurred to us that this indicator category was already addressed in accreditation regulations from the College of American Pathologists, whose inspections we had passed. What if we established quarterly self-assessment questionnaires for items that historically required maximum attention to achieve compliance? Wouldn't the responses help us identify structural problems? Of course.

We set about writing seven computerized self-assessment forms on pertinent structure issues: staff education, safety, equipment, supplies, facilities, policies and procedures, and documentation (Figure I). For this purpose we wrote a database program on DataEase (DataEase International, Trumbull, Conn.), to run on a powerful microcomputer, an IBM AT, with a state-of-the-art VGA (video graphics adapter) color monitor.

Information on two other structure issues--staffing and costs--comes from the laboratory's productivity system and operating statement. After collating all responses in the PC, the QA manager tabulates the data in the appropriate sections of the quarterly and annual departmental QA reports. These in turn are submitted to the hospital QA committee.

Process. This indicator encompasses the accuracy, timeliness, and technical skill entailed in the delivery of diagnostic testing. Quality control programs minimize systematic and random analytical errors. Non-analytical errors, however, still occur and may adversely affect patient care. Before we could hope to curb them, we had to identify them. Using the HIS electronic mail incident reporting system as a foundation, we developed another DataEase program to group incidents into the categories shown in Figure II. The quarterly self-assessment audits included additional forms on service efficiency and patient/user satisfaction.

Outcome. We view the outcome of lab services as the accessibility, utilization, and appropriateness of diagnostic testing at our hospital. This indicator category requires the most work of all--handled largely by the computer--for it is here that the CMS integrates with the LIS in the PC's fourth-generation language, PC Focus (Information Builders, New York, N.Y.). Called Focus for short, the software combines database management, query and reporting, statistical modeling, color graphics, and spreadsheets in a single package. In short, it brings workstation power to the laboratorian's desktop.

The CMS provides patient-specific DRG information on a floppy disk. Physician orders and CAP workload information are downloaded directly into the PC. As a result, we can show the relationship between patient care and work performed at individual laboratory workstations.

The computerized QA program lets us address both high-volume and high-risk indicators. High-volume indicators include the number of tests per patient per DRG and the number of CAP workload units per patient per DRG. High-risk indicators include therapeutic drug monitoring, blood and blood component therapy, coagulation tests for monitoring anticoagulants, use of antimicrobial agents and blood cultures, and emergency testing. Our program enables us to measure activities in each major QA category.
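
To make the arithmetic concrete, the sketch below computes both high-volume indicators in a modern general-purpose language. Our production reports were built in DataEase and Focus; the records, DRG numbers, and workload values here are invented for illustration.

```python
# A minimal sketch (not our DataEase/Focus code) of the two high-volume
# indicators: tests per patient per DRG and CAP workload units (WLUs)
# per patient per DRG. All records and values are illustrative.
from collections import defaultdict

# Each record represents one test performed for one patient under one DRG.
records = [
    {"patient": "P1", "drg": "373", "test": "CBC",  "wlu": 4.0},
    {"patient": "P1", "drg": "373", "test": "TYPE", "wlu": 6.0},
    {"patient": "P2", "drg": "373", "test": "CBC",  "wlu": 4.0},
    {"patient": "P3", "drg": "104", "test": "PT",   "wlu": 5.0},
]

tests = defaultdict(int)      # total tests per DRG
wlus = defaultdict(float)     # total WLUs per DRG
patients = defaultdict(set)   # distinct patients per DRG

for r in records:
    tests[r["drg"]] += 1
    wlus[r["drg"]] += r["wlu"]
    patients[r["drg"]].add(r["patient"])

for drg in sorted(tests):
    n = len(patients[drg])
    print(f"DRG {drg}: {tests[drg] / n:.1f} tests/patient, "
          f"{wlus[drg] / n:.1f} WLUs/patient")
```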

* Define thresholds. Under JCAHO guidelines, two types of measures help define thresholds for evaluation: sentinel event (or severity) indicators and comparative rate indicators. Sentinel event indicators identify rare but serious adverse outcomes such as those that might occur following an incompatible blood transfusion. The acceptable threshold for sentinel events is zero.

Comparative rate indicators measure events that require detailed case review only when the rate exceeds an established threshold or when a significant variance is noted over time. For example, we look for trends in over- or underutilization of laboratory services and in delays or inaccuracies in diagnosis or treatment. Armed with the resulting data, we enlist the help of the appropriate medical committee--pharmacy and therapeutics, transfusion, infection control, or case management, for example--to help define thresholds for the comparative rate indicators.
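
The two threshold types reduce to a simple decision rule, sketched below. The indicator counts and the 5% rate threshold are illustrative assumptions, not our committees' actual figures.

```python
# A hedged sketch of the two JCAHO threshold types. Values are hypothetical.
def needs_review(events, total, threshold_rate=None, sentinel=False):
    """Return True when an indicator crosses its evaluation threshold."""
    if sentinel:
        # Sentinel events carry a threshold of zero: any occurrence
        # triggers detailed case review.
        return events > 0
    # Comparative rate indicators trigger review only when the observed
    # rate exceeds the threshold agreed on with the medical committee.
    return events / total > threshold_rate

# One incompatible transfusion among 820 units: always reviewed.
print(needs_review(events=1, total=820, sentinel=True))          # True

# Stat turnaround misses reviewed only above a 5% rate (hypothetical).
print(needs_review(events=31, total=540, threshold_rate=0.05))   # True
```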

* Scrutinize data. The structure, process, and outcome categories help us address the next two JCAHO steps: collection and organization of data followed by evaluation of data.

Structure. Each quarter, the QA representatives of each lab area survey their respective sections on the PC with the nine self-assessment forms. Screen prompts make it easy for them to enter the results when the questionnaires have been completed. The percentage of actual compliance in each category is reported as the number of observations of compliance divided by the number of total observations.
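
The tabulation itself is elementary, as this sketch shows; the category names follow the structure self-assessment forms, and the observation counts are invented.

```python
# Per-category compliance: compliant observations / total observations.
# Counts are invented; categories echo the structure self-assessment forms.
audit = {
    "staff education": (18, 20),   # (compliant, total) observations
    "safety":          (40, 40),
    "equipment":       (27, 30),
}

for category, (compliant, total) in audit.items():
    print(f"{category}: {100 * compliant / total:.0f}% compliance")
```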

Process. Anyone on the health care team who identifies a suspected lab-related erroneous event files an incident report (Figure III). When the report comes up on the screen, the user presses a key to choose one of the problem categories in Figure II. Selecting from a choice of hospital locations and a list of lab areas, the user names the place where the event occurred and the lab section that was affected. The person(s) involved and the reason for the problem are identified during the subsequent investigation, when we note the action taken and any follow-up.

Each item of information (data element) entered is coded for future retrieval and analysis. The software program automatically groups incidents by category and tallies them. The hospital QA committee receives an incident tabulation report based on this information each quarter.
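
The grouping-and-tallying step can be sketched as follows; because Figure II is not reproduced here, the problem categories shown are placeholders.

```python
# A sketch of the quarterly incident tabulation. Each coded report carries
# a problem category, a hospital location, and the lab section affected;
# the program groups incidents by category and tallies them.
from collections import Counter

incidents = [
    {"category": "specimen mislabeled", "location": "3 West", "section": "hematology"},
    {"category": "order entry error",   "location": "ER",     "section": "chemistry"},
    {"category": "specimen mislabeled", "location": "ICU",    "section": "blood bank"},
]

tally = Counter(i["category"] for i in incidents)
for category, count in tally.most_common():
    print(f"{category}: {count}")
```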

We strive to solve problems promptly. Test scheduling errors, breaks in sterile technique, and similar problems are referred to the appropriate nursing or medical staff committee. Responses from the self-assessment forms that address service efficiency and those regarding patient or user satisfaction are entered for inclusion in the quarterly QA reports as well.

Outcome. Once a month, the HIS group pulls a floppy disk containing DRG-specific information through the previous month. The time lag is unavoidable; before we can work with the data, medical records must supply the DRG coding and the information systems group must process the information. Data arrive at the LIS in ASCII, a universal plain-text computer format that is easily imported into the program of one's choice.

Meanwhile, patients' laboratory histories are downloaded from the LIS into our PC. We wrote the program that accomplishes this step in GW-BASIC (Microsoft Corp., Redmond, Wash.) and use a proprietary interface supplied by our LIS vendor. Tests are identified by CPT-4 codes in the ASCII files, which are sent to Focus. With all this information available, the scope of analysis is limited only by our imagination.
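
The merge amounts to joining the two ASCII files on a common patient account number, as sketched below. The comma-delimited layout, account numbers, and CPT-4 values are assumptions; the actual record format depends on the HIS and LIS vendors.

```python
# A sketch of the DRG/LIS merge. One ASCII file carries DRG assignments,
# the other test histories identified by CPT-4 code; both are joined on
# the patient account number. File layouts are assumed, not vendor-exact.
import csv
import io

drg_file = io.StringIO("ACCT001,373\nACCT002,104\n")             # account, DRG
lis_file = io.StringIO("ACCT001,85025,CBC\nACCT002,85610,PT\n")  # account, CPT-4, test

drg_by_account = {acct: drg for acct, drg in csv.reader(drg_file)}

for acct, cpt4, test in csv.reader(lis_file):
    print(f"{acct}: DRG {drg_by_account[acct]}, CPT-4 {cpt4} ({test})")
```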

Among our most useful utilization studies are two that relate DRGs to lab workload. One of these ranks DRGs by number of cases discharged per month. The other records and ranks CAP workload units by DRG.

In the first of these, DRGs are listed in descending order of the work they generated in the lab that month, along with the percentage of total WLUs each represents. For contrast, the report also gives each DRG's rank among total cases in the hospital and the percentage of total cases it represents within the month.
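
The report's shape can be sketched as follows. The WLU and case figures are invented, and the hospital ranks are computed within the three-DRG sample rather than among all DRGs, so the output only mimics the report's layout.

```python
# A sketch of the first utilization report: DRGs ranked by lab work
# generated, with each DRG's hospital-wide case rank shown for contrast.
drgs = [
    # (description, lab WLUs for the month, hospital cases for the month)
    ("vaginal delivery w/o complications",      9200, 60),
    ("cardiac valve procedure with pump",       8100, 12),
    ("neonate with other significant problems", 5400, 25),
]

total_wlus = sum(w for _, w, _ in drgs)
by_cases = sorted(drgs, key=lambda d: d[2], reverse=True)
hospital_rank = {desc: i + 1 for i, (desc, _, _) in enumerate(by_cases)}

for lab_rank, (desc, wlus, _) in enumerate(
        sorted(drgs, key=lambda d: d[1], reverse=True), start=1):
    print(f"{lab_rank}. {desc}: {100 * wlus / total_wlus:.0f}% of WLUs "
          f"(hospital case rank {hospital_rank[desc]})")
```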

The disparities can be remarkable. In one study we did, the lab's top-ranked DRG, vaginal delivery without complications, was only 23rd in the hospital ranking. Neonates with other significant problems, our third-ranked DRG, was 45th in the hospital overall.

The program that analyzes DRGs by percentage of WLUs performed per month also provides interesting contrasts. In one study, the DRG generating the greatest percentage of WLUs, cardiac valve procedure with pump, was 31st among the total cases done in the hospital. The laboratory's second-highest work producer, respiratory system diagnosis with tracheostomy, was 37th in the hospital hierarchy.

Hospital administrators tend to concentrate on which DRGs generate the most money, while we are more concerned about those that create the greatest volume and workload in the lab. Having detailed analyses available from the lab's point of view--unlike most DRG information, which reflects the activities and needs of the entire hospital--gives us leverage in urging administration not to overlook DRGs that are important to our operations but might otherwise be neglected. Hard facts sometimes persuade them to refocus their priorities.

Other utilization studies include tests ordered per DRG, DRGs by physician, and tests ordered per DRG by date of admission. We can also measure the time it takes a physician to respond to a lab report of a critical value and order a follow-up test.
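
The critical-value measure is a simple interval computation, sketched below with illustrative timestamps and a hypothetical 60-minute review threshold.

```python
# Interval between the lab's report of a critical value and the
# physician's follow-up order. Times and threshold are illustrative.
from datetime import datetime

reported = datetime(1990, 10, 1, 14, 5)    # critical value reported
follow_up = datetime(1990, 10, 1, 15, 32)  # follow-up test ordered

minutes = (follow_up - reported).total_seconds() / 60
flag = " -- exceeds 60-min review threshold" if minutes > 60 else ""
print(f"Response time: {minutes:.0f} min{flag}")
```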

* Improve and communicate. The laboratory's quarterly QA report fulfills two JCAHO steps: documenting improvement and communicating relevant information to the hospital QA committee. Although we have found it difficult to computerize this report beyond simple word processing, we are working on a solution.

Previously, we manually transcribed the summary findings from the structure questionnaires into the first section of the report. In the process section we noted high-volume test turnaround times calculated by the LIS, along with incident report abstracts and tallies. Outcome indicators focused on the transfusion service, including autologous and directed donations.

Recently, however, we developed a report to be produced directly from DataEase and Focus. We expect these reports, which will let us take advantage of our software's extensive computational and graphics capabilities, to eliminate transcription steps--and thus transcription errors.

Computers have served our QA efforts well over the past nine years. We have designed and implemented a laboratory quality assurance program that is fully integrated with the hospitalwide QA program in full compliance with JCAHO requirements. The added dimension of a fourth-generation computer language enables the clinical laboratory to monitor the quality and appropriateness of testing impartially and objectively. With this information we can make valuable contributions to the hospital's QA educational program.

Lohr and Schroeder recently reminded us that "successful quality assurance will always have to concern itself with both the processes of care and patient outcomes," yet "newer models of continuous quality improvement emphasize . . . a ceaseless cycle of examination and change." (5) Nothing is better suited to help the laboratorian meet this quality assurance challenge than computerization.

(1) "Monitoring and Evaluation: Pathology and Medical Laboratory Services." Oakbrook Terrace, Ill., Joint Commission on Accreditation of Healthcare Organizations, 1987.

(2) Elevitch, F.R., and Lu, S. A computer-assisted quality assurance program for the hospital clinical laboratory: Pitfalls and their avoidance. Presented at the national meeting of the American Society for Medical Technology, Washington, D.C., October 1989.

(3) Berte, L.M. Growing into laboratory quality assurance. MLO 22(2): 24-29, February 1990.

(4) Donabedian, A. Evaluating the quality of medical care. Milbank Mem. Fund Q. 44(3, part 2): 166-206, 1966.

(5) Lohr, K.N., and Schroeder, S.A. A strategy for quality assurance in Medicare. N. Engl. J. Med. 322: 707-712, 1990.

Dr. Elevitch is director of clinical laboratories at El Camino Hospital in Mountain View, Calif. Lu is quality assurance manager and education coordinator there.
