An in-house proficiency survey for WBC differentials

Two years ago, our hematology laboratory added a new dimension to its quality control protocol for white blood cell differentials: a monthly in-house proficiency survey. Besides providing an indication of the lab's accuracy and precision in differential counts, the survey enabled technologists to compare their skills against those of co-workers. This either bolstered an individual's confidence or signaled deficiencies.

As the many laboratorians still performing manual differentials know all too well, quality control for this test remains elusive. According to one expert observer, quality control for differentials might consist of reviewing reports for obvious errors or noting differences from previous reports.

To maintain good quality control of differential results, she also recommends close attention to the following points:

1. Preservation of cellular morphology--for example, using an appropriate anticoagulant and observing specimen stability limits.

2. The quality of staining.

3. The distribution of white cells on the smear and how accurately they represent the patient's circulating white cells.

4. The area of the smear selected for differentiating cells.

5. The number of cells differentiated.

6. The examiner's ability to correctly identify cells.

7. The examiner's attitude--whether bored, for example, or motivated to do a good job.

In our 350-bed teaching hospital, we have also emphasized thorough written procedures, careful training of new staff members, and adequate continuing education as additional steps toward standardizing differentials. All of the above suggestions are highly important, yet they tell us very little about end results; i.e., the overall accuracy and precision of our differential tests.

That's the general reason we developed our in-house differential proficiency survey. There was a specific problem we wanted to correct, too. Clinicians had been complaining about the laboratory's policy of holding difficult differentials performed at night or on weekends (those with unusual or abnormal results) for my review or day staff review. The answer was to begin issuing preliminary reports. For that purpose, we wanted to be very sure that evening, night, and weekend technologists could interpret the slides with a sufficient degree of accuracy.

The original proficiency survey consisted of a single slide presented to all technologists on all shifts. This approach had a flaw that should have been obvious at the outset. It is virtually impossible for a survey slide to remain "unknown" as it passes among 35 staff members. We discovered that when several technologists submitted results with essentially the same wording.

So we added more slides. This increased the program's complexity but benefited participants and produced a bonus of more teaching material for the technologists who instruct new staff members on differentials.

Technologists still know they are working with quality control slides. It wasn't feasible to implement a totally blind survey by slipping bogus patient specimens into the routine workload. Because many technologists rotate throughout the lab, this type of proficiency survey wouldn't reach the entire staff.

Nevertheless, we have tried to keep the procedure for handling proficiency slides as close as possible to that for performing differentials on actual patient specimens. Here's what we do.

Slides from four different patients are collected for each survey, in the following manner. Technologists rotate through the differential work station on a daily basis; once a week, a technologist at the station chooses a patient specimen at random (a normal one week, an abnormal the next) and prepares and stains 10 slides from that specimen. The stained slides are then submitted to the hematology chief technologist. At the end of four weeks, we have enough slides for a monthly survey. Since the slides are held for several months, technologists tested for proficiency don't recognize the ones they prepared.

Individual slides are not identified in any way, but the slide box is labeled with the patient's accession number. This allows us to retrieve the original differential test results and pull a copy for the proficiency file.

Once a month, each technologist performing differentials receives a randomly assigned proficiency slide. Seven to 10 technologists perform differentials on separate but similar slides from the same patient and should have comparable findings on their individual survey reports.

A proficiency differential is requisitioned into the computer for each technologist. This allows technologists to enter survey results as they do patient results.

The survey coordinator, a bench technologist who also participates in the monthly proficiency testing, handles the requisitioning process. (It takes her about one hour each month.) The computer then generates labels containing an accession number and the technologist's name, and proficiency slides are assigned randomly. One label is wrapped around the slide and another pasted onto a participation record on which we note whether the slide came from Patient 1, 2, 3, or 4.
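For a lab inclined to script this step, the random assignment itself is simple. Here is a minimal sketch in Python; the staff list and function names are hypothetical, not a description of our laboratory computer:

```python
import random

def assign_slides(technologists, patients=(1, 2, 3, 4)):
    """Randomly assign each technologist a slide from one of the
    four survey patients, spreading the load as evenly as possible."""
    # Repeat the patient numbers until the pool covers everyone,
    # then shuffle so the assignments are unpredictable.
    pool = list(patients) * (-(-len(technologists) // len(patients)))
    random.shuffle(pool)
    return {tech: pool[i] for i, tech in enumerate(technologists)}

# Example: 35 staff members split across Patients 1-4 yields the
# seven to 10 readers per patient described above.
staff = [f"Tech {n}" for n in range(1, 36)]
assignments = assign_slides(staff)
```

Shuffling a repeated pool of patient numbers keeps individual assignments unpredictable while splitting the staff as evenly as possible across the four patients.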

The proficiency slides are distributed to the day, evening, and night technologists by their respective chief technologists. The technologists are asked to complete the survey within one week. With the exception of prolonged leaves of absence, survey results from technologists out of the lab during the entire review week are due one week after they return to work.

Technologists input proficiency results just as they would actual patient data and then generate a completed worksheet on the computer. This worksheet containing proficiency results is routed to the hematology chief technologist by a technologist on the quality control run who reviews all hematology results.

A computerized "overdue test log" shows all laboratory accession numbers still pending, and also highlights the proficiency tests. The log is checked periodically, and delinquent technologists receive a reminder via the computer's mailbox function.
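The log itself is a feature of our laboratory computer, but the check it performs amounts to a simple date comparison. A minimal sketch, with a purely illustrative record layout:

```python
from datetime import date

def overdue_proficiency(pending, today, limit_days=7):
    """Return pending proficiency tests past the deadline.
    The record fields here are illustrative, not our actual log format."""
    return [t for t in pending
            if t["is_proficiency"]
            and (today - t["assigned"]).days > limit_days]

# One pending entry, checked 11 days after the slide went out.
pending = [{"accession": "H-1234", "tech": "Tech 5",
            "is_proficiency": True, "assigned": date(1984, 11, 1)}]
late = overdue_proficiency(pending, today=date(1984, 11, 12))
# Each technologist in `late` would then get a computer-mailbox reminder.
```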

As the survey reports come in, the hematology chief technologist updates each staff member's participation record. She also codes each report with a number (1 to 4) to indicate which of the groups of slides was processed by a particular technologist.

The survey coordinator prepares a differential proficiency group report for each of the four collections of quality control slides (Figure I). She records the results from the seven to 10 technologists and the average results for the group. Also noted are the differential results that were actually reported for the patient.

Obvious outlying results are excluded if the hematology chief technologist confirms them to be erroneous. But if we find that the technologists are getting results all over the board on a particular series of patient slides, we might question the quality of slide preparation.

It takes the coordinator about one to two hours to prepare the four monthly group reports. We hope to computerize the tabulations in the future.
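The arithmetic involved is modest. A minimal sketch of how the tabulation might be computerized, using made-up results for three technologists (a real group report covers seven to 10):

```python
from statistics import mean, stdev

# Made-up percentages from three technologists reading slides
# prepared from the same patient specimen.
results = {
    "Tech 1": {"segs": 60, "bands": 5, "lymphs": 28, "monos": 5, "eos": 2},
    "Tech 2": {"segs": 63, "bands": 4, "lymphs": 26, "monos": 5, "eos": 2},
    "Tech 3": {"segs": 58, "bands": 7, "lymphs": 27, "monos": 6, "eos": 2},
}

for cell in ["segs", "bands", "lymphs", "monos", "eos"]:
    values = [r[cell] for r in results.values()]
    avg, sd = mean(values), stdev(values)
    # Flag readings far from the group mean as candidate outliers
    # for the chief technologist to confirm or dismiss.
    outliers = [t for t, r in results.items() if abs(r[cell] - avg) > 2 * sd]
    print(f"{cell:>7}: mean {avg:5.1f}  SD {sd:4.1f}  outliers {outliers}")
```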

Each participating technologist receives a copy of the appropriate group report, with his or her results marked by an asterisk. The technologists are not identified by name on this report; instead, each is assigned a code number at random--Tech 1, Tech 2, and so on. To verify what was entered into the group report, the technologists also receive the computer report of their individual results.

The technologists are expected to assess their proficiency by comparing their findings with the range of results reported by their peers as a precision check, and with the group average as an accuracy check. In our program, the group report serves as the standard (reference values) for judging accuracy.

Comparing the group report to the patient results actually issued months earlier also offers an overall assessment of laboratory accuracy, since the technologist who originally reported the patient results varies from survey to survey. In this respect, any slide marked for later use in the proficiency survey serves as a blind quality control sample at the time the differential is first performed.

After the proficiency program had been under way for several months, a questionnaire sought staff suggestions for improvements. Among the recommendations we decided to adopt:

* Distribute CBC results with the slides. Including these readily available results makes the exercise more like an actual testing situation. Technologists normally have access to such supportive data, and the information sometimes helps them make judgments.

* Provide a brief patient history. Because the patient history can sometimes provide too big a diagnostic clue, it's better to distribute this information after the survey, with the group report. Technologists don't normally have access to a patient history during testing, but it does contribute an interesting follow-up, especially with abnormal or unusual differentials.

* Extend the deadline for survey results. Though one week seemed like a reasonable time limit--the CAP allows five working days on its surveys--the evening and night technologists objected because their rotational schedules might not cover hematology during the survey week. We extended the deadline to two weeks from slide distribution. That tends to make one survey overlap with the next, but technologists are free to turn in their results before the deadline.

* Have supervisors assess performance. Performance on differentials is difficult to assess, and less experienced technologists need guidance and reinforcement. A brief note that their findings fall within the confidence intervals works wonders. We feel that it is most important that technologists be able to recognize the presence of abnormal cells. With time and practice, they will eventually learn to identify the various abnormalities. In the meantime, someone else can confirm their findings.

The fact that most questionnaire respondents preferred not to evaluate their own performance on the basis of the group reports surprised us. Perhaps this hesitancy is due to a lack of definite guidelines for personal assessment. If that's the case, it may be helpful to use Rumke's 95 per cent confidence limits for differentials to help rate individual performance.

However, it is important to clearly understand the limitations of Rumke's figures. These numbers are based on statistical calculations that take into account only the random distribution of cells. They do not account for such factors as uneven cell distribution on the smear, incorrect identification of cells, and lack of agreement on the criteria for classifying cells.
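For readers without Rumke's published tables, comparable limits can be computed directly from the binomial distribution. A minimal sketch using the exact (Clopper-Pearson) interval, assuming the Python scipy library is available; the function name is ours:

```python
from scipy.stats import beta

def rumke_limits(count, n=100, conf=0.95):
    """Approximate Rumke-style confidence limits for one cell type,
    using the exact (Clopper-Pearson) binomial interval. `count` is
    the number of that cell type seen among `n` cells differentiated."""
    alpha = 1 - conf
    lo = beta.ppf(alpha / 2, count, n - count + 1) if count > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, count + 1, n - count) if count < n else 1.0
    return 100 * lo, 100 * hi

# Example: 5 bands in a 100-cell differential gives limits of roughly
# 1.6 to 11.3 per cent -- a reminder of how wide the sampling error is.
low, high = rumke_limits(5, n=100)
```

Counting more cells narrows the limits, which is one argument for 200-cell differentials on difficult smears.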

Although our proficiency survey is still a far cry from the standards and commercial controls available for many laboratory tests, we think we've taken a big step toward addressing some of hematology's biggest concerns:

* Are bands and polymorphonuclears differentiated correctly and consistently?

* Are blasts, including those present in low numbers, detected?

* Are atypical lymphs detected, but not overcalled?

* Are immature granulocytes identified correctly?

* Are nucleated red cells detected and tallied?

* What kind of reproducibility can we expect on differentials performed in our lab?

In our laboratory, which serves a significant percentage of hematologically abnormal patients and where technologists issue all differential reports, the answers to these questions are crucial.

A significant bonus with our proficiency program is that it provides an ever-expanding file of slides accompanied by well-qualified differential reports. Each month, we add four new survey slides together with consensus reports for future use in teaching differentials to new technologists and residents. The slides also serve as study material for correcting deficiencies.

Finding appropriate slides for training purposes used to be a major chore. That alone makes the two or three hours it takes to manage the monthly proficiency program time well spent.