Development of a peripheral smear differential review

Seven years ago I left the security of a hospital lab for the unknown: a group practice laboratory specializing in hematology testing for oncology patients. I was hired to supervise the main laboratory. My entire staff consisted of another MT and a phlebotomist.

The job quickly evolved into much more. Since the physicians had virtually no experience with labs or their day-to-day problems, I was soon tapped as a resource person for the group's two satellite facilities. The doctors wanted me to assess the technologists' skills and provide CE without spending any more money than necessary and, whenever possible, without leaving Dallas.

As the practice grew, our physicians became concerned about the consistency of results reported by the satellites. Diffs were mentioned a lot. QC is a challenge for any lab, but imagine trying to monitor technologists when the bench is 500 miles away.

It was a formidable task, but we have developed a quarterly peripheral smear differential review that allows us to keep tabs on technologists who seldom meet. Top-quality lab work continues for our far-flung patient population. Best of all, the format is flexible enough to accommodate all the new labs and clinicians regularly joining the group.

I'm proud of my staff, who are excellent laboratorians. We have very little turnover. I joined the lab as a supervisor of two. Just a few years later, I'm in charge of the entire laboratory operation, which seems to grow daily.

* Lab logistics. The Texas Oncology Reference Laboratory is part of a huge group practice devoted to the diagnosis and treatment of cancer. We currently have more than 40 physicians in 28 offices throughout the state. The main laboratory is located in Dallas; 18 of the offices have satellite labs, and the rest are served in an outreach system. The staff of phlebotomists and medical technologists includes 25 FTEs. The central office handles all hiring, firing, billing, and supply ordering. Physicians join the corporation as salaried staff members, but later may have the option of becoming partners.

Our laboratories are as diverse as their locations throughout this extremely large state. The smallest satellite serves a single physician; several labs operate only part time. Each laboratory has at least one medical technologist. In some cases the MT works two days a week at one part-time lab and the rest of the week at another. Typical equipment includes a cell counter and a fibrometer, which together provide on-site coagulation testing and screening, blood counts, differentials, and such basic hematology procedures as reticulocyte counts and sed rates.

The outreach laboratories offer weekly or sometimes monthly service, depending on the patient load in that community. Since oncologists are rarely available in such locales, we sublet space from a willing physician and send a team consisting of a physician, a nurse, and a member of the business office to spend the day collecting specimens from patients referred by internists in that area.

Our staff at the main lab in Dallas includes four medical technologists, a supervisor, and a laboratory manager. This lab serves eight physicians and an average of 50 patients per day. We also act as a reference lab for the other facilities and provide coverage for their medical technologists, which means that one of ours is usually on the road.

* Differential review. The quality control program dates back to 1984, when the other Dallas-based technologist and I began establishing formal controls for the various procedures. For example, I evaluated the instrumentation, looked at the comparison programs offered by the manufacturers of hematology controls, and enrolled all three labs in CAP's hematology proficiency program. This approach worked well for most tests, but the white blood cell differential--a critical test for an oncology lab--defied generalization.

That's not surprising. Two technologists can look at separate slides from the same patient and not always see the same thing. Yet our clinicians needed consistent results. Our main laboratory had an abundance of interesting abnormals; thus the logical solution was to share these slides with the satellite labs and compare our results.

Initially we used a simple and informal procedure: I sent a printout of the hemogram with each slide. The two technologists at the satellites read the slides and submitted the differential they would have reported. While participation was voluntary, these technologists looked forward to seeing such unusual smears. They rarely had this opportunity because the more demanding cases tended to be referred to the Dallas office.

The great diversity in reporting red cell morphology became apparent as I reviewed the results. Since the physicians covered for each other, traveling from office to office throughout the state, we had to construct a standardized reporting format. I immediately turned my attention to establishing the necessary reporting guidelines, essentially telling the technologists, "So many per field is a +1; this many per field is a +3."
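
For illustration, here is a minimal sketch of such per-field grading logic in Python. The cutoff values below are hypothetical examples; the article does not give our actual numbers.

    # Map an average count of abnormal red cells per oil-immersion field
    # to a semiquantitative grade. Cutoffs here are hypothetical, not
    # the laboratory's actual guidelines.
    def grade_morphology(avg_per_field):
        if avg_per_field < 1:
            return "occasional"
        if avg_per_field <= 5:
            return "1+"
        if avg_per_field <= 10:
            return "2+"
        if avg_per_field <= 20:
            return "3+"
        return "4+"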

The laboratorians appreciated the guidance. When they began asking for feedback on their performance, we replaced our "no news is good news" approach with handwritten notes. I assigned a code number to each technologist to ensure anonymity and tallied the results on a single chart for easy comparison. (This early effort evolved into the report form shown in Figure I.) I included a brief notation when the technologist might benefit from reviewing a particular slide.

The staff liked the new format but wanted even more information. I began to include notes on the patient's clinical status, when known, and tips on ways to report certain cells. The procedure gave me an opportunity to reinforce our reporting policies.

The review turned out to be an excellent tool for evaluating technologists' skills in reporting nucleated red blood cells, white blood cells, and red blood cell morphology. The column format listed each technologist's findings side by side. I could compare their results at a glance and quickly spot any weaknesses.

By 1987, it was standard operating procedure to supply the patient's diagnosis after the fact. The technologists enjoyed making their own presumptive diagnoses--and so did the oncologists! The review had now become an enjoyable continuing education exercise that seemed to foster a spirit of teamwork among technologists separated by hundreds of miles.

The program steadily evolved. In reviewing our technologists' responses to a 1988 survey, for example, I realized that the hemogram results did not always correlate with the differential. A high MCV should translate into large red blood cells on the slide, but according to the technologists' findings this was not always the case.

This discrepancy prompted a discussion of how to use the review as an internal quality control check. If the printout and slides disagreed, perhaps the slide had been prepared improperly or was mislabeled. At this point I became more closely involved in each case and occasionally made suggestions. If the MCHC was low, I might comment that hypochromia was likely. With an extremely high MCV, on the other hand, macrocytosis should be reportable. The time had come as well to design a new review form to accommodate the group's expansion to 14 lab sites.
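
A minimal sketch of the kind of consistency check described here, assuming conventional adult reference limits (MCV roughly 80 to 100 fL, MCHC roughly 32 to 36 g/dL), not the laboratory's official ranges:

    # Flag the smear findings the hemogram indices lead us to expect.
    # Reference limits are conventional textbook values, used here
    # only for illustration.
    def expected_findings(mcv_fl, mchc_g_dl):
        findings = []
        if mchc_g_dl < 32.0:
            findings.append("hypochromia likely")
        if mcv_fl > 100.0:
            findings.append("macrocytosis should be reportable")
        elif mcv_fl < 80.0:
            findings.append("microcytosis expected")
        return findings

    # A slide lacking the expected findings suggests improper
    # preparation or a mislabeled specimen.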

The corporation continues to grow--we have added three more labs in the past two years--and so does the review program. Every three months, each technologist receives a differential review package, which contains five Wright-stained, cover-slipped smears; hemogram results for each slide; and a response sheet. All slides are chosen from our patient population. Sometimes we throw in a normal smear for good measure. The program works as follows:

* Differentiating diffs. Our differential review program takes surprisingly little time to coordinate and costs virtually nothing beyond shipping expenses. Technologists at the central laboratory are always on the lookout for interesting cases. When they find an appealing abnormal--particularly an unusual white cell, red cell, or platelet morphology--they make 30 extra slides. That provides one slide for each technologist, one for my own review, and a few spares.

The completed slides are then logged and set aside until we have compiled slides from five different patients. The code number identifies both the survey and the patient. For example, the 1989 winter proficiency was our 19th survey. The slide reported in Figure I was coded "19-1," telling us that the results are from the first patient chosen for the 19th survey. The patient's hemogram results are noted in the logbook for future consultation; so is the differential reported by the Dallas laboratory.
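
The coding scheme is simple enough to capture in a few lines; this sketch merely assumes the survey-patient format shown above:

    # Split a slide code such as "19-1" into survey and patient numbers.
    def parse_slide_code(code):
        survey, patient = code.split("-")
        return int(survey), int(patient)

    parse_slide_code("19-1")   # returns (19, 1): 19th survey, first patient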

A technologist is tapped to prepare the proficiency package. The tech takes a slide from each of the five groups and places them in a microscope slide mailer. The package includes a photocopy of the hemogram log for all five slides and a response sheet on which to record the results. The technologist initials the appropriate page in the logbook, notes the mailing date, and specifies the response date. After placing all the proficiency paraphernalia in a standard Styrofoam shipping container, the technologist turns the task over to our mailroom. Just before mailing, a supervisor checks one slide from each patient to make sure the quality of the stain is still acceptable and sets aside a full set for the laboratory manager.

The technologists typically have one month to return their response sheets. This may seem like an overly generous time frame for work that takes barely 30 minutes. Yet if one of our technologists is filling in at another lab, several days might go by before he or she returns to the home lab, where the proficiency package waits. Each technologist performs a white cell differential and a red cell morphology check on each slide, records the results, initials the response sheet, and notes the mailing date.

When the results arrive, I do a diff on each of the five slides in my set. I then transfer my responses and the technologists' findings to a master sheet (Figure I), which eventually becomes part of the formal report. With all the results on this handy sheet, I can scan the findings quickly and pinpoint any weak areas. The entire process takes only a couple of hours, but my schedule is hectic and a month generally goes by before I finish. (I prefer not to delegate this task in order to maintain the promised confidentiality.)

The computer-generated report lists all individual responses, coded so that each technologist can identify only his or her own results. I include the patients' diagnoses, any pertinent clinical information, and the expected peripheral smear findings for relevant disease states.

Let's say, for example, that one of the five patients has acute lymphocytic leukemia. I'll expect participants to have reported a high number of blasts. The red cell count should be low, since the many blast cells in the bone marrow tend to crowd out red cell precursors. Patients with acute leukemia are likely to be anemic; their blood contains less hemoglobin, and their platelets may be decreased as well, again because of crowding in the bone marrow.

In reviewing a slide from a patient with hereditary spherocytosis, participants were advised to go over our policy regarding elevated MCHC results. This discussion offered an opportunity to reinforce that policy while correlating the hemogram results with the differential data. Another clinical summary referred participants to our procedure for correcting the WBC count in the presence of nucleated RBCs.
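
The standard correction divides the observed count by the proportion of counted events that were true white cells. A minimal sketch with a worked example:

    # Corrected WBC = observed WBC x 100 / (100 + nRBCs per 100 WBCs).
    def corrected_wbc(observed_wbc, nrbc_per_100_wbc):
        return observed_wbc * 100.0 / (100.0 + nrbc_per_100_wbc)

    # Example: an observed count of 12.0 x 10^9/L with 20 nRBCs per
    # 100 WBCs corrects to 10.0 x 10^9/L.
    corrected_wbc(12.0, 20.0)   # 10.0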

Ideally, each technologist's numbers should support such clinical facts. When they don't, I suggest the technologist review the slide. To help in this effort, the formal report includes clinical summaries detailing these signs. If the numbers are too far off or if a technologist consistently has trouble with blasts or anything else, I schedule a visit to Dallas so that we can work on the problem together. I visit each satellite lab at least once a quarter, and we invariably spend some time looking at diffs.

* Versatility. The diff review helps us assess employee skills, guides our retraining efforts, reinforces lab protocols, and provides information useful in preparing employees' annual evaluations. The diff review doubles as a continuing education program--one that we're sharing with more and more labs across the state. For example, the staff at an independent reference lab in a town where one of our satellites is located regularly reviews our slides. When our technologist finishes her proficiency exercise, she carries the slides down the street to that lab and passes along a copy of the report as well.

In the future, I plan to store information regarding diseases and their anticipated peripheral smear findings on floppy disks. This procedure will allow reports to be retrieved easily. Eventually I hope to develop a spreadsheet version of the form so that the push of a button will produce copies.
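
A sketch of what such a stored lookup might hold, with example entries drawn from the cases discussed above; the structure and function name are illustrative assumptions, not the actual file layout:

    # Diseases mapped to the peripheral smear findings we expect to see.
    EXPECTED_FINDINGS = {
        "acute lymphocytic leukemia": [
            "high number of blasts",
            "low red cell count from marrow crowding",
            "possibly decreased platelets",
        ],
        "hereditary spherocytosis": [
            "spherocytes",
            "elevated MCHC",
        ],
    }

    def summary_for(diagnosis):
        findings = EXPECTED_FINDINGS.get(diagnosis, ["no entry on file"])
        return diagnosis + ": " + "; ".join(findings)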

Our differential review has evolved from a fairly casual process to a regular program repeated consistently at defined intervals. It has enabled us to meet quality assurance goals, costs next to nothing, and is actually fun. How often can a laboratorian say that?

Jan Fouche was director of laboratory services, Texas Oncology Reference Laboratory, Dallas. She is now laboratory director at Children's Presbyterian Healthcare Center in Plano, Tex.