
Applying quality improvement tools in the transfusion service.


1. Define the application of five quality improvement tools.

2. Define the purpose of a Failure Modes and Effects Analysis.

3. Describe two effective error prevention strategies.

4. Explain three ways to describe a process.

5. Describe effective data display techniques.

The following article is adapted from a lecture presented at the ASCLS Annual Meeting, Chicago IL, July 2006.

Quality tools can be applied to a variety of situations, from the manufacturing floor to the clinical laboratory. Some healthcare facilities embraced the quality improvement movement over a decade ago; others are just beginning to adopt their use. Quality tools facilitate problem solving and process improvement within a defined framework such as the simple Plan, Do, Check, Act (PDCA) cycle or the more complex Define, Measure, Analyze, Improve and Control (DMAIC) framework used in the Six Sigma process. Quality tools are used to gather and display information, make decisions, determine the root cause of a problem, develop action plans, and measure progress. This article uses two problem areas in the transfusion service to illustrate the use of quality tools in the laboratory; however, these tools can be used in any section of the laboratory.


It is usually not possible for all laboratory staff members to attend the same training session. While multiple training sessions could be used to cover all the tools that are expected to be used, a "just in time" training method works well when staffing does not allow for extended training sessions. This method is to teach one tool at a time, just as it is about to be used. It takes only a few minutes to hold an in-lab session to review how a specific tool is used. Inexpensive pocket guides, available through bookstores or on the Internet, can be used as easy references. (1,2) "The Memory Jogger" follows the PDCA process, and the Six Sigma Pocket Guide by Rath and Strong follows the DMAIC process. (3) The pocket guides summarize how the tools are used and provide graphics and examples of their use. Although staff may recall the name of a tool, guides help recall the details of tools such as brainstorming, Pareto charts, or process diagrams. Frequently used quality tools and their applications are listed in Table 1.


We have been involved in quality improvement projects for a number of years. The examples in this paper are based on projects done at the University of Michigan Hospitals Blood Bank. Idea selection is critical to the success of a project. Staff commitment is first gained by involving all staff in project selection.


Idea generation

Brainstorming is the method used to generate ideas. A slight modification of the brainstorming process can be used to get a list of potential projects when not all the laboratory staff can attend a meeting at the same time. An easel with paper and markers may be used to record the ideas. Staff members should have several days to put any idea that comes to mind on the paper. The goal is to generate ideas; their value or feasibility will be assessed later. A sample list of ideas might include:

* unacceptable specimens

* laboratory workflow

* maximum surgical blood order schedule (MSBOS) for blood orders

* telephone calls from the operating room

* electronic crossmatch

* laboratory ambassador program

The next step is idea clarification. Staff members get a chance to ask questions. Sending an email to all staff documents the questions and answers as well as communicating them to all shifts. To gain support for their favored projects, staff members may collect and present preliminary data with the goal of showing the significance and potential for improvement. Table 2 is a sample check sheet that could be used to collect information on rejected specimens. To display information, staff may use a number of different charts or graphs. A Pareto chart is an excellent way to determine the significance of the data and communicate the information. To determine significance, the specimen rejection data are displayed in Pareto charts both by reason for rejection and by day of the week. Figure 1 is a Pareto chart displaying the data by reason for rejection. Figure 2 displays the data by day of the week. In this case, the bars are of uneven height, with hemolysis identified as the most significant reason for rejection. The Pareto chart by day of the week results in bars of nearly equal height, indicating that the day of the week is not significant to this problem. Going back to the check sheet, it is evident that most of the rejected specimens are collected by nurses in either the operating room or the emergency room. Another Pareto chart could be constructed to show the location of collection, which would demonstrate the significance of this information.
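
As a rough sketch of how check-sheet tallies become a Pareto ordering, the snippet below sorts categories by frequency and computes the cumulative percentage each bar contributes. The rejection categories and counts are invented for illustration, not the study's data.

```python
from collections import Counter

# Hypothetical counts tallied from a rejected-specimen check sheet
rejections = Counter({
    "hemolysis": 42,
    "unlabeled tube": 11,
    "wrong tube type": 8,
    "insufficient volume": 6,
    "clotted specimen": 3,
})

def pareto_rows(counts):
    """Return (category, count, cumulative percent) rows, largest count first."""
    total = sum(counts.values())
    rows, running = [], 0
    for category, n in counts.most_common():
        running += n
        rows.append((category, n, round(100 * running / total, 1)))
    return rows

for category, n, cum in pareto_rows(rejections):
    print(f"{category:22s}{n:4d}{cum:8.1f}%")
```

With these made-up numbers, hemolysis alone accounts for 60% of rejections, which is exactly the kind of uneven-bar pattern a Pareto chart is meant to expose.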


Voting begins after data has been displayed and information has been exchanged. The voting method is designed to gain consensus and avoid a win/lose scenario. Each person gets three votes. Someone can put all of their votes on one project or split them among two or three. This can be accomplished in rounds if there are a large number of ideas. The rounds stop when there are three to five ideas left.
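
The three-vote tally can be sketched in a few lines; the ballots and idea names below are hypothetical.

```python
from collections import Counter

# Hypothetical ballots: each staff member distributes exactly three votes,
# concentrated on one idea or split among two or three
ballots = [
    ["laboratory workflow", "laboratory workflow", "ambassador program"],
    ["unacceptable specimens", "laboratory workflow", "electronic crossmatch"],
    ["ambassador program", "ambassador program", "laboratory workflow"],
    ["unacceptable specimens", "unacceptable specimens", "unacceptable specimens"],
]
assert all(len(b) == 3 for b in ballots)  # three votes per person

tally = Counter(vote for ballot in ballots for vote in ballot)
# Keep the top ideas for the next round; rounds stop when three to five remain
shortlist = [idea for idea, _ in tally.most_common(3)]
```

Because every idea keeps whatever votes it earned, the method surfaces a shortlist by consensus rather than declaring a single winner and many losers.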


To assist in making this decision, a grid may be used as shown in Table 3. Each idea is rated on its feasibility, cost, and likelihood of success, with the ratings obtained by consensus. To better assure a successful first project, projects that are less likely to succeed, such as those involving many other departments or over which the group has little control, should be avoided. A score is obtained by multiplying the ratings. In the example, the laboratory ambassador program and laboratory redesign scored the same. Since the laboratory ambassador program involved individuals outside the department, the project should be discussed with the department administration. This project could be expanded to a laboratory-wide program.
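
The scoring rule is simply the product of the three consensus ratings. A minimal sketch, using ratings shown in Table 3 for three of the ideas:

```python
# Ratings (feasibility, cost, likelihood of improving), each on a 1-5 scale
# with 1 low and 5 high, as in the decision grid of Table 3
ideas = {
    "laboratory workflow": (5, 3, 5),
    "laboratory ambassador program": (5, 5, 3),
    "electronic crossmatch": (5, 2, 4),
}

def grid_score(feasibility, cost, likelihood):
    # The composite score is the product of the three ratings
    return feasibility * cost * likelihood

ranked = sorted(ideas, key=lambda idea: grid_score(*ideas[idea]), reverse=True)
for idea in ranked:
    print(idea, grid_score(*ideas[idea]))
```

Note how the workflow redesign and the ambassador program both score 75 here, reproducing the tie that had to be broken by discussion and a final vote.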

The final decision was made by vote after a discussion of the issues. The transfusion service laboratory workflow redesign was chosen as the first project, with the assumption that only minor electrical changes would be needed. Implementation of a new specimen labeling system, which reads the patient's wristband bar code at the bedside to generate labels carrying laboratory information system accession numbers, was taken up as a laboratory-wide project.


Since not all staff members can participate in the project because of time and staffing resources, a team is created to continue the project. Team members should be selected for their skills and influence. Communication to non-team members about progress and tools is critical. Email is effective for distributing meeting minutes and actions, and an easel is useful for displaying charts and graphs. A picture is worth a thousand words, so diagrams and charts should be used whenever possible. A task grid helps monitor progress. The tasks are listed in the left column, with additional columns for the personnel assigned, date due, and completion date. While there are software packages for project management that can be used for complex projects, this simple tool is also effective in communicating and tracking progress (Figure 3).



In order to define the present workflow, a flow chart that describes the current process is created. With a laboratory diagram in hand, a map of the specimen's progress through the laboratory can be traced. A value stream map is a more complex description of the process that uses the path of the specimen and timing of actual performance steps to provide information such as the wait time before the step is performed, the amount of time each step takes, where the materials to perform the step are stored, and where the communications for each step come from. (4) Additional process information that may assist in workflow redesign includes:

* number of steps

* travel distance

* number of people performing the task

* sensitivity and specificity

* cost per test

* QC required

* reagent waste

All of these elements could be useful in comparing the new process to the current process.
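
Timing data from a value stream map can be reduced to a few comparison figures, such as total lead time versus hands-on (value-added) time. The step names and minutes below are invented for illustration:

```python
# Hypothetical value-stream timings for one specimen, in minutes:
# (step, wait before the step begins, hands-on work time)
steps = [
    ("receive and accession",   5,  2),
    ("centrifuge",             10,  8),
    ("ABO/Rh type and screen", 15, 30),
    ("verify and report",       7,  3),
]

# Lead time counts both waiting and working; only the work adds value
total_lead_time = sum(wait + work for _, wait, work in steps)
value_added_time = sum(work for _, _, work in steps)
percent_value_added = round(100 * value_added_time / total_lead_time, 1)

print(f"lead time {total_lead_time} min, "
      f"value-added {value_added_time} min ({percent_value_added}%)")
```

Computing the same figures for the current and redesigned processes gives a direct before-and-after comparison of how much waiting the redesign removed.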

One of the principles of process improvement is waste reduction. The value stream map defines the current process and allows identification of waste. There are many potential areas of waste in a process. Delays are waste; these may include the interval between when a specimen arrives and when it is centrifuged, the time a specimen waits to be tested or placed on an instrument, or the time from test completion to result verification. Another area of waste is unnecessary movement. A spaghetti diagram helps identify unneeded movement in a process. Using a current floor plan of the lab, the specimen's movements during one cycle of the process are tracked in one color and personnel movements in another. The resulting diagram often resembles a mass of cooked spaghetti plopped on the floor plan. See Figure 4.


The objectives of the change need to be defined before the new workflow can be designed. The goals will drive decision-making during the redesign. The objectives may be to become more efficient, reduce the opportunity for errors, decrease response time, or all of these. The most streamlined process may fail to assure patient safety. Thus, a way to measure important elements of the process is essential. Error rates, turn-around time, and response time can be sampled to determine the current process capabilities and the capabilities of the new process.


Since one of the goals is to reduce the amount of walking to get specimens, equipment, and supplies, the new floor plan and specimen flow should minimize movement. A U-shaped work cell was a goal, as this has been shown to be an effective design. Using an easel, a lab diagram that contained only the outside walls and immoveable objects was displayed. All laboratory staff were involved in laying out the new workflow. Paper cutouts of the lab equipment and moveable furniture were arranged on the diagram until there was consensus on their location. A spaghetti diagram of the new specimen and personnel flow was prepared to compare the current process and the new process. Another goal was to reduce the number of processing steps. By identifying and eliminating redundant testing and ineffective inspection steps, the complexity of the process was reduced.


While the efficiency of a process is important, designing a process that reduces the opportunity for error is another important aspect of the change process. When processes are changed in the transfusion service, a risk analysis is required. There are a number of factors to consider in assessing the inherent risks in a process. These include the severity of the error, the frequency of the error, and the ability to detect the error. Table 4 shows one tool that is often used in healthcare, the failure modes and effects analysis (FMEA). This quality tool provides a mechanism to assess the relative risk of various possible system failures by rating each failure according to its severity, its frequency, and the ability of the fault to be detected. In Table 5, the risk priority number (RPN) is calculated by multiplying the ratings. Data from actual practice can be used to determine the frequency of a failure. In the sample FMEA, failures in specimen labeling were used to illustrate the use of an FMEA. Severity was rated at ten since such errors could lead to a hemolytic transfusion reaction. The current rate of mislabeled specimens received in the laboratory can be used to estimate the frequency. The most dangerous specimen in the transfusion service is the one that is perfectly labeled but whose identifying information does not match the identity of the patient whose blood is in the tube. A detection rating of ten was selected. One could argue that it is nine, since patient histories and delta values may clue the technologist that the specimen was mislabeled.
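
The RPN arithmetic can be sketched as follows, using the ratings from the sample FMEA in Table 5. The `FailureMode` class is purely illustrative, not part of any standard FMEA software:

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    description: str
    severity: int    # 1-10, 10 = worst consequence
    occurrence: int  # 1-10, 10 = most frequent
    detection: int   # 1-10, 10 = least detectable

    @property
    def rpn(self) -> int:
        # Risk priority number: the product of the three ratings
        return self.severity * self.occurrence * self.detection

# Ratings from the sample specimen-collection FMEA (Table 5)
wrong_band = FailureMode("Wrong ID band on patient", 10, 7, 5)
skipped_check = FailureMode("SOP not followed: tube ID not verified", 10, 10, 10)
# Post-redesign: bedside bar-coded labeling lowers only the occurrence rating
post_change = FailureMode("Tube ID not verified (after redesign)", 10, 4, 10)

print(wrong_band.rpn)     # 350
print(skipped_check.rpn)  # 1000
print(post_change.rpn)    # 400
```

Ranking failure modes by RPN points the team at the highest-risk step first; here the unverified tube ID dominates until the bedside labeling change cuts its occurrence rating.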



The FMEA identifies the potential for the new process to fail. In order to make a process safer, it is useful to perform a root cause analysis and reduce or eliminate the risk of the error occurring. To do this, the cause, rather than the symptom, of an error must be identified. For example, one cause for the wrong tube label is that staff members fail to follow the defined specimen labeling procedure. However, this is actually a symptom, and not the root cause. The root cause is that additional labeling materials are not available so that when the original label has been placed on the wrong tube, an unlabeled tube is carried to the nursing station for labeling.


Another tool for this purpose is the cause and effect diagram, or "fishbone" diagram (Figure 5). It can be used following a specific event or to assess a process change. A fishbone diagram is used to chart the major influences that affect the outcome. In our example of mislabeled specimens, we would draw a rectangle at the right of the paper and state the problem: mislabeled specimens. To the left we would create the fish bones from a central spine and large bones labeled method, manpower, material, and machinery, or policies, procedures, people, and plant. The technique requires the individuals developing the diagram to keep asking, "Why?" For example, "Why did the person mislabel the tube?" Answers may include: she used a preprinted label left over from the patient who was in the room before this patient; he handed unlabeled specimens to a clerk who misunderstood the name of the patient; or labels were generated using the wrong patient's identification and were not verified against the patient's identification band. Each of these becomes a minor bone in the fish, and the "Why" questions continue for each of the minor bones until the root cause is identified. The goal is to get at the cause, not the symptom of the cause. Some frequent causes of errors include:

* deficient procedures

* poor communication between workers

* inadequately trained workers

* conflicting interest of workers

* inadequately labeled equipment

* poorly designed equipment

* poor work practices

* unnecessary cautions and warnings

* complexity/information load

* physical requirements

* no knowledge of downstream result
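
One way to capture a fishbone's repeated "Why?" chains is as nested dictionaries, where each leaf is a candidate root cause. The cause chains below paraphrase the mislabeled-specimen example; a small walker then lists each chain:

```python
# A fishbone captured as nested dicts: each key answers "Why?" one level
# deeper; leaves (empty dicts) are candidate root causes. The chains are
# illustrative, following the mislabeled-specimen example.
fishbone = {
    "mislabeled specimens": {
        "method": {
            "leftover preprinted label used": {
                "old labels not discarded at patient transfer": {},
            },
        },
        "material": {
            "unlabeled tube carried to nursing station": {
                "no spare labeling materials at the bedside": {},
            },
        },
    }
}

def root_causes(node, path=()):
    """Yield the full "why" chain ending at each leaf of the diagram."""
    for cause, deeper in node.items():
        if deeper:
            yield from root_causes(deeper, path + (cause,))
        else:
            yield path + (cause,)

for chain in root_causes(fishbone["mislabeled specimens"]):
    print(" -> ".join(chain))
```

The leaves, not the intermediate bones, are what the risk-reduction brainstorming should target, mirroring the article's point that the symptom ("staff did not follow the procedure") is not the root cause.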


Brainstorming can then be used to generate potential methods of reducing labeling errors. The ideas generated can then be rated on a grid based on feasibility, cost, and likelihood of actually reducing the risk. While often used, risk reduction strategies such as adding an additional inspection step, (5) retraining staff, and taking disciplinary action are not the most effective strategies for reducing errors. When trained individuals are making errors, eliminating the error-prone step or using a mechanical device is more effective than modifying procedures and providing additional training. (6)

There are a number of effective risk reduction strategies. Reducing the number of steps in the process, assuring each step adds value, automating processes (for example, with rules to cancel or order tests and with auto-verification), and validated computerization of manual steps have all been used effectively. A revised FMEA can be prepared after the risk reduction strategy is implemented. In our specimen collection FMEA, producing specimen labels at the bedside should reduce the frequency of mislabeled specimens and the RPN. The severity and detectability ratings do not change, but the RPN is significantly reduced (Table 6).



Implementation of the revised work flow or the new specimen collection process involves planning, writing procedures, validation of equipment and processes, staff training and competency assessment, and evaluating the validation evidence before "go live". Validation may include running the new process in parallel with the old, and test runs of the new process can be used to train staff.

Once changes have been made, the new process should be evaluated. Documented reviews made daily, weekly, monthly, or annually may be useful, depending on the scope and effects of the change. The frequency and seriousness of errors as well as staff suggestions for changes in the new design should prompt a review. Questions to ask are: "Were the goals of the change met?" and "What are the unintended good and bad consequences of the changes?"

Graphic displays of data make it easy to see at a glance what progress is being made. Bar graphs, line charts, run charts (line graphs that monitor a single item over time), and pie charts are all useful displays. Caution should be used in selecting the scale of bar graphs and line charts: significant data can be hidden, or small changes exaggerated, by modifying the scale. Figures 6 through 8 are examples of a bar graph, line chart, and pie chart. Figure 9 displays the effect of a scale change.


Once a quality project is completed and the change is stable, the search for additional improvements in the process begins again. Pressures to decrease risk, increase productivity, and reduce turnaround time continue. Future projects may employ the tools described here, as well as other quality improvement tools found in the references and resource materials.




ABBREVIATIONS: DMAIC = Define, Measure, Analyze, Improve and Control; FMEA = failure modes and effects analysis; MSBOS = maximum surgical blood order schedule; PDCA = Plan, Do, Check, Act; RPN = risk priority number; SPC = statistical process control

INDEX TERMS: quality assurance; quality tools; risk analysis; workflow.


1. Joseph PJ. Design a Lean laboratory layout. MLO 2006;38:24-31.

2. Joseph PJ. Design Lean work cells. MLO 2006;38:24-32.


(1.) Brassard M, Ritter D. The Memory Jogger II. Methuen MA: GOAL/QPC; 1992.

(2.) Brassard M, Ritter D, Finn L, Ginn D. Six Sigma Memory Jogger II. Methuen MA: GOAL/QPC; 2005.

(3.) Rath & Strong Management Consultants. Rath & Strong's Six Sigma pocket guide. Lexington MA: AON Consulting Worldwide; 2000.

(4.) Tapping D, Shuker T, Luyster T. Value stream management. New York NY: Productivity Press; 2002.

(5.) Craig DJ. Stop depending on inspection. Quality Progress 2004;37:39-44.

(6.) Rooney JR, Vanden Heuvel LN, Lorenzo DK. Reduce human error. Quality Progress 2002;35:27-36.

The Focus section seeks to publish relevant and timely continuing education for clinical laboratory practitioners. Section editors, topics, and authors are selected in advance to cover current areas of interest in each discipline. Readers can obtain continuing education credit (CE) through P.A.C.E.[R] by completing the continuing education registration form, recording answers to the examination, and mailing a photocopy of it with the appropriate fee to the address designated on the form. Suggestions for future Focus topics and authors and manuscripts appropriate for CE credit are encouraged. Direct all inquiries to the Clin Lab Sci Editorial Office, IC Ink, 858 Saint Annes Drive, Iowa City IA 52245. (319) 354-3861, (319) 338-1016 (fax).

Suzanne H Butch MA CLDir(NCA) is administrative manager, University of Michigan Hospitals and Health Centers, Ann Arbor MI.

Linda A Smith PhD CLS(NCA) is the Focus: Immunohematology guest editor.

Address for correspondence: Suzanne H Butch MA CLDir(NCA), administrative manager, University of Michigan Hospitals and Health Centers, UH 2F22510054 1500 East Medical Center Drive, Ann Arbor MI 48109-0054. (734) 936-6861, (734) 935-6855 (fax).
Table 1. Quality improvement tool applications

Quality improvement   Decisions   Describe   Cause      Develop       Monitor
technique                         problem    analysis   action plan   progress

Brainstorming             X                     X            X
Flow chart                            X
Process diagram                       X                      X
Check sheet               X           X         X
Pareto chart              X           X                      X            X
Pie chart                             X                                   X
Run chart                             X                      X            X
Fishbone diagram                                X

Table 3. Decision grid

Idea                            Feasibility   Cost   Likelihood     Total
                                                     of improving

Unacceptable specimens               5          2         2           20
Laboratory workflow                  5          3         5           75
MSBOS for blood orders               3          2         3           30
Telephone calls from the OR          2          2         1            4
Electronic crossmatch                5          2         4           40
Laboratory ambassador program        5          5         3           75

Scale 1-5 with 1 being low and 5 being high

Table 4. FMEA rating scale

Rating     Severity (the failure will/may ...)             Occurrence
10 (Bad)   Injure a customer or employee                   More than once per day
9          Be illegal / violate a regulatory requirement   Every three to four days
8          Render product or service unfit for use         Once per week
7          Cause extreme customer dissatisfaction          Once per month
6          Result in partial malfunction                   Once in three months
5          Cause a loss of performance likely to           Once in six months
           result in a complaint
4          Cause minor performance loss                    Once per year
3          Cause a minor nuisance, no loss                 Once every one to three years
2          Be unnoticed, minor effect on performance       Once every three to six years
1 (Good)   Be unnoticed, no performance effect             Once every six to 100 years

Rating     Probability       Detection
10 (Bad)   >30%              Not detectable
9          <30%              Occasional units checked for defects
8          <5%               Units are systematically sampled and inspected
7          <1%               All units are manually inspected
6          <0.03%            Manual inspection with mistake-proofing modifications
5          1/10,000          Process is monitored through statistical process
                             control (SPC) and manually inspected
4          6/100,000         SPC used with an immediate reaction to
                             out-of-control conditions
3          6/million         SPC as above with 100% inspection surrounding
                             out-of-control conditions
2          <3/100 million    All units are automatically inspected
1 (Good)   <2/billion        Defect is obvious and can be kept from
                             affecting the customer

Table 5. FMEA: specimen collection

All failure modes below share the same process step (specimen collection),
potential effect of failure (hemolytic transfusion reaction), and severity
rating (10).

Potential failure          Potential cause     Occurrence  Current     Detection   RPN   Recommended
mode                                                       controls                      action

Wrong ID band              ID not verified         7       Procedures      5       350
on patient

ID band removed,           SOP not followed        6       Procedures      5       300
replaced

SOP not followed: no       Active decision        10       Procedures     10      1000   Bar-coded
verification of tube ID    to skip step                                                  specimen
                                                                                         labeling at
                                                                                         the bedside

Post-specimen collection redesign FMEA

SOP not followed: no       Active decision         4       Procedures     10       400
verification of tube ID    to skip step

Table 6. Improvement in mislabeled specimen errors after intervention

Mislabeled specimens      Pre-change   One month     Three months
                                       post-change   post-change

Wrong label                   25            3             0
Medical record number         50           45            47
Name error                    27            6             4

Figure 8. Pie chart illustrating mislabeling causes

Wrong label             14%
Medical record number   68%
Name error              18%
COPYRIGHT 2007 American Society for Clinical Laboratory Science

Author: Butch, Suzanne H.
Publication: Clinical Laboratory Science
Date: Mar 22, 2007

