
Pitfalls to avoid in bringing a new analyzer on line.

When a brand-new instrument gathers dust, poor planning may be the culprit. Follow these tips to prevent cobwebs from forming.

Have you ever acquired an expensive laboratory analyzer only to find installation endlessly delayed by red tape as the manufacturer's one-year warranty expires and the reagents bought to go with it approach their expiration dates? Do trainees forget the basic rules of operation before being given a chance to put them to use? Are technologists demoralized because the instrument they were promised remains unusable? Are months of depreciation expenses being charged to your laboratory budget?

If so, you are not alone. It is not uncommon for analyzers to remain fully or partially unused for six months, a year, or longer.

In general, laboratorians give careful thought to selecting analyzers. They devote a great deal of time and attention to the manufacturer's contract before signing. Yet poor planning prevents the new analyzer from being brought on line when it arrives, causing enormous expense to the laboratory. The solution is to include implementation planning in the instrument selection process.

In this article I will describe some common implementation pitfalls, most of which I have experienced personally. I will then present an approach that can help you prevent some of them and deal with the rest.

* Typical problems. While some difficulties originate with the manufacturer, most occur within the laboratory and may be longstanding. The problems discussed below are summarized in Figure I.

* Supervisor turnover. A change in supervisory personnel may delay analyzer implementation for a prolonged period. The departing supervisor doesn't want to commit time to the process. That's not all bad; a departing supervisor should not take the leading role. The new supervisor will start the job by focusing attention on learning the lab's existing policies, procedures, and personnel. Instituting anything new will take lower priority. That's not bad, either; getting accustomed to both a new supervisor and a new analyzer at the same time may be too traumatic for the technical staff, for whom security and stability should be maintained.

* Competing priorities. All too often, a new analyzer arrives at a time when more urgent priorities supersede implementation of the instrument. A laboratory preparing for inspection by a regulatory or accrediting agency or responding to detected deficiencies after such an inspection may not be able to spare the managerial resources to devote to a new project. Similarly, major hospital or laboratory projects such as budget preparation and training programs may pull resources away from implementation of the new analyzer. These competing priorities, however, are usually known in advance and should be dealt with accordingly. Crisis management should not be the status quo.

* Manufacturer's problems with installation. We recently experienced a significant delay in the installation of a large chemistry analyzer, a new model, while the manufacturer identified and corrected problems with it. Someone at the company, it seemed, had slipped in a software upgrade--without making us aware of it. Experts were flown in; when they flew out, they still didn't understand what was wrong. Finally the head of the division at the company connected the problem with the software. Meanwhile, our old instrument was dying.

Tight schedule deadlines for analyzer implementation went unmet. The analyzer being replaced had major recurrent mechanical problems during this period. Additional key projects, including implementation of other analyzers, were deferred.

Even without these problems, the original schedule we had developed in consultation with the manufacturer was based on an ideal situation. A realistic schedule should have included contingency plans allowing for delays. About 90% of the time, tasks for which the company had allotted a day or two actually required four days.

* Inadequate staffing. It is unreasonable to engage in new activities, such as analyzer implementation, when too large a proportion of the staff is away. In one laboratory where I worked, most of the technologists took their vacations during the summer and when their children were home from school on holidays. That's not an unusual situation, but even where it isn't the case, laboratory sections often experience personnel shortages when more than one technologist is on medical or parental leave simultaneously. That's particularly true in small laboratories.

In some settings, a chronic staffing shortage makes it impossible to dedicate the necessary resources to new training. If an emergency suddenly arises--the old equipment actually falls apart, for example--it may be necessary to employ temporary staff members while the existing staff learns the ropes on the new instrument.

* Training time. Ease and speed of training are important considerations for those deciding which instrument to obtain. Training time for operators depends on the complexity of the analyzer and may vary from a day or two, in which case training the entire technical staff is fairly easy to accommodate, to a week or two, a far more lengthy and expensive proposition.

* Computer interface. No installation problem seems so mysterious as the computer interface. You can't plug a computerized instrument into a wall as easily as a television set into an electrical outlet. Recently, we were told that our computer system couldn't store our new interface until we deleted old files. This led to several days' delay as we coordinated activities between our in-house computer specialist, who had other commitments, and the software vendor. Having that information earlier would have saved time and aggravation.

Most laboratories purchase the interface from the computer vendor. Occasionally a laboratory neglects to budget for the interface and must apply for additional funds from the appropriate committees. Even when planning was done, however, things don't always work out smoothly. Last year, for example, we received an estimate for an interface. Three months later, when we placed our order, the price had increased by $1,000. Considering the way prices have escalated, it wouldn't be a bad idea to build a cushion into the quoted price. If you are very lucky, the vendor will warn you before the price goes up.

Scheduling implementation and testing for the interface can take several months. Software vendors commonly face heavy demand at certain times of the year, such as at year's end, when manufacturers' sales forces are striving to meet their quotas. Beyond that, it may take months for the software vendor to modify the interface to fit the individual laboratory's needs.

Making such modifications requires time and communication between the laboratory, in-house computer specialist, and computer software vendor. Last year, for example, the laboratory where I was working obtained a new hematology analyzer capable of performing an automated five-part differential. We needed to have the interface made compatible with the program for our existing backup analyzer, which provided a three-part differential. In the chemistry section, we wanted to be able to transmit bilirubin results of less than 0.1 mg/dl rather than to have every 0.0 mg/dl result flagged as an error, thus holding up the entire system. In addition, as a cost-effective measure, we opted not to perform total bilirubin but to perform conjugated and unconjugated bilirubin on all specimens for which bilirubin was ordered. Explaining all this to the vendor and obtaining the modifications we needed took several months.
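
To make the bilirubin example concrete, here is a minimal sketch of the kind of result-translation rule we asked the vendor to build into the interface. The function, threshold, and formatting are my own illustration, assuming a simple rule-based translation layer; they are not the vendor's actual specification.

```python
# Illustrative sketch: report bilirubin values below the detection limit
# as "<0.1 mg/dl" instead of flagging them as errors that hold up the system.
# The threshold and formatting are hypothetical, not a vendor specification.

BILIRUBIN_DETECTION_LIMIT = 0.1  # mg/dl

def format_bilirubin(result_mg_dl):
    """Translate a raw analyzer result into the value sent to the host computer."""
    if result_mg_dl < BILIRUBIN_DETECTION_LIMIT:
        return "<0.1 mg/dl"          # below detection limit -- no error flag
    return f"{result_mg_dl:.1f} mg/dl"

print(format_bilirubin(0.0))   # <0.1 mg/dl
print(format_bilirubin(1.2))   # 1.2 mg/dl
```

Spelling out even a rule this simple in writing shortens the back-and-forth with the vendor considerably.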

It is essential to schedule time with in-house computer specialists as far in advance as possible. They have other priorities that may well take precedence over implementation of a new instrument interface.

In one laboratory, we opted for a soon-to-be released bidirectional interface instead of obtaining the unidirectional interface that was then available. A year later, we were still entering more than 1,000 results manually each day because the bidirectional interface still wasn't ready for release.

Eventually the bidirectional interface was released, and worked. The laboratory had no legal recourse against the manufacturer regarding the delay, however, because we had accepted its verbal good faith claim. Lesson learned: Don't be afraid you'll offend the manufacturer by asking for all such promises in writing. Companies are much more willing to provide such documentation, including price guarantees, than they were in the past.

Information concerning control ranges, patient reference ranges, physician alert values, technical limits, verify and repeat limits (if applicable), and "canned" text must be entered into the computer system. This routine must be coordinated with actual implementation of the analyzer.
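
As a rough illustration of how much test-definition data is involved, here is a hedged sketch of a single test's file-building entries. The layout, field names, and values are invented for this example; every laboratory computer system has its own file-building screens and terminology.

```python
# Hypothetical file-building record for one test. Values are illustrative only.
bilirubin_total = {
    "control_ranges": {"level_1": (0.6, 1.0), "level_2": (4.5, 5.5)},  # mg/dl
    "reference_range": (0.2, 1.2),     # adult patient reference range, mg/dl
    "alert_value": 15.0,               # physician alert (critical) value
    "technical_limits": (0.0, 30.0),   # results outside are instrument errors
    "verify_repeat_limit": 10.0,       # repeat before reporting above this value
    "canned_text": {
        "H": "Result above reference range.",
        "ALERT": "Critical value -- physician notified.",
    },
}

# A simple go-live check: confirm every field has been populated for the test.
assert all(value is not None for value in bilirubin_total.values())
```

Multiply that record by every test on the new analyzer, and the need to coordinate data entry with the implementation schedule becomes obvious.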

* Space. New analyzers are often larger than their predecessors. Analyzers are carefully padded and packed in crates. The shipped package may assume dimensions that cannot be accommodated by elevators or doorways. An institution near ours rents expensive cranes for moving large analyzers through a removed window. Each time large equipment needs to be moved in or out, they schedule the crane and budget accordingly.

Service representatives may need access through the top, back, or sides of the analyzer at any time during its life in the institution. Space must be provided to allow for such access and to permit exchange of boards and other components. When space must be modified after the analyzer arrives, unexpected and unbudgeted expense and delays usually occur.

In one laboratory, the building was unable to support the weight of an instrument at the location that was originally selected for it. An alternative site had to be found. Fortunately, this information was available long before delivery. A computer person was aware that the small addition on the building in which the instrument was to be housed lacked the necessary structural support. The lesson: If you're suspicious about a potential problem, fight the urge to ignore it. Following up may prevent headaches later.

* Utilities. Too often, water, electricity, ventilation, and lighting take low priority in planning. If not properly addressed, however, these and other ancillary supports can become an Achilles' heel for the entire project.

Some of the newer analyzers use continuously washed reusable cuvets rather than disposable ones. For this feature, a clean water hookup is required. A water filter company must be hired in advance to install the system. When this happened to us, we encountered a delay despite good planning. Although we had the water system installed before the analyzers arrived, it turned out that the wrong connectors had been used.

Another water-related issue is the installation of an appropriate drainage system. Drains must be at a height convenient for connection to the analyzer.

Some analyzers require dedicated 220-volt electrical lines. In one case, we notified the electricians the day after we became aware of these power requirements and a week before the scheduled installation. The electricians notified us that they could schedule the required work in one month's time. We resolved the problem by using an outside electrical contractor and paying for the unanticipated service from our general laboratory budget. In another case, electrical malfunctions kept recurring until a "line tamer"--an unbudgeted item--was purchased at a cost of approximately $1,000. We learned that for large expensive analyzers, line protectors are essential.

Virtually no attention tends to be devoted to ventilation. Large analyzers, especially in small enclosed rooms, often generate more heat than the ventilation system can accommodate. Even in larger rooms, enough equipment, refrigerators, and freezers can overload a ventilation system, especially on hot summer days. Early consultation with the facility's mechanical engineer can eliminate this problem. The engineer will need to know the number of British Thermal Units (BTUs) produced by each piece of electrical equipment in the laboratory space.
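
The arithmetic behind that figure is straightforward. The sketch below converts nameplate wattage to BTUs per hour using the standard conversion of roughly 3.4 BTU per hour per watt; the equipment list and wattages are made-up examples, not measurements from any actual laboratory.

```python
# Rough heat-load estimate for the mechanical engineer.
# Wattages below are illustrative, not measured values.

WATTS_TO_BTU_PER_HR = 3.412  # standard conversion: 1 watt is about 3.412 BTU/hr

equipment_watts = {
    "chemistry analyzer": 2200,
    "hematology analyzer": 800,
    "refrigerator": 350,
    "freezer": 450,
}

total_btu_hr = sum(equipment_watts.values()) * WATTS_TO_BTU_PER_HR
print(f"Estimated heat load: {total_btu_hr:,.0f} BTU/hr")
# The engineer compares this figure with the room's cooling capacity.
```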

In one laboratory, an analyzer was placed against a wall. Overhead lights, however, existed only in the center of the ceiling; lighting for use of the analyzer was therefore inadequate. Special funds had to be requested and allocated before the maintenance department could add the needed lights.

* Correlation studies. A correlation study should have defined endpoints. Too often, laboratories let correlation studies drag on for many months with no organized pattern of activity and no predefined endpoints. The National Committee for Clinical Laboratory Standards (NCCLS) has published excellent protocols on performing and evaluating correlation studies, including "User Comparison of Quantitative Clinical Laboratory Methods Using Patient Samples" (EP9-P, 1985). Membership in NCCLS is a valuable resource for laboratories.
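
To show what a defined endpoint looks like in practice, here is a simplified sketch of a method-comparison calculation on paired patient results. It uses ordinary least-squares regression only; the full EP9 protocol adds requirements for sample numbers, duplicate measurements, and outlier checks. The data and acceptance limits are hypothetical.

```python
# Simplified method-comparison sketch: old analyzer vs. new analyzer.
# Paired patient results below are illustrative values only.
import numpy as np

old_method = np.array([4.2, 5.6, 7.1, 9.8, 12.4, 15.0, 18.3])
new_method = np.array([4.4, 5.5, 7.3, 10.1, 12.1, 15.4, 18.0])

slope, intercept = np.polyfit(old_method, new_method, 1)
r = np.corrcoef(old_method, new_method)[0, 1]
mean_bias = np.mean(new_method - old_method)

print(f"slope={slope:.3f} intercept={intercept:.3f} r={r:.4f} bias={mean_bias:+.2f}")
# Compare slope, intercept, and bias against acceptance limits defined before
# the study began -- that is what gives the study a true endpoint.
```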

Some manufacturers now provide technical assistance for performing correlation studies over a period of several days or even a week. These studies must be defined in advance and directed to fit the laboratory's expectations for controls and specimen selection. Recently, one manufacturer performed correlation studies and presented the results late on a Friday afternoon--immediately before leaving. Numerous problems were quickly apparent to us. We were left to resolve them.

Our expectation that the manufacturer would complete the studies proved unrealistic. We had failed to define, in sufficient detail, the nature of the studies the manufacturer was to perform.

* Inadequate clinical input. A laboratory purchased a whole blood gas analyzer. The laboratory director spent extra money for a channel that would measure ionized calcium because she believed in the clinical utility of those studies. Unfortunately, she did this without contacting the clinicians she assumed would want that test available. They did not share her enthusiasm. Two years later, the ionized calcium channel remains unused. Moral: Ask the people who will actually use the equipment and the results before building in bells and whistles they may never use.

There is a natural tendency to suspect that hematology analyzers designed to perform automated differentials can't perform as well as advertised. These analyzers allow the user to set limits for flagging specimens for manual review. At first, operators may flag far too much for review; the result is a substantial waste of their time. Involvement by hematologists and other clinicians is essential to tighten the criteria and overcome these suspicions, allowing the instrument to do its job.
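
As a rough illustration, the sketch below shows what user-set review criteria might look like. The thresholds are invented for this example and should in practice be set with the hematologists and clinicians who will act on the results.

```python
# Hypothetical review-flag criteria for an automated differential.
# Thresholds are illustrative, not recommended values.

def needs_manual_review(diff, instrument_flags):
    """Return True if a specimen should be pulled for a manual differential."""
    if instrument_flags:            # any instrument suspect/morphology flag
        return True
    if diff["blasts_suspect"]:      # never auto-report suspected blasts
        return True
    # Loosely set numeric limits send far too many specimens to review;
    # limits tightened in consultation with clinicians let normal
    # specimens verify automatically.
    return not (40 <= diff["neutrophils_pct"] <= 80
                and diff["lymphocytes_pct"] <= 50)

specimen = {"neutrophils_pct": 62, "lymphocytes_pct": 28, "blasts_suspect": False}
print(needs_manual_review(specimen, instrument_flags=[]))   # False -- auto-verified
```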

* Training. The last obstacle to be overcome before the new analyzer can be used is training the technical staff. Choose carefully when selecting the technologist to attend the manufacturer's training course. The technologist selected must demonstrate an interest in learning and an ability to teach others in the laboratory.

Even a good choice can backfire. Two technologists we once sent for training returned so enthusiastic about our new analyzer that they quit the laboratory and became employees of the manufacturer. The laboratory was left not only with two new vacancies but also with no technologists who had advanced training on the equipment.

Sometimes employees return after the training session too overwhelmed with new material to be able to lead their coworkers. Technologists returning from advanced training need time to "play" with the analyzer before they engage in teaching sessions themselves. They must be given enough time to prepare the procedure manual. NCCLS document GP2-A, "Clinical Laboratory Procedure Manuals" (1984), can serve as a useful guideline. Without management's commitment to support all these activities, the project is doomed to stall.

Training centers of some manufacturers are booked months in advance. If it is essential to have technologists with advanced training in the laboratory before installation, scheduling should be done far ahead of that date. Hold an alternative training date in reserve in case delivery or installation is delayed.

Most manufacturers are willing to provide limited on-site training to facilitate implementation. This too must be scheduled in advance. The local training representative will have competing obligations to satisfy.

* Avoiding the pitfalls. For the most part, if you can identify a pitfall, you can avoid it. Yet it is surprising how often the same errors are made. The steps listed below are summarized in Figure II.

* Talk with other users. Identify other users of the same analyzer and same software vendor to learn from their experiences. Visit and talk with as many as you consider reasonable.

Be prepared with a list of both specific and open-ended questions. Discover what they did right and what they would do differently if they were doing it all again.

Manufacturers have always willingly provided lists of users whenever asked. At least three users should be contacted--more, if the responses to your questions are inconsistent.

Talk with colleagues in your lab who have had experience with instrument installations, including technologists who have worked at other facilities. Learn who might establish roadblocks. Anticipating such roadblocks gives you time to detour around or to bulldoze through the obstacles. Indeed, this entire exercise is based on the concept that identifying and planning to avoid roadblocks will permit smooth installation of a new analyzer.

* Coordinate activities. Share information concisely and promptly with all departments and individuals who need to know it. Before the final decision to purchase is made, everyone involved should be informed of the selection and given one more opportunity to comment. These include the in-house computer specialist, mechanical engineer, purchasing agent, administrator, laboratory manager, medical director, medical staff representative, and section supervisor. Only then should the manufacturer's representative receive the good news.

Someone with leadership ability--a section supervisor, laboratory manager, or medical director, for example--should act as group leader. That person will provide a focal point for coordinating activities and keeping the project on track.

Successful implementation is a multidisciplinary project, not a laboratory project alone. Schedules should list assignments and who is responsible for them. Meetings conducted as needed will keep everyone informed and involved. Between meetings, memoranda will report progress so that problems can be resolved quickly and community interest maintained. Extensive communication will encourage people to provide assistance when roadblocks are encountered.

* Don't rely on one individual. It is a mistake to rely totally on one person. All data should be shared. The ability to carry out any critical function should not be restricted to any one person. Otherwise the project is doomed to stall or fail if that individual should leave, become ill, or lose interest. At least one "understudy" should be assigned to all significant aspects of the project.

* Schedule installation to suit. Don't be intimidated by manufacturers' representatives who push you to install their analyzers immediately. If you anticipate a period of competing priorities, insufficient staffing, or unavailability of personnel important to the project, such as your computer specialist, postpone the installation to a more convenient time.

* Plan the interface early. Begin the dialog with your computer software vendor during the selection process or as soon as the instrument has been selected. Define the scope, schedule, and estimated costs of the project. To accelerate testing and installation of the interface, waste no time in stating any special modifications you will need.

* Design the space. Establish space and utility requirements. Make sure adequate provision has been made for delivery and physical access of the crated analyzer. Meet with the mechanical engineer to determine water, electricity, and ventilation needs.

* Outline a training program. Training on the use of new instruments is rarely sufficient. Don't let that be your stumbling block. Ten hours spent preparing a training manual can save 20 hours of training time. Construct checklists to assure that training of all operators will cover the same information.

Write periodic updates containing tips on maintenance, operation, and troubleshooting. Post charts to assist explanations of instrument use and new values and codes. Institute a vigorous program for training and retraining to maintain and advance technical skills. In all aspects, work with the manufacturer.

* Plan correlation studies with endpoints. Budget for correlation studies, and include contingency amounts for repeating portions of the studies if the initial results identify deficiencies.

* Budget for the unexpected. Always provide for contingency items. Nearly every instrument implementation requires the expenditure of additional monies to resolve an unanticipated problem. Examples include the sudden need for a new benchtop, electrical line, or surge protector; extra calibrators or consumables; modifications in the interface or mechanics of the analyzer; and airfare to the training course. To cover yourself, you might build into your total budget an amount equivalent to several percentage points of the original instrument purchase.
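
As a quick worked example of that rule of thumb, the sketch below sets aside a hypothetical five percent of a hypothetical purchase price; neither figure is a recommendation for any particular laboratory.

```python
# Worked example of a contingency reserve. Figures are hypothetical.
purchase_price = 120_000          # analyzer purchase price, dollars
contingency_rate = 0.05           # "several percentage points" -- here, 5%

contingency_reserve = purchase_price * contingency_rate
total_budget = purchase_price + contingency_reserve

print(f"Contingency reserve: ${contingency_reserve:,.0f}")   # $6,000
print(f"Budget submitted:    ${total_budget:,.0f}")          # $126,000
```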

* Celebrate success. Nothing engenders continued support more forcefully than appreciation of the team that made the project succeed. As each important step is completed, stop and celebrate. Bring in a box of doughnuts. Send a memo to top brass in which you compliment specific staff members for their dedication. In performance evaluations, cite contributions each person made during the implementation. When you summarize quality assurance measures for in-house reports and those for the JCAHO, include improvements derived from the installation that went so smoothly thanks to superb planning and continuous monitoring.

Figure I

10 common pitfalls of instrument installation

Supervisor turnover
Competing priorities
Manufacturer's problems with installation
Understaffing
Computer interface
Insufficient space
Unacceptable utilities
Lack of correlation studies
Clinical input unsolicited
Too little training

Figure II

10 ways to sidestep installation pitfalls

Visit users elsewhere
Coordinate all activities
Don't rely on one individual
Match installation schedule to laboratory schedule
Plan the computer interface early
Design the space
Create a solid training program
Plan correlation studies with endpoints
Budget for the unexpected
Celebrate success

The author, Harvey W. Kaufman, M.D., is a staff pathologist at New England Memorial Hospital in Stoneham, Mass., and at Union Hospital in Lynn, Mass.
