
An operational approach to competency assessment.

The requirements of CLIA and JCAHO have focused labs' attention on the need for a formal system of competency assessment and documentation.[1,2] According to CLIA's standard for personnel assessment (§493.1713), "The laboratory must have an ongoing mechanism to evaluate the effectiveness of its policies and procedures for assuring employee competence and, if applicable, consultant competence."[1] Although most labs have evaluated the performance of their employees as a matter of good practice for a long time, many have not had an objective system in place that satisfies regulatory agencies as well.

A basic tenet of good laboratory practice is that individuals may perform only those duties for which they have demonstrated ability. Continuing assessment is only one portion of a continuum of steps to ensure competence. Other portions include initial training and education, appropriate selection of individuals for jobs, specific orientation and training for those jobs, and continuing education and inservice.

How our program works

We set out to design a program that would take advantage of methods and tools already in place while adding as few new assessments as possible. We also wanted to ensure the ongoing competence of all staff, from nontechnical personnel to senior professionals. Where possible, our assessment techniques were designed to minimize extra work, to be nonthreatening and acceptable to our staff, and to be ongoing rather than episodic. Where we could, we built documentation of assessments and remedial measures directly into the competency procedure.

Compared with other hospital programs based on testing or observing individuals, our competence program relies on aggregate data: in our case, data collected for a large group of individuals, mainly through our proficiency survey program and daily quality control. Only when a problem is found is it necessary to examine these data to identify an individual. In some instances (for example, to test individual cognitive skills such as cytologic screening or surgical pathology diagnosis), we conduct individual assessments using unknown specimens.

CAP proficiency surveys. We have modified our review of proficiency testing (PT) results to incorporate an assessment of testing and clerical competency. When we receive the final results of the proficiency survey, the appropriate supervisor reviews this information and notes unacceptable numbers, due to analytical or clerical error, on the survey face sheet. This cover sheet includes basic documentation (survey name, managers' names, dates) and a space for problems, solutions, and trends.

If we find errors, we use a simple checklist to analyze and document causes [ILLUSTRATION FOR FIGURE 1 OMITTED]. In it, the supervisor answers, "If unacceptable responses are due to human factors, is there a trend or pattern related to the competency of an individual?" If competency is an issue, the supervisor takes and documents appropriate measures (e.g., counseling).

Daily QC. As part of our normal QC, we review data periodically, usually monthly, to detect out-of-control trends. In addition to investigating causes of excessive variation due to reagent and instrument problems, we consider human factors. If we find such factors related to an individual's competence, the supervisor takes and documents appropriate measures. Counseling is a good example of such a response.
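The kind of review described above can be illustrated with a simple control-limit check. This is a minimal sketch, not our laboratory's actual software; the rule (flag any result beyond two standard deviations of the target) and the numbers are illustrative assumptions.

```python
# Minimal sketch of an out-of-control check for daily QC data.
# The 2-SD rule and the values below are illustrative assumptions,
# not the article's actual QC configuration.

def out_of_control(results, target_mean, target_sd, limit_sd=2.0):
    """Return indices of QC results outside target_mean +/- limit_sd * SD."""
    return [i for i, value in enumerate(results)
            if abs(value - target_mean) > limit_sd * target_sd]

# A month of hypothetical control results for one analyte:
january = [100.1, 99.2, 101.3, 110.4, 100.8]
flagged = out_of_control(january, target_mean=100.0, target_sd=2.0)
print(flagged)  # [3]: only the fourth result exceeds the 2-SD limit
```

A flagged index would prompt the supervisor to ask whether the deviation traces to reagents, instruments, or a human factor, as described above.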

Examining unknowns. Cytotechnologists and pathologists are challenged through the use of slides. We participate in the following programs:

* ASCP CheckPath Surgical Pathology, Hematopathology, and Cytology

* ASCP Check Samples in Anatomic Pathology

* CAP PIP (Performance Improvement Program) in Surgical Pathology, Autopsy Pathology, and Cytology

* AFIP (Armed Forces Institute of Pathology) HQAP (Histopathology Quality Assessment Program), a program for VA and military labs that uses unknown slides

* VA cytology proficiency slides, which use actual case material representative of a lab's daily workload.

The senior professional staff member responsible for the quality assurance (QA) program maintains data for each individual in these programs.

Direct observation. Our phlebotomists are observed semiannually by peers or supervisors. We use a checklist to guide our assessment and maintain documentation.

Consultations and second review of cases. As a matter of good practice, most pathologists seek consultation from each other as well as from outside experts. For surgical pathology cases in our lab, this averages from 10% to 15% of cases. We compare the cytologic or surgical pathology provisional diagnoses with the consultant's diagnoses. In other cases, we may review an earlier diagnosis when we obtain subsequent material from a patient.

When a case receives a second review or is seen in consultation, the reviewing pathologist prepares a consultation report. We document these second reviews and consultations as part of the case's SNOMED coding (new procedure code numbers 0656 and 0657) and maintain them in the lab computer system. We summarize, analyze, and review these data quarterly, bringing discrepancies to the attention of the individual and his or her professional director.

In addition, a group of representative surgical pathology and electron microscopy cases is sent to AFIP and the VA Electron Microscopy group for quarterly review. This review includes an assessment of technical quality, report readability, completeness, and correct diagnosis.

Delta checks. Our lab computer reports delta checks - significant changes in consecutive lab values for individuals - to the instrument operators at the time of analyses. Periodically a supervisor reviews the list of delta checks captured by the computer for appropriateness of technologist response (such as repeat testing of a sample). This method has proved effective in detecting instances of specimen mix-up or mislabeling. When we detect a technologist problem, such as apparent failure to follow confirmation procedures, the supervisor initiates and documents appropriate remedial action.
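The delta-check logic itself is simple to sketch. The percent-change rule and the 25% threshold below are illustrative assumptions; real systems use analyte-specific limits.

```python
# Sketch of a delta check: flag a result that differs from the patient's
# previous value by more than an allowed percentage. The 25% default and
# the potassium example are illustrative, not actual clinical limits.

def delta_check(previous, current, max_percent_change=25.0):
    """Return True if the change between consecutive results exceeds the limit."""
    if previous == 0:
        return False  # no usable baseline; handle per local policy
    percent_change = abs(current - previous) / abs(previous) * 100.0
    return percent_change > max_percent_change

print(delta_check(4.1, 6.0))  # True: ~46% jump, worth checking for a mix-up
print(delta_check(4.1, 4.3))  # False: ~5% change, within the limit
```

A True result is what the computer reports to the operator at the time of analysis; the supervisor's periodic review then checks whether the technologist responded appropriately (for example, by repeating the sample).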

Phoning critical values. Our computer system records critical values that have been flagged. The supervisor reviews the documentation of critical-value phone calls stored in the lab computer system, verifying that technologists responded appropriately.
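A critical-value flag of the kind the computer records can be sketched as a simple range check. The analytes and limits below are hypothetical placeholders, not our actual alert table.

```python
# Sketch of critical-value flagging. The limits below are hypothetical
# placeholders, not real clinical alert thresholds.

CRITICAL_LIMITS = {
    "potassium": (2.8, 6.2),   # (low, high) in mmol/L
    "glucose": (40.0, 450.0),  # (low, high) in mg/dL
}

def is_critical(analyte, value):
    """Return True if the result falls outside the critical-value range."""
    low, high = CRITICAL_LIMITS[analyte]
    return value < low or value > high

print(is_critical("potassium", 6.5))  # True: above the high limit, phone it
print(is_critical("glucose", 100.0))  # False: within range
```

Each flagged result generates the record that the supervisor later reviews for documentation of the phone call.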

Accurate transcription. The pathologist reviews pathology reports before release. This is an effective review of the transcriptionist's ability. If the pathologist finds a problem, he or she brings it to the attention of the transcriptionist. Because of the closeness of the department, the supervisor quickly becomes aware of persistent problems.

Written quizzes. Each year, the lab manager gives the staff a written quiz covering general lab policies and procedures. This quiz is primarily educational; documentation of scores is not maintained. Supervisors review incorrectly answered questions with staff to ensure employees understand and comply with policies and procedures.

Ancillary testing. We apply the same procedures for reviewing PT and QC data to testing in ancillary sites.

Simple and nonthreatening

Figure 2 lists the assessment tools used in our program. As we set up the program, both accreditation consultants familiar with the intent of the agencies' requirements and our employees' union helped us develop an effective, simple-to-administer program.

Figure 2

Assessment tools

* CAP proficiency surveys

* Daily quality control

* Examination of unknowns

CheckPath
AFIP HQAP
Check Sample
VA cytology proficiency slides

* Direct observation of phlebotomists

* Consultations and second review of cases

Cytology
Surgical pathology
AFIP review of surgicals
Correlation of frozen section diagnosis vs. final pathologic diagnosis
Correlation of cytology with surgical pathology specimen report

* Review of reports typed by transcriptionist

* Review of delta checks

* Review of critical values calls

* Written quizzes


1. Medicare, Medicaid, and CLIA Programs; Regulations Implementing the Clinical Laboratory Improvement Amendments of 1988 (CLIA), Department of Health and Human Services, Health Care Financing Administration. Federal Register. Feb. 28, 1992; 57: 7184.

2. Joint Commission on Accreditation of Health Care Organizations. 1996 Comprehensive Accreditation Manual for Hospitals. Oakbrook Terrace, Ill: JCAHO; 1995; 389, 395-396.

A member of MLO's Editorial Advisory Board, Daniel M. Baer is professor of pathology at Oregon Health Sciences University and the Veterans Affairs Medical Center in Portland, Ore.
COPYRIGHT 1997 Nelson Publishing

Author: Baer, Daniel M.
Publication: Medical Laboratory Observer
Date: Feb 1, 1997

