
General quality practices in gynecologic cytopathology: findings from the College of American Pathologists Gynecologic Cytopathology Quality Consensus Conference Working Group 3.

Cytologic evaluation of cervicovaginal material via the Papanicolaou (Pap) test is generally considered the most successful laboratory test for cancer screening and prevention. Since the introduction of the Pap test in the 1950s, the incidence of cervical cancer has been estimated to have declined as much as 70%. There are many long-standing quality monitors that are intended to ensure optimal performance of cytopathology laboratories and their cytotechnologists and pathologists involved in interpreting cervical cytology in the United States. The majority of these monitors are derived from requirements by accrediting agencies and by the Clinical Laboratory Improvement Amendments of 1988 (CLIA '88). (1) Although most of these monitors are based on tradition and empirical good intentions, laboratories have never had an opportunity to provide feedback as to which monitors are useful and which are not. Additionally, there is no established standard or expectation as to how results of these monitors are used or should be used to improve quality. The College of American Pathologists' national survey of quality practices in gynecologic cytopathology and Gynecologic Cytopathology Quality Consensus Conference (GCQC2), projects funded by a contract from the Centers for Disease Control and Prevention, provide a unique opportunity to address this gap and gain a national perspective on current laboratory practices and opinions in this area. The purpose of this report is to review practices that are typically part of a general quality program in cytopathology laboratories and to provide statements regarding good laboratory practices based on the survey data, published studies, and expert opinion presented and voted on at the GCQC2 that laboratories may find useful in a quality assurance (QA) program in their own setting. The term good laboratory practice statement was carefully chosen in place of alternatives with potentially more overreaching implications such as consensus guidelines or preferred practices, and in place of more tepid alternatives such as consensus opinion, after much deliberation of the strengths and limitations of this study. The strengths of this study are many, and primarily consist of the large number of participating laboratories (more than 540), the input of expert cytopathologists and cytotechnologists, an enhanced Web-based portion, and a consensus conference to elicit further comments and opinion. The limitation of this study is that many of the findings and literature reviewed for the study are not supported by correlation with clinical outcomes. Recognition of this limitation led to the term good laboratory practice statement to acknowledge the strengths of the study while not overreaching into proscriptive regulations.

METHODS

The College of American Pathologists, with support by a cooperative agreement from the Centers for Disease Control and Prevention, conducted a national written survey of QA practices in gynecologic cytology. This survey consisted of 99 questions across 9 broad categories in gynecologic cytology QA practices. This written survey was distributed to all 1245 CLIA '88-certified laboratories in the United States. Of these, 596 laboratories responded, but only 541 survey responses were usable because of incomplete responses. For details of the complete process of this study, including the development of the survey, enhanced Web-based input, and culmination in a consensus conference, see the introductory article. (2) In short, expert cytopathologists and cytotechnologists were recruited to become part of 5 working groups that studied the survey data on different aspects of quality assurance. These working groups added follow-up questions to the survey that were made available online to elicit additional opinions. Evaluating the data and follow-up questions together with a review of the literature, the working groups developed a series of preliminary statements on good laboratory practices in cytology quality assurance, and presented these at the College of American Pathologists GCQC2 in Rosemont, Illinois, on June 4, 2011. Participants in the conference included working group members, representatives from national cytopathology and cytotechnology organizations, the Centers for Disease Control and Prevention, and individuals who accepted invitations after completing the written survey. Representatives from the working groups presented their draft statements to the audience participants, who voted electronically on the statements. The draft statements were based on trends identified from the written survey, additional material from the Web-based portion, literature when available, and expert opinion. The potential weakness of this approach is the lack of correlation with clinical outcome. The stratified consensus responses were categorized as agreement, 70%-79%; moderately strong agreement, 80%-89%; strong agreement, 90%-98%; and nearly complete agreement, 99%-100%.
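
For clarity, the stratification above amounts to a simple mapping from the percentage of participants voting in agreement to a consensus category. The following Python sketch is illustrative only and is not part of the study's methodology; the label used for results below 70% is an assumption, as no category below that threshold is defined in this report.

def consensus_category(percent_agreement):
    # Map a vote percentage to the strata defined above (illustrative only).
    if percent_agreement >= 99:
        return "nearly complete agreement"    # 99%-100%
    if percent_agreement >= 90:
        return "strong agreement"             # 90%-98%
    if percent_agreement >= 80:
        return "moderately strong agreement"  # 80%-89%
    if percent_agreement >= 70:
        return "agreement"                    # 70%-79%
    return "no consensus"                     # assumed label; not defined in this report

if __name__ == "__main__":
    for pct in (95.9, 86.4, 73.0, 65.6):
        print(pct, "->", consensus_category(pct))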

RESULTS

Survey Results

Laboratories were asked to rank the most useful quality metrics in a gynecologic cytopathology quality assurance program (Table 1). Many metrics were cited as useful, with little difference among the top-ranking metrics. The least useful metrics were turnaround time, review of Pap tests diagnosed as negative for intraepithelial lesion or malignancy (NILM) when the high-risk human papillomavirus test is positive, number of continuing education hours for cytotechnologists, mandatory annual gynecologic proficiency test results, and number of cytology-based continuing medical education hours for pathologists. Most laboratories, 60.4%, conduct an in-house review of gynecologic slides, and 88.2% of these do so to share interesting cases (Table 2). Other reasons for in-house review include reviewing material from interlaboratory comparison or educational programs from outside the laboratory (53.5%), comparing diagnostic criteria within the laboratory in an attempt to increase the interobserver reproducibility of diagnostic categories (46.2%), and reviewing selected cases as a result of quality monitoring (45.2%).

The majority of laboratories (88.8%) generate a QA report for cytology, most frequently (62.3%) on a monthly basis (Table 3). The most common elements in this report include diagnostic category rates, detection of epithelial cell abnormality on rescreen of Pap tests initially identified as NILM, Pap test cervical biopsy correlation rates, and turnaround time. When comparing metrics within a laboratory for individual cytotechnologists, most laboratories (81.1%) reported that cytotechnologists have an opportunity to compare their individual quality metrics against the laboratory rate(s) as a whole, but they less frequently have an opportunity to do so against other cytotechnologists and pathologists, 59.0% and 41.0% respectively (Table 4). A small majority (59.6%) of laboratories responded that individual pathologists have an opportunity to compare their quality metrics against the laboratory as a whole, but pathologists less frequently have an opportunity to do so against cytotechnologists and other pathologists, 40.1% and 48.1% respectively. Most laboratories, 70.0% and 73.4% respectively, responded that they do not code results to maintain the confidentiality of either cytotechnologists or pathologists when comparing results (Table 5).
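
As an illustration of how the most commonly reported elements of such a QA report, diagnostic category rates and turnaround time, can be tallied from routine case records, consider the following Python sketch. It is not drawn from any surveyed laboratory's system; the record fields and case data are hypothetical.

from collections import Counter
from datetime import date
from statistics import mean

cases = [  # hypothetical monthly case list
    {"interpretation": "NILM",   "accessioned": date(2012, 3, 1), "reported": date(2012, 3, 2)},
    {"interpretation": "ASC-US", "accessioned": date(2012, 3, 1), "reported": date(2012, 3, 4)},
    {"interpretation": "LSIL",   "accessioned": date(2012, 3, 2), "reported": date(2012, 3, 5)},
    {"interpretation": "NILM",   "accessioned": date(2012, 3, 2), "reported": date(2012, 3, 3)},
]

counts = Counter(c["interpretation"] for c in cases)
total = sum(counts.values())
rates = {dx: 100.0 * n / total for dx, n in counts.items()}          # diagnostic category rates, %
tat = mean((c["reported"] - c["accessioned"]).days for c in cases)   # mean turnaround time, days

print("Diagnostic category rates (%):", rates)
print("Mean turnaround time (days):", tat)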

When laboratories identified variances in performance for quality metric statistics generated for the entire laboratory, they most commonly (84.6%) attempted to identify the root cause, address the cause, and continue to monitor (Table 6). Other less common corrective actions to address performance included conducting in-laboratory reeducation (41.6%), increasing real-time rescreen rate of NILM cases prior to sign-out (32.8%), and decreasing slide workload (31.5%). In the 2 years prior to filling out the survey, only 26.6% of laboratories applied corrective actions to address a laboratory-wide variance in performance (Table 7). Diagnostic rates and the rate of epithelial abnormalities detected on rescreen of NILM Pap tests prior to sign-out were the 2 most frequently cited metrics that led to corrective action, 47.4% and 36.1% respectively.

In the monitoring of variance in individual quality metrics, the methods used to identify variance in performance that requires intervention were similar for both cytotechnologists and pathologists (Table 8). The most common methods for cytotechnologists were simply identifying outliers (54.3%), user-defined action limits (47.1%), rate change in diagnostic categories or cervical biopsy Pap test correlation (37.4%), and comparing performance with standard deviation from laboratory's historical mean (21.6%). For pathologists, the most common methods for identifying variance in performance were identifying outliers (36.8%), user-defined action limits (26.8%), rate change in diagnostic categories or cervical biopsy Pap test correlation (23.0%), and user-defined action limits based on literature to decide when to investigate variance (11.4%). The use of standard deviation limits to identify variance in performance beyond the mean was uncommon, cited by only 12.3% for cytotechnologists and 5.3% for pathologists. When used, the most common standard deviation limit was 2 SDs (Table 9).

When a variance in performance was identified, corrective action was more likely applied to cytotechnologists than to pathologists (Table 10). For cytotechnologists, the most common corrective actions were to identify cause of variance and conduct a focused review and/or education (68.3%), increase the rate of rescreen of NILM cases prior to sign-out (52.8%), provide counseling and continued monitoring (45.7%), and decrease workload requirements (44.9%). For pathologists, the most common corrective actions were to identify cause of variance and conduct a focused review and/or education (29.1%), provide counseling and continued monitoring (15.7%), conduct in-house tutorial on diagnostic criteria (13.4%), and conduct an audit of previous cases from that individual (12.2%). In the 2 years prior to the survey, 18.3% of responders indicated that a corrective action was applied to a cytotechnologist and 2.5% indicated that a corrective action was applied to a pathologist.

When conducting an audit of cases as a corrective action, a sample of cases, instead of all cases, was most frequently used. For cytotechnologists, a small majority of laboratories (54.2%) chose a sample of cases, and did so most frequently for a month (Table 11). For pathologists, 67.9% of laboratories chose a sample of cases to monitor, and also did so most frequently for a month (Table 12).

Laboratories were asked about barriers to tracking quality metrics in gynecologic cytopathology. The most common response (52.3%) was that laboratories lacked sufficient personnel to track data (Table 13). That many of the required monitors were not practical and that there were too many monitors were also frequently cited, 49.2% and 46.6% respectively. Interestingly, 40.1% of the 474 laboratories that responded to this question cited that they have too few personnel for meaningful intralaboratory comparisons.

Follow-up Questions Posted on Internet Site

Nine additional questions, most of them open ended, were posted on a Web site to supplement the written survey questions, which often prescribed the available answer choices. The number of responses, ranging from 10 to 31 per question, was low when compared with the number of responders to the written survey. Six questions had responses that allowed for an easy summary, and those are presented here.

When asked what method is most effective in monitoring performance, most (94.0%) found it helpful to monitor both individual and laboratory performance, a few (6.0%) found it helpful to monitor only overall laboratory performance, and none found it helpful to solely monitor individual performance. When asked why, the common themes in support of monitoring both individuals and the laboratory were, "Monitoring individuals really allows you to target trends. Monitoring the lab as a whole provides a nice average or baseline" and "Monitoring both gives you a reference point when comparing an individual to overall lab performance."

Individuals were asked how they monitor performance in a laboratory when there is only one cytotechnologist. Of the 11 responders, 3 didn't know, and 2 who had experience in this situation stated, "Retrospective review by pathologists; proficiency testing; cyto-histo correlation," and "I've been that only one CT [cytotechnologist]--it's difficult. Rescreening is still the best tool we could use."

Individuals were asked to comment on their experience with identifying the root cause for individual performance problems and what actions help to identify the root cause. Of the 12 responders, 2 responses were particularly insightful: (1) "Look at total numbers. Look at what the individual was doing that day. Check slide and prep quality. See if the error has cropped up before for that individual." (2) "Review work, discuss problem with individual, review diagnostic criteria with individual, monitor performance for improvement."

Individuals were also asked in follow-up how they tailor remedial action given the root cause and what steps they take. Of the 10 responses to this question, the 2 that best summarize the responses were (1) "Usually review of specific cases and 'mimics,' if it is a diagnostic issue. If not--we try to improve the process to prevent errors, like having CT [cytotechnologist] sign off on stain quality daily, etc" and (2) "... Generally, education at the multiheaded scope with supervisor and/or pathologist. Increased QC [quality control] on primary screening, reduction in daily screening numbers."

Finally, individuals were asked what they thought were the most essential elements of a QA plan. Of the 14 responders, one response is most succinct: "(1) Having monitors that allow rapid redirection if needed. (2) Monitors that reflect the unique components of the laboratory (one size does not fit all). (3) Using education as the primary effort to address perceived quality issues."

Good Laboratory Practices: Consensus Statements

Table 14 lists good laboratory practice statements for general quality practices in gynecologic cytopathology. These statements were generated from the written survey data, the Internet discussion site, and the opinion of the authors, and were voted on at the consensus conference. Statements 6 and 7 were first proposed at the GCQC2 and, unlike many of the statements presented, were not referenced in the original written survey or on the Internet discussion site.

COMMENT

CLIA '88 (1) mandates many metrics to monitor quality in gynecologic cytopathology. When laboratories construct their quality assurance plan for gynecologic cytopathology, these mandated metrics must be included and actively monitored. Other metrics should be monitored as deemed useful by the laboratory director. The articles stemming from the GCQC2 suggest many practical quality metrics that may be beneficial for a laboratory to implement. For the purpose of ensuring general quality, our written survey demonstrates that the most frequently cited metrics and methods include cytologic-histologic correlation, retrospective review of NILM Pap tests preceding a diagnosis of high-grade squamous intraepithelial lesion (HSIL), immediate rescreening results, monitoring diagnostic rates, multiheaded review of difficult cases, review of NILM Pap tests preceding a diagnosis of cervical intraepithelial neoplasia 2 or 3 on cervical biopsy, agreement between the cytotechnologist and pathologist interpretation of a Pap test prior to sign-out, and percentage positivity for high-risk human papillomavirus of Pap tests diagnosed as atypical squamous cells of undetermined significance (ASC-US). Many of these have been previously advocated. (3-12)

Monitoring laboratory metrics both for the laboratory as a whole and for individuals within the laboratory has been shown to be most helpful. (3) Laboratory-wide monitoring is applicable to many of the metrics listed above. Comparing these against national benchmarks may provide a baseline to identify and stratify laboratory performance, though laboratories need to consider whether their patient population and staffing are sufficiently similar to those of the group from which the benchmarks are derived and whether the benchmarks correlate with meaningful clinical outcomes; otherwise, meaningful comparisons cannot be made. For example, laboratories with extremely low volumes of Pap tests (500 or fewer annually) may not find benchmarks for diagnostic rates useful because, given the low prevalence of abnormalities in the general US population, they will infrequently encounter an abnormal Pap test. Similarly, a laboratory with a highly skewed population, such as one serving a university health service, may be expected to have higher rates of Pap tests diagnosed as low-grade squamous intraepithelial lesion (LSIL) than laboratories serving a general population. Some benchmarks may also be misleading. For example, a low ratio of atypical squamous cells of undetermined significance to squamous intraepithelial lesion (ASC-US:SIL) is often thought of as reflecting precision in one's interpretation of Pap tests, but it may also indicate decreased sensitivity in detecting squamous intraepithelial lesion.
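
The ASC-US:SIL ratio mentioned above is simple arithmetic: the number of ASC-US interpretations divided by the number of SIL (LSIL plus HSIL) interpretations over the same period. The counts in the following sketch are invented for illustration and do not come from the survey data.

asc_us = 120          # hypothetical annual ASC-US count
lsil, hsil = 90, 10   # hypothetical annual SIL counts
ratio = asc_us / (lsil + hsil)
print(f"ASC-US:SIL ratio = {ratio:.1f}")  # 1.2 for these invented counts
# A "low" ratio can reflect precise use of ASC-US, but, as noted above, it can
# also reflect reduced sensitivity for SIL; the ratio alone does not distinguish the two.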

Comparing individual data with laboratory-wide data, either real-time or historical, may help identify outliers within laboratories that have more than one cytotechnologist. The participants at the GCQC2 felt strongly that individual statistics should be shared with individual cytotechnologists and pathologists. A majority (65.6%) of the conference participants felt that metrics should be shared with an individual at least twice a year. This is a convenient interval, as it would allow the data to be available for the semiannual review of cytotechnologist workload requirements mandated by CLIA '88. (1) In laboratories with one cytotechnologist, or with too few cytotechnologists for meaningful intralaboratory comparisons of individuals, monitoring individual performance is problematic. This was not addressed at the GCQC2, but some options include monitoring abnormalities detected by retrospective review of Pap tests diagnosed as within normal limits, monitoring the rate of concurrence between cytotechnologist and pathologist interpretations of Pap tests prior to sign-out by the pathologist, and comparison with published benchmarks, with the caveats cited above.

Sharing data could be done either by publishing the data laboratory wide or by sharing the data privately with the individual. The written survey indicated that most laboratories did not code individual quality data to maintain confidentiality. The conference participants considered both coding and not coding the data acceptable; the choice should be left to the discretion of the laboratory director. Whether or not the data are coded to maintain confidentiality, they should be clearly labeled as quality assurance data when shared.

It is well known that merely monitoring data tends to improve performance. This is ascribed to the so-called Hawthorne effect, whereby subjects under observation tend to perform better than unobserved subjects. In cytopathology laboratories with 3 or more cytotechnologists screening Pap tests, comparing an individual with the entire laboratory may help to identify areas where remedial action is warranted. For example, if a cytotechnologist's LSIL rate is substantially higher than that of others in the laboratory, this would prompt investigation and possibly corrective action. The Pap tests diagnosed as LSIL by that individual could be reviewed, or the individual's rate of correlation with biopsy could be compared against the laboratory's correlation rate, to investigate whether the individual is overcalling LSIL. If the individual's LSIL rate is substantially lower than that of others in the laboratory, then that individual's ASC-US rate and the rate at which the pathologist upgrades Pap tests from ASC-US to LSIL at sign-out can be monitored to investigate whether the cytotechnologist is undercalling LSIL. If a problem does exist, the individual could undergo remedial action such as an attempt to analyze the root cause or in-house reeducation, the 2 methods most frequently cited in the written survey to address variance. The cytotechnologist's performance could also be monitored more frequently.
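
The investigative logic described above can be summarized in a short sketch. The tolerance and the rates used here are invented examples, not values recommended by the working group or by any regulation; a laboratory would set its own action limits.

def lsil_rate_review(ct_lsil_rate, lab_lsil_rate, tolerance=0.5):
    # Suggest a follow-up review when a cytotechnologist's LSIL rate differs
    # from the laboratory rate by more than `tolerance` percentage points
    # (an arbitrary example value).
    if ct_lsil_rate > lab_lsil_rate + tolerance:
        return ("Review this individual's LSIL cases and compare their biopsy "
                "correlation rate with the laboratory's (possible overcalling).")
    if ct_lsil_rate < lab_lsil_rate - tolerance:
        return ("Review this individual's ASC-US rate and the rate of pathologist "
                "upgrades of ASC-US to LSIL at sign-out (possible undercalling).")
    return "Within the example tolerance; continue routine monitoring."

print(lsil_rate_review(ct_lsil_rate=4.1, lab_lsil_rate=2.8))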

The most frequent method to identify individuals as outliers tends to be visual inspection of the data, in a manner reminiscent of Supreme Court Justice Potter Stewart's definition of pornography: "I know it when I see it." One knows an outlier when one sees it as well. From the written survey, laboratories most frequently identified variance in performance by identifying an outlier or by arbitrary action limits. These approaches, although suitable for smaller laboratories with few cytotechnologists and low Pap test volume, are not as precise as control limits based on multiples of the standard deviation away from the mean. The latter method is particularly well suited for larger laboratories with many cytotechnologists, and allows for a more rigorous approach to analysis.
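
A more rigorous alternative to visual inspection is the standard deviation-based control limit described above. The sketch below flags an individual whose rate falls more than 2 SDs from the laboratory's historical mean (2 SDs was the limit most often reported in the survey; see Table 9). All rates shown are invented.

from statistics import mean, stdev

lab_monthly_ascus_rates = [4.8, 5.2, 5.0, 4.6, 5.4, 5.1, 4.9, 5.3]  # historical laboratory ASC-US rates, %
mu, sd = mean(lab_monthly_ascus_rates), stdev(lab_monthly_ascus_rates)
lower, upper = mu - 2 * sd, mu + 2 * sd   # 2-SD control limits

ct_rate = 6.2  # hypothetical individual's current ASC-US rate, %
flagged = not (lower <= ct_rate <= upper)
print(f"Control limits: {lower:.2f}%-{upper:.2f}%; individual rate {ct_rate}% "
      f"{'is an outlier' if flagged else 'is within limits'}")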

Detecting outliers may be difficult in laboratories with a low volume of Pap tests, as metrics may vary widely by chance alone. These laboratories may be better served by comparing the performance of an individual against rolling historical averages of the laboratory for the metric in question. These laboratories may even have difficulty producing adequate numbers of abnormal Pap tests to ensure that their practitioners remain proficient at detecting and classifying abnormalities. Despite this supposition, there is no established numerical threshold below which screening Pap tests is deemed risky. However, we do know from proficiency testing results that pathologists in small laboratories who practice as solo screeners have the highest failure rate, notwithstanding differences in grading schemes between cytotechnologists and pathologists. Nonetheless, mandating a required annual volume in order to practice cytopathology is generally considered unreasonable, as many solo cytopathologists have passed proficiency tests and demonstrate a high level of quality in their cytopathology practices. Their situation is not unlike that of a solo primary care doctor who may need to refer patients with uncommon problems more frequently to a specialist for a second opinion. Similarly, laboratories with a low volume of Pap tests, with 1 to 2 cytotechnologists and a single cytopathologist, may need to submit problematic Pap tests for outside consultation when a consensus diagnosis is not obtained within the laboratory. These laboratories will also find it helpful to review and share interesting cases within the laboratory and to participate in external unknown Pap test challenges (survey programs) to ensure continuous exposure to lesions that may not be seen frequently in their daily work. In fact, multiheaded review of difficult cases was cited as one of the most useful methods in QA in the written survey, and was most frequently done to share interesting cases.
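
For the rolling-average comparison suggested above for low-volume laboratories, a minimal sketch follows. The window size and quarterly rates are invented; in such laboratories a single aberrant value may reflect chance alone, so the comparison is a prompt for review rather than proof of a problem.

from statistics import mean

quarterly_lsil_rates = [2.1, 2.8, 1.6, 2.4, 2.2, 3.9]    # %, oldest to newest (invented)
window = 4                                                # arbitrary example window
baseline = mean(quarterly_lsil_rates[-(window + 1):-1])   # rolling mean of the 4 prior quarters
current = quarterly_lsil_rates[-1]
print(f"Rolling baseline {baseline:.2f}% vs current quarter {current:.1f}%")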

A similar problem may present itself in laboratories with high volumes of Pap tests but low volumes of conventional Pap smears. With the wide acceptance of thin-layer methodology and human papillomavirus testing, conventional Pap smears are uncommon. Recently trained cytotechnologists and pathologists may have little experience with conventional Pap smears. At the GCQC2, it was felt that methodologies encountered with relatively low frequency, such as conventional Pap smears, may need a higher level of scrutiny, particularly given the differences between conventional and monolayer preparations. More than 91% of attendees at the consensus conference agreed that low-volume methodologies should receive some type of enhanced monitoring, with discretion left to the laboratory director to decide the specific details of that monitoring. Some options include automatic rescreening or screening by designated experts within the laboratory.

The length of training and the number of slides screened by an individual in part determine the individual's proficiency in gynecologic cytopathology. Given the screening limits set forth in CLIA '88, (1) accumulating sufficient screening experience may be difficult for some to achieve. Therefore, newly hired cytotechnologists just out of training will need monitoring and mentoring in their new position. The laboratory director should ensure that some type of extra monitoring takes place for newly hired professional staff; at the GCQC2, more than 73% of attendees agreed with this statement. The monitoring need not be excessive, and could consist of retrospective or prospective review of a predetermined number of Pap tests screened and signed out by the cytotechnologist. In addition, monitoring the concordance of interpretations of Pap tests submitted by the cytotechnologist to the pathologist for sign-out could also be helpful.

It is essential to remember that cytotechnologists and pathologists work as a team to detect abnormalities on Pap tests to prevent cervical cancer. This team-based process has been proven to be valuable and highly effective in the prevention of cervical cancer. A laboratory's quality program is an integral part of this process, and needs to be conducted in an open and constructive manner by all team members for maximal effectiveness.

This report was supported in part by a contract (GS-10F-0261K) funded by the Centers for Disease Control and Prevention/Agency for Toxic Substances and Disease Registry. The authors thank Barbara Blond, MBA, MT(ASCP), College of American Pathologists, and Maribeth Gagnon, MS, CT(ASCP), HTL, Laboratory Science, Policy and Practice Program Office, Office of Surveillance, Epidemiology, and Laboratory Services, Centers for Disease Control and Prevention, for their input and support to the workgroup during this project.

References

(1.) Clinical Laboratory Improvement Amendments of 1988 Final Rule. Fed Regist. 1992;57:7001-7186. http://www.phppo.cdc.gov/clia/pdf/42cfr493_2004.pdf. Accessed November 1, 2011.

(2.) Tworek JA, Henry M, Blond B, Jones BA. College of American Pathologist Consensus Conference on Good Laboratory Practices in Gynecologic Cytology: background, rationale, and organization. Arch Pathol Lab Med. 2013;137(2): 158-163.

(3.) Jones BA, Davey DD. Quality management in gynecologic cytology using interlaboratory comparison. Arch Pathol Lab Med. 2000;124(5):672-681.

(4.) Anderson GH, Flynn KJ, Hickey LA, Le Riche JC, Matisic JP, Suen KC. A comprehensive internal quality control system for a large cytopathology laboratory. Acta Cytol. 1987;31(6):895-910.

(5.) Krieger PA, Naryshkin S. Random rescreening of cytologic smears: a practical and effective component of quality assurance programs in both large and small cytology laboratories. Acta Cytol. 1994;38(3):291-298.

(6.) Rohr LR. Quality assurance in gynecologic cytology. Am J Clin Pathol. 1990;94(6):745-758.

(7.) Mody DR, Davey DD, Branca M, et al. Quality assurance and risk reduction guidelines. Acta Cytol. 2000;44(4):496-507.

(8.) Krieger PA, McGoogan E, Vooijs GP, et al. Quality assurance/control issues: IAC task force summary. Acta Cytol. 1998;42(1):133-140.

(9.) Vooijs GP. Opinion poll on quality assurance and quality control: conducted by the committee on continuing education and quality assurance of the International Academy of Cytology. Acta Cytol. 1996;40(1):14-24.

(10.) Austin RM, Ramzy I. Increased detection of epithelial cell abnormality by liquid-based gynecologic cytology preparations: a review of accumulated data. Acta Cytol. 1998;42(1):178-184.

(11.) Davey DD, Nielsen ML, Frable WJ, Rosenstock W, Lowell DM, Kraemer BB. Improving accuracy in gynecologic cytology: results of the College of American Pathologists interlaboratory comparison program in cervicovaginal cytology. Arch Pathol Lab Med. 1993;117(2):1193-1198.

(12.) Davey DD. Quality and liability issues with Papanicolaou smear: the problem of definition or errors of false-negative smears. Arch Pathol Lab Med. 1997;121(3):267-269.

Joseph Tworek, MD; Ritu Nayar, MD; Lynnette Savaloja, CT(ASCP); Sana Tabbara, MD; Nicole Thomas, CT(ASCP); Barbara Winkler, MD; Lydia Pleotis Howell, MD

Accepted for publication May 25, 2012.

From the Department of Pathology, Saint Joseph Mercy Hospital, Ann Arbor, Michigan (Dr Tworek); the Department of Cytopathology, McGaw Medical Center, Feinberg School of Medicine, Northwestern University, Chicago, Illinois (Dr Nayar); the Department of Cytology, Regions Hospital, St. Paul, Minnesota (Ms Savaloja); the Department of Pathology, George Washington University School of Medicine, Washington, DC (Dr Tabbara); the College of American Pathologists, Northfield, Illinois (Ms Thomas); the Department of Pathology, Mount Kisco Medical Group, Mount Kisco, New York (Dr Winkler); and the Department of Pathology & Laboratory Medicine, University of California, Davis Health System, Sacramento, California (Dr Howell).

The authors have no relevant financial interest in the products or companies described in this article.

The findings and conclusions in this report are those of the authors and do not necessarily represent the views of the Centers for Disease Control and Prevention/Agency for Toxic Substances and Disease Registry and are not intended to take the place of applicable laws or regulations.

Reprints: Joseph Tworek, MD, Department of Pathology, St Joseph Mercy Hospital, 5301 E Huron River Dr, PO Box 995, Ann Arbor, MI 48106 (e-mail: Joetworek@yahoo.com).
Table 1. Quality Metrics Useful in a Quality Assurance Program

Metric                                  No.   Average   5 (Very      4, %   3, %   2, %   1 (Not
                                               Rank     Useful), %                        Useful), %

Cytologic-histologic correlation        508     4.2       46.3       31.9   19.1    2.0     0.8
Multiheaded review of difficult cases   432     4.2       49.8       28.7   15.3    5.1     1.2
Retrospective review of NILM in         514     4.2       50.6       26.5   15.0    5.6     2.3
  current HSIL cases
Immediate rescreening results           452     4.1       49.3       24.3   18.1    4.4     3.8
Review of NILM Pap tests when a         465     4.1       45.2       28.8   17.6    5.6     2.8
  cervical biopsy is CIN 2/3
Monitoring overall diagnostic           496     4.0       36.9       34.9   20.4    5.6     2.2
  categories
Agreement between cytotechnologist      477     4.0       33.8       37.3   23.1    3.6     2.3
  and pathologist
Comparing diagnostic category rates     469     3.8       28.1       34.8   26.7    5.5     4.9
  by individual
HR-HPV positivity rate in ASC-US        420     3.7       28.1       31.9   25.2    8.3     6.4
Turnaround time                         496     3.3       19.8       25.8   31.0   12.9    10.5
Review of NILM Pap tests when           351     3.2       17.9       25.4   29.3   14.8    12.5
  HR-HPV test is positive
No. of CE hours for                     454     3.2       14.1       23.6   37.7   16.5     8.1
  cytotechnologists
Proficiency test results                498     3.2       24.1       19.7   24.1   15.3    16.9
No. of cytology-based CME hours         437     3.2       14.2       22.7   37.5   15.8     9.8
  for pathologists
Other                                    16     4.4       56.3       37.5    6.3    0.0     0.0

Abbreviations: ASC-US, atypical squamous cells of undetermined
significance; CE, continuing education; CIN, cervical
intraepithelial neoplasia; CME, continuing medical education;
HR-HPV, high-risk human papillomavirus; HSIL, high-grade
squamous intraepithelial lesion; NILM, negative for
intraepithelial lesion or malignancy; Pap, Papanicolaou.

Table 2. In-House Review

                                             Frequency    %
Laboratory conducts an in-house review of
  gynecologic cytology cases?

  Yes                                             313    60.4
  No                                              205    39.6

What is the purpose of the in-house case
  review? (N = 314) (a)

  Share interesting cases                         277    88.2
  Review of interlaboratory comparison            168    53.5
    program or educational programs from
    outside the laboratory
  Compare diagnostic criteria within the          145    46.2
    laboratory in an attempt to increase
    precision of diagnostic categories
  Review cases selected as a result of            142    45.2
    quality monitoring
  Review of laboratory-generated study             47    15.0
    material
  Review cases in light of published               25     8.0
    diagnostic criteria
  Other                                             5     1.6

(a) Multiple responses allowed.

Table 3. Quality Control Report

                                                  Frequency     %
Laboratory generates a quality control report
  for cytology?

  Yes                                                  459    88.8
  No                                                    58    11.2

How frequently is the report generated?

  Daily                                                 11     2.4
  Weekly                                                 5     1.1
  Monthly                                              291    62.3
  Quarterly                                             79    16.9
  Semiannually                                          45     9.6
  Annually                                              35     7.5
  Other                                                  1     0.2

Which elements are included in the report?
  (N = 457) (a)

  Diagnostic category rates                            389    85.1
  Detection of epithelial cell abnormality on          362    79.2
    rescreen of Pap tests initially identified
    as NILM
  Pap test cervical biopsy correlation rates           291    63.7
  Turnaround time                                      283    61.9
  Rate of retrospective rescreen results of            263    57.5
    negative Pap tests in current HSIL
  Pathologist-cytotechnologist discordance             258    56.5
    rate
  Pathologist upgrade of CT result                     215    47.0
  Amended reports                                      202    44.2
  Pathologist downgrade of CT result                   194    42.5
  Staining quality                                     184    40.3
  Proportion of HR-HPV positive in ASC-US              182    39.8
    cases
  Proficiency test results                             180    39.4
  Rate of retrospective rescreen results of            135    29.5
    NILM Pap test when current cervical
    biopsy demonstrates CIN 2/3
  Corrective actions based on quality                  116    25.4
    metrics
  Outside consultation discrepancies                    88    19.3
  Registration errors                                   71    15.5
  No. of CE hours for cytotechnologists                 68    14.9
  Retrospective review of NILM cases with a             46    10.1
    current HR-HPV test
  Other                                                 30     6.6
  No. of cytology-based CME hours for                   26     5.7
    pathologists

Abbreviations: ASC-US, atypical squamous cells of undetermined
significance; CE, continuing education; CIN, cervical intraepithelial
neoplasia; CME, continuing medical education; CT, cytotechnologist;
HR-HPV, high-risk human papillomavirus; HSIL, high-grade squamous
intraepithelial lesion; NILM, negative for intraepithelial lesion or
malignancy; Pap, Papanicolaou.

(a) Multiple responses allowed.

Table 4. Quality Metric Comparisons

                                               Cytotechnologists     Pathologists     Laboratory Rates (a)

                                               Frequency     %      Frequency    %     Frequency     %

Individual cytotechnologists have an
  opportunity to compare their quality
  metrics against other...

  Yes                                             275       59.0       169      41.0      300       81.1
  No                                              191       41.0       243      59.0       70       18.9

Individual pathologists have an opportunity
  to compare their quality metrics against
  other...

  Yes                                             169       40.1       215      48.1      226       59.6
  No                                              252       59.9       232      51.9      153       40.4

(a) Laboratory rates without specifying individual rates, either
confidentially or not.

Table 5. Confidentiality of Results

                                               Frequency    %

Cytotechnologist's results are coded to
  maintain confidentiality

  Yes                                             143      30.0
  No                                              333      70.0

Pathologist's results are coded to maintain
  confidentiality

  Yes                                             124      26.6
  No                                              343      73.4

Table 6. General Quality Practices

                                            Frequency    %

When variation in performance is
  detected for statistics generated for
  the entire cytology laboratory, which
  actions may occur to improve
  performance? (N = 473) (a)

  Attempt to find the root cause, address      400      84.6
    the cause, and continue to monitor
  Conduct in-laboratory reeducation            197      41.6
  Increase real-time rescreen rate of          155      32.8
    NILM cases prior to reporting
    (increase negative QC)
  Decrease slide workload                      149      31.5
  Retrospective rescreening of a defined       100      21.1
    number of previous NILM cases
  Require outside reeducation (CME/CE           56      11.8
    courses)
  Other                                         25       5.3

Abbreviations: CME/CE, continuing medical education/continuing
education; NILM, negative for intraepithelial lesion or malignancy;
QC, quality control.

(a) Multiple responses allowed.

Table 7. General Quality Practices

                                                Frequency   %

In the past 2 years, has the laboratory has
  applied corrective actions as a result of
  monitoring quality metrics?

  Yes                                              136     26.6
  No                                               375     73.4

Active review of which quality metrics
  resulted in the corrective action for the
  cytology laboratory? (N = 133) (a)

  Diagnostic rates                                  63     47.4
  Rate of epithelial cell abnormality               48     36.1
    detected on rescreen of Pap tests
    initially identified as NILM
  Pathologist-cytotechnologist discordance          38     28.6
    rate of Pap test result
  Turnaround time                                   36     27.1
  Rate of retrospective rescreen results of         30     22.6
    NILM Pap tests in a current HSIL Pap
    test
  Rate of retrospective rescreen results of         21     15.8
    NILM Pap test when current cervical
    biopsy demonstrates CIN 2/3
  Pap test cervical biopsy correlation rates        20     15.0
  HR-HPV positive rates in ASC-US cases             20     15.0
  Proficiency test results                          20     15.0
  Other                                             14     10.5
  No. of gynecologic CE hours for                    6      4.5
    cytotechnologists
  No. of gynecologic cytology-based CME              5      3.8
    hours for pathologists

Abbreviations: ASC-US, atypical squamous cells of undetermined
significance; CE, continuing education; CIN, cervical intraepithelial
neoplasia; CME, continuing medical education; HR-HPV, high-risk
human papillomavirus; HSIL, high-grade squamous intraepithelial
lesion; NILM, negative for intraepithelial lesion or malignancy; Pap,
Papanicolaou.

(a) Multiple responses allowed.

Table 8. How Laboratory Monitors Individual Performance and Identifies
Variance in Quality That Prompts Further Evaluation or Action
(N = 473) (a)

Action                          Cytotechnologist     Pathologist

                                 Frequency    %     Frequency   %

Identify outliers                   257      54.3     174      36.8
User-defined action limits to       223      47.1     127      26.8
  decide when to investigate
  variance (arbitrary
  judgment)
Rate change in diagnostic           177      37.4     109      23.0
  categories or cervical
  biopsy Pap test correlation
Compare performance with SD         102      21.6      44       9.3
  from laboratory's
  historical mean
User-defined action limits           83      17.5      54      11.4
  based on literature to
  decide when to investigate
  variance
SD limits                            58      12.3      25       5.3
Other                                37       7.8      22       4.7
Compare performance with a           24       5.1       5       1.1
  percentage deviation from
  laboratory's historical
  mean (b)

Abbreviation: Pap, Papanicolaou.

(a) Multiple responses allowed.

(b) Percentage deviation from laboratory's historical mean ranged
from 2% to 70% with a median of 5%.

Table 9. Standard Deviation Limits

                                   Frequency    %

Cytotechnologists

  1 SD beyond mean performance         9       16.7
  2 SDs beyond mean performance       43       79.6
  3 SDs beyond mean performance        2        3.7

Pathologists

  1 SD beyond mean performance         7       29.2
  2 SDs beyond mean performance       17       70.8

Table 10. Performance Variance Actions

                                 Cytotechnologist     Pathologist

                                  Frequency    %     Frequency    %

Actions taken when a variance
  in performance is identified
  (N = 477) (a)
Identify cause of variance in        326      68.3      139      29.1
  performance and conduct
  focused review and/or
  education for the individual
Increase the rate of negative        252      52.8       24       5.0
  rescreens prior to reporting
  for the individual
Counseling and continued             218      45.7       75      15.7
  monitoring (warning)
Decrease workload requirements       214      44.9       34       7.1
  for individual
Conduct in-house tutorial on         178      37.3       64      13.4
  diagnostic criteria
Conduct an audit of previous         147      30.8       58      12.2
  cases from that individual
Dismissal                             75      15.7       19       4.0
Require additional outside            72      15.1       38       8.0
  continuing education
Other                                 27       5.7       24       5.0

In the past 2 years, have any
  of the actions in the
  previous question been
  applied to an individual as
  a result of monitoring
  quality metrics?

  Yes                                 93      18.3       12       2.5
  No                                 283      55.6      280      57.7
  Not applicable                     133      26.1      193      39.8

(a) Multiple responses allowed.

Table 11. Audit Practices for Cytotechnologists (a)

                                              Frequency    %

If an audit of previous cases is performed,
  how many cases are audited during a
  defined time period?

  All cases                                      70       45.8
  A sample of cases                              83       54.2

Audit time period

  1 d                                             7        7.4
  2 d                                             3        3.2
  5-7 d                                           7        7.4
  10-14 d                                         5        5.3
  21 d                                            1        1.1
  1 mo                                           29       30.5
  2 mo                                            4        4.2
  3 mo                                           17       17.9
  5 mo                                            1        1.1
  6 mo                                           13       13.7
  1 y                                             6        6.3
  3-5 y                                           2        2.1

(a) The percentage of cases audited ranged from 5% to 100% with
a median of 20%.

Table 12. Audit Practices for Pathologists (a)

                                               Frequency    %

If an audit of previous cases is performed,
  how many cases are audited during a
  defined time period?

  All cases                                       17       32.1
  A sample of cases                               36       67.9

Audit time period

  21 d                                             1        4.5
  1 mo                                            12       54.5
  2-3 mo                                           3       13.6
  6 mo                                             3       13.6
  1 y                                              2        9.1
  3 y                                              1        4.5

(a) The percentage of cases audited ranged from 1% to 50% with a
median of 10%.

Table 13. Barriers to Tracking Quality Metrics in Gynecologic
Cytopathology

                                             Frequency    %

What are some of the barriers to tracking
  quality metrics in cytopathology? (N =
  474) (a)

  Not enough FTEs to track data                 248      52.3
  Many of the required monitors are not         233      49.2
    practical to monitor quality
  Too many monitor requirements                 221      46.6
  Not enough FTEs for meaningful                190      40.1
    intralaboratory comparisons
  Other (b)                                      76      16.0

Abbreviation: FTE, full-time equivalent.

(a) Multiple responses allowed.

(b) Other responses included laboratory information system
limitations (29).

Table 14. Consensus Good Laboratory Practice Statements: General
Quality Practices

                                                                %
1.   Selected metrics should be monitored individually, as
       well as globally for the laboratory.

       Agree with entire statement.                           95.92
       Only individual quality data should be monitored;       0
         no global monitoring.
       Only global laboratory monitoring; no individual        2.0
         monitoring.
       Disagree with entire statement (ie, quality data        2.04
         should not be monitored at all).

2.   Monitoring of selected metrics for individuals should
       include both cytotechnologists (CTs) and
       pathologists.

       Agree with entire statement.                           92.9
       Only cytotechnologist quality data should be            3.57
         monitored.
       Only pathologist quality data should be monitored.      1.8
       Disagree with entire statement (ie, individual          1.8
         quality data should not be monitored at all).

3.   Results of quality metrics should be shared with
       individual CTs and pathologists.

       Agree with entire statement.                           98.4
       Quality metrics should only be shared with CTs.         1.6
       Quality metrics should only be shared with              0
         pathologists.
       Disagree with the entire statement (ie, quality         0
         metrics should not be shared at all).

4.   Results of quality metrics should be shared at least
       twice a year with individuals.

       Agree with entire statement.                           65.6
       Time frame is too frequent.                             0
       Time frame is too infrequent.                           9.4
       Time frame should be left to discretion of the         25
         laboratory.
       Disagree with entire statement (ie, quality metrics     0
         should not be shared with individuals at all).

5.   Reviewing selected cases for educational purposes is
       a useful quality tool.

       Strongly agree                                         86.4
       Agree                                                  13.6
       Disagree                                                0
       Strongly disagree                                       0

6.   Additional statement: Low-volume methodologies should
       have a higher level of quality oversight/control.

       Yes, screened by designated experts.                   11.11
       Yes, automatically rescreened.                          6.7
       Both, screened by experts and automatically            20
         rescreened.
       Yes, I agree with statement, but left to discretion    53.3
         of laboratory.
       No, I do not agree with statement. Low-volume           8.9
         methodologies should not have a higher level of
         quality oversight/control.

7.   Additional statement: Newly hired primary screeners
       should be monitored, but best method(s) is unclear.

       Strongly agree                                         37
       Agree                                                  46.3
       Disagree                                               14.8
       Strongly disagree                                       1.8