
Trends in Cervical Cytology Screening and Reporting Practices: Results From the College of American Pathologists 2011 PAP Education Supplemental Questionnaire.

Cervical cytology has been the focal point for cytology innovation for the past 20 years, beginning with the adoption of the 1991 Bethesda System (TBS) for reporting cervicovaginal cytology. Since then, the practice of processing, screening, interpreting, and reporting Papanicolaou (Pap) tests has evolved to include adoption of liquid-based cervical sample collection with automated processing systems, implementation of automated screening devices, reflex testing of residual samples for human papillomavirus (HPV), and updated TBS reporting terminology that reflects our improved understanding of HPV pathogenesis in cervical cancer and complements clinical practice guidelines. There are few opportunities to assess and monitor national laboratory practices in cytology as they change over time, but the College of American Pathologists (CAP) interlaboratory comparison programs provide a unique opportunity to investigate changes in practices among participating laboratories by creating periodic surveys, referred to as supplemental questionnaires, for inclusion with the programs. The CAP also uses responses to those surveys to establish national benchmarks for diagnostic categories and other practice parameters. Completion of these surveys is voluntary. The CAP Cytopathology Committee administers regular gynecologic cytology glass-slide challenges to participating laboratories in the United States, Canada, and other countries through the Interlaboratory Comparison Program in Gynecologic Cytology and the Pap Proficiency Testing Program. Through analysis of survey responses, the CAP can evaluate emerging trends in cervical cytopathology, including the adoption of new technologies, terminology, and reporting practices by participant laboratories. In 2011, we surveyed laboratories to determine current cervical cytology practices, with an emphasis on workload, imaging systems, and the use of molecular tests in cervical cytology. Between 2012 and 2015, there were major shifts in screening, management, and prevention options, including the emergence of a US Food and Drug Administration (FDA)-approved HPV platform for primary cervical cancer screening in the United States. The data collected on 2010 laboratory practices in gynecologic cytology provide a baseline against which to compare results from future practice surveys.

MATERIALS AND METHODS

Members of the Cytopathology Committee formulated a supplemental questionnaire, examining 2010 cytology practices, to accompany the February 2011 mailing of the Pap Interlaboratory Comparison Program in Gynecologic Cytology. The survey questions addressed perceived trends in gynecologic cytology based on newly available technologies, guidelines for cervical cancer screening, and recommendations for calculating Pap test screening workload for semiautomated devices. Questions were divided into the following general categories: demographic information, terminology and reporting, workload, quality assurance using image analysis, Pap test reagents, and testing for HPV or other disease biomarkers. All of the questions were reviewed by a biostatistician (R.J.S.) for statistical soundness. Eight of the 42 questions (19%) related to demographic information; the remaining 34 covered the other categories. Participants were asked to submit data from the 2010 calendar year.

RESULTS

Overall, 1604 laboratories received the survey. Of these, 625 laboratories (39%) responded to the demographic portion of the survey, and 608 laboratories (38%) responded to the supplemental questions. Not every laboratory responded to every question. Multiple responses were allowed for many of the questions. Most respondents (40.9%; 254 of 621) were voluntary, nonprofit hospital laboratories; 16.6% (103 of 621) were regional or local independent laboratories; 8.5% (53 of 621) were proprietary hospitals; 8.4% (52 of 621) were university hospitals; 7.2% (45 of 621) were city, county, or state hospitals; 6.9% (43 of 621) were veterans' hospitals; 5.6% (35 of 621) were national or corporate laboratories; 3.1% (19 of 621) were clinical, group, or doctor's office laboratories; 2.4% (15 of 621) were military hospitals; and 0.3% (2 of 621) were public health or nonhospital laboratories. Laboratories examined an average of approximately 27 000 Pap tests per year, with the middle 80% (470 of 621) of respondents reporting 330 to 52 200 tests. The minimum number of Pap tests reported by participants was 330, and the maximum was 1 187 059.

Among 576 responses to the question, "How do you report Pap tests with obvious low-grade squamous intraepithelial lesion (LSIL) and cells suspicious for high-grade squamous intraepithelial lesion (HSIL)?" most respondents (80.9%; 466 of 576) employed the term LSIL, cannot exclude HSIL (LSIL-H) (Table 1). Multiple responses were allowed for this question. Less-common interpretations included LSIL in conjunction with atypical squamous cells, cannot exclude high-grade squamous intraepithelial lesion (ASC-H) (9.7%; 56 of 576), HSIL only (9.5%; 55 of 576), LSIL only (8.5%; 49 of 576), or ASC-H only (5.7%; 33 of 576). Most laboratories reported the presence and absence of endocervical cells/transformation zone (EC/TZ) sampling (75.3%; 445 of 591); 18.8% (111 of 591) reported only the absence of EC/TZ, whereas 1.2% (7 of 591) reported only their presence, and 4.7% (28 of 591) did not mention EC/TZ status. When laboratories were asked what percentage of their Pap tests in 2010 lacked an EC/TZ component, the 336 responding laboratories reported a mean (SD) of 15.6% (13.8%), with the 10th to 90th percentiles ranging from 4% to 30%. Most laboratories (60%; 189 of 315) estimated that value rather than recording actual data (40%; 126 of 315).

Table 2 shows the data from questions related to cytotechnologist (CT) Pap test workload limits and how the workload is measured. Few laboratories (5.1%; 29 of 573) limited CT screening to a particular time during the work shift; most (94.9%; 544 of 573) did not. Of the 531 laboratories that reported the number of gynecologic slides screened per hour by CTs, the mean (SD) was 8.9 (3.5) slides (10th-90th percentile range, 5-12 slides, with a median of 9 slides). Most respondents did not use image-assisted screening instruments (67.7%; 394 of 582), but of those who did use image assistance, 62.2% (117 of 188) reported keeping workload records that distinguished between image-assisted slides and nonimage-assisted slides, and 37.8% (71 of 188) did not distinguish between those 2 modalities. Of the laboratories that used image assistance, 89.3% (108 of 121) reported counting slides that require full manual review differently than those that do not require full manual review. Only 10.7% (13 of 121) of the respondents did not distinguish the workload limits when full manual review of image-assisted slides was performed. Of the respondents who performed image-assisted Pap tests, 74.3% (104 of 140) counted slides screened by image assistance only (without full manual review) as 0.5 of a slide when calculating the total workload. Only 21.4% of participants (30 of 140) counted image-assisted slides as 1.0 slide, and 4.3% of participants (6 of 140) reported using a value other than 0.5 or 1.0 for image-assisted slides not requiring further manual review. In 93.6% (132 of 141) of the laboratories, when a full manual review was required for an image-assisted Pap test, the CT who performed the original field-of-view (FOV) review also performed the full manual review. It was rarely reported that a different CT (2.8%; 4 of 141), a CT supervisor (2.8%; 4 of 141), or a pathologist (0.7%; 1 of 141) performed the manual review following the image-assisted FOV review. In most cases, as reported by 122 of 158 laboratories (77.2%), a full manual review was performed immediately after the initial FOV review, if required, using the imaging-system microscope. Less commonly, that review was performed immediately using a nonimaging system microscope (27.8%; 44 of 158), at a later time using a nonimaging system microscope (16.5%; 26 of 158), or, rarely, at a later time using the imaging system microscope (3.2%; 5 of 158). Multiple responses were allowed for this question, and some laboratories used more than one method.

Table 3 summarizes data about imaging systems as they pertained to quality assurance practices. The percentage of image-assisted Pap test results requiring full manual review was reported by 129 laboratories. On average, 33.8% of slides required full manual review, with the middle 80% of respondents reporting review rates from 10% to 100% and a median of 25%. The mean (SD) atypical squamous cells of undetermined significance/squamous intraepithelial lesion (ASC/SIL) ratio for image-assisted Pap tests, as reported by 133 laboratories, was 2.43 (2.81) (median, 1.6), as compared with a mean (SD) ASC/SIL ratio for nonimage-assisted Pap tests of 2.67 (3.74) (median, 1.7) as reported by 446 participants. Most respondents (74.1% [86 of 116] of the laboratories for image-assisted slides; 78.8% [312 of 396] of the laboratories for nonimage-assisted slides) used actual data for their answers; the remainder provided estimated values.

A series of questions on quality assurance activities for gynecologic cytology screening processes revealed that most respondents (96.5%; 573 of 594) did not perform rapid prescreening of Pap tests and did not use rapid rescreening of Pap tests (88.8%; 521 of 587) as quality measures. The remaining results are expressed in Table 3. We asked whether laboratories retrospectively reviewed slides on the imaging system, if available in the laboratory, to detect certain errors for quality assurance purposes. Multiple responses were allowed for this question. Most respondents (89.2%; 107 of 120) reported using the imaging system to retrospectively determine whether there were atypical cells present that were not displayed in the FOV review. Eighty-five percent (102 of 120) of the laboratories reported using the imaging system to determine whether a human locator error occurred, and 74.2% (89 of 120) used it to detect human interpretive error, whereby the cells were identified by the imaging system and marked by the CT but not deemed clinically significant. Most laboratories included imaged, negative Pap slides as part of their Clinical Laboratory Improvement Amendments of 1988 (CLIA)-mandated 10% negative review, both when slides did not require full manual review (93.6%; 147 of 157) and when they did (92.4%; 145 of 157). More than one-half of the respondents (58.8%; 100 of 170) used the imaging system to retrospectively review slides to determine whether there were abnormal cells in the field of view, while the rest (41.2%; 70 of 170) did not.

We asked several "yes" or "no" questions about reagents. Most laboratories (77.3%; 408 of 528) did not record the expiration dates of liquid-based cytology (LBC) Pap test vials that are sent to other clinics or facilities, but 55.4% (298 of 538) monitored the inventory of vials sent to these sites. Laboratories generally did not record the expiration dates of the LBC Pap test vials received (79.8%; 423 of 530). Once it was determined that a received LBC Pap test specimen was past the expiration date, 39.4% (184 of 467) rejected the specimen, 36.4% (170 of 467) performed a morphologic evaluation but included a disclaimer in the report, and 24.2% (113 of 467) performed a morphologic evaluation without a comment in the report.

Most laboratories (88.6%; 504 of 569) received requests for ancillary testing on LBC vials in addition to the Pap test. Although we asked a series of questions on ancillary testing for HPV, not all results are included. Table 4 shows responses to questions on ancillary testing. High-risk HPV testing was performed within the institution in 37.5% (215 of 573) of laboratories, with only 8.2% (47 of 573) of them performing the test in the cytology laboratory. Fifty-six percent (321 of 573) of laboratories sent out their high-risk HPV tests to a reference laboratory. Although most laboratories (57.7%; 295 of 511) did not offer additional tests other than HPV "off the vial," others reported offering the following nonmorphologic tests from the residual specimen in the LBC vial (multiple responses were allowed): Chlamydia trachomatis (42.1%; 215 of 511), Neisseria gonorrhoeae (39.5%; 202 of 511), herpes simplex virus (7.6%; 39 of 511), bacterial vaginosis (4.1%; 21 of 511), and cystic fibrosis (2.7%; 14 of 511), among others.

COMMENT

In February 2011, the 1604 laboratories enrolled in the Pap Interlaboratory Comparison Program in Gynecologic Cytology received the demographic and supplemental questionnaires. Six hundred eight laboratories (38%) responded to most of the supplemental questions included in their first mailing of the program.

Terminology and Reporting

After the basic demographic questions, the first query on the survey asked laboratories how they reported Pap tests that showed obvious LSIL cells but also contained a few cells that might represent HSIL. Most respondents (80.9%; 466 of 576) used a term that was not part of the 2001 Bethesda System (TBS 2001) for reporting cervical cytology: low-grade squamous intraepithelial lesion, cannot exclude high-grade squamous intraepithelial lesion (LSIL-H). The TBS 2001 divides SILs into LSIL, HSIL, and SIL of indeterminate grade, but does not have a specific category for Pap tests with LSILs that show a few cells suspicious for a higher-grade lesion (LSIL+). Clinical management decisions are based on the risk of finding high-grade disease on follow-up biopsies; for LSILs on cytology, the cumulative risk of an underlying cervical intraepithelial neoplasia (CIN) grade 2 or 3 is 11% to 16%, (1,2) whereas, for HSIL on Pap tests, the risk is greater than 50%. (3-6)

Although equivocal cytologic diagnoses should be minimized, the cytologic finding of a SIL of indeterminate grade does occur and needs to be addressed by laboratories to ensure proper clinical management. Most laboratories queried used an indeterminate interpretation category rather than choosing between an LSIL or HSIL interpretation for reporting rare, possible HSIL cells in a predominantly LSIL Pap test. This finding supports the need for terminology that identifies this situation, so that women with LSIL who may have a higher-grade lesion receive colposcopies and possibly biopsies to exclude HSIL. Several studies (7-11) indicate that an LSIL-H interpretation has a higher likelihood of a CIN 2 or 3 biopsy on follow-up than does LSIL alone.

A recent, large study by Zhou et al (12) concluded that LSIL-H had a distinct HPV genotype distribution that more closely mirrored the HPV genotype distribution of HSIL than that of LSIL. They found that the proportion of HPV-16 infections in women with LSIL-H (36%; 9 of 25) was comparable to their findings in HSIL (45%; 25 of 56) and was significantly different (P = .007) from LSIL (14%; 23 of 167) or ASC-H (0%; 0 of 9). Reporting the possible presence of HSIL in patients with LSIL may become increasingly important as more women with LSIL cytology are followed by colposcopic examination and repeat Pap testing without biopsies. As screening intervals increase, it is important to alert clinicians to the possibility of a high-grade lesion to ensure adequate surveillance. Although our survey showed that many laboratories use the terminology of LSIL-H in practice, there are strong arguments for maintaining a 2-tier reporting terminology that correlates with the biology of HPV infection and cervical carcinogenesis. Adding terminology, such as LSIL-H, might create a 3-tiered system that could negate the beneficial aspects of the 2-tiered TBS nomenclature. Furthermore, the 2012 American Society for Colposcopy and Cervical Pathology management guidelines (13) use LSIL and HSIL nomenclature without an intermediate category. Recent World Health Organization (14) terminology for reporting cervical lesions also adopts a 2-tier system that mirrors TBS: LSIL or HSIL.

Reporting the presence or absence of a TZ component (endocervical or metaplastic cells) is recommended in TBS (15) and may help clinicians determine whether the intended target was adequately sampled, especially in cases where few HSIL cells are identified. The clinical significance of the absence of the EC/TZ component in a cervical cytology specimen has been controversial, and management decisions for follow-up vary. In our study, most laboratories (75.3%; 445 of 591) reported on the presence and the absence of the EC/TZ component, but 18.8% (111 of 591) laboratories reported only on its absence.

CT Workload

Because of national interest in the influence of factors such as fatigue on the ability of CTs to perform accurate Pap test screening, we asked laboratories if they limited screening to certain times of the day. Most laboratories (94.9%; 544 of 573) did not limit screening during a work shift, but 23 respondents (4.0%; 23 of 573) reported limiting screening to the first part of the work shift, and 6 participants (1.0%; 6 of 573) reported limiting screening to the second part of the work shift. The impact of CT workload productivity on the accuracy of screening is well recognized. (16-19)

In their study, Elsheikh et al (20) found significant differences in the detection rates of abnormal cases by CTs. Detection rates differed according to the time of the day and the day of the week for some CTs. Even though their study demonstrated that CT screening performance generally deteriorated during the second half of the day, laboratories may be unaware of the study or unable to change current shifts to provide CTs time away from screening. Those laboratories that do limit screening to the first or second part of the work shift may do so because of other variables. For instance, CTs may primarily assist with fine-needle aspiration procurement in the afternoons, leaving only the morning hours for Pap test screening. In our survey, most laboratories did not limit screening to a particular time during the work shift, but those that did limited screening to the first portion of the work shift. Further study is required to determine what processes laboratories use to divide workload or to restrict screening time.

Only one-third of the laboratories surveyed (32.3%; 188 of 582) used an image-assisted screening instrument, so these devices are not widely used. These instruments may be used primarily by high-volume laboratories to increase efficiency. On July 27, 2010, the FDA released an alert clarifying the workload recording for semiautomated gynecologic-cytology screening devices (21) and describing how laboratories can safely calculate workload for FDA-approved, semiautomated gynecologic-cytology screening devices. According to that alert, any slide that was reviewed on an image-assisted screening device using FOVs only was assigned a workload unit of 0.5. However, when a manual review was necessary, the slide was assigned an additional full unit of 1.0, thereby raising the workload unit of that slide to 1.5. Most laboratories (89%; 108 of 121) responding to our survey appeared to be aware of this alert and were counting the slides that required full manual review differently than those that did not. However, although three-fourths (74.3%; 104 of 140) counted image-assisted slide reviews that did not require full manual review as 0.5 unit, most of the other respondents (21.4%; 30 of 140) counted those image-assisted slides as 1.0 unit. Some laboratories may find it easier to count slides as one unit, regardless of the work required, when calculating CT workload. Respondents may also have misread the survey question because we did not ask a similar question about image-assisted slides that do require full manual review. The FDA approval for the ThinPrep Imaging System (Hologic, Inc, Marlborough, Massachusetts) allows a maximum screening volume of 200 slides in 24 hours, screened in no less than an 8-hour workday. (22)

The BD FocalPoint GS Imaging System (Becton, Dickinson and Company, Franklin Lakes, New Jersey), an automated, guided imaging system that screens conventional Pap smears and BD SurePath Pap tests, received FDA premarket approval for an individual CT to use the device to screen no more than 170 slides in 24 hours. (23) For those slides to count as 0.5 unit, only the FOV review may be examined. Cytotechnologists are still restricted to 200 SurePath slides in 24 hours, screened in no less than an 8-hour workday, as allowed by CLIA for slides that have material covering less than one-half of the slide surface. Individual state regulations supersede CLIA regulations when they impose more-stringent guidelines limiting CT workload.
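
To make the unit-counting rules described in the preceding two paragraphs concrete, below is a minimal sketch in Python of a daily workload tally. It assumes the 0.5-unit and 1.5-unit weights from the 2010 FDA alert, a weight of 1.0 for a fully manually screened slide, and a placeholder daily limit of 100 workload units; the function names and example counts are hypothetical and are not part of the survey or of any device's labeling.

    def workload_units(fov_only: int, fov_plus_manual: int, manual_only: int) -> float:
        """Pap test workload units for one cytotechnologist in one day.

        Assumed weights, following the FDA alert summarized above:
          image-assisted slide, FOV review only ............... 0.5 unit
          image-assisted slide plus full manual review ........ 1.5 units (0.5 + 1.0)
          manually screened (nonimaged) slide ................. 1.0 unit
        """
        return 0.5 * fov_only + 1.5 * fov_plus_manual + 1.0 * manual_only

    def within_limit(units: float, daily_limit: float = 100.0) -> bool:
        # daily_limit is a placeholder; the applicable CLIA, FDA, or state limit
        # for the specific device and jurisdiction should be substituted.
        return units <= daily_limit

    # Hypothetical day: 120 FOV-only slides and 30 slides needing full manual review.
    units = workload_units(fov_only=120, fov_plus_manual=30, manual_only=0)
    print(units, within_limit(units))  # 105.0 False -> exceeds the assumed 100-unit limit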

According to our data, CTs screened a mean (SD) of 8.9 (3.5) slides per hour, well within the CLIA requirements. It is not clear how laboratories determined that number, and our results were compiled from laboratories that count workloads in different ways. Of interest, the maximum number of gynecologic-cytology slides reportedly screened in 1 hour by one CT was 68! One can only hope that answer was an erroneous entry. Because of established guidelines and regulations, CT Pap test workload records should be kept to distinguish between image-assisted slides and nonimage-assisted slides. Approximately two-thirds of laboratories (62.2%; 117 of 188) that did use image-assisted screening instruments kept workload records distinguishing between image-assisted and nonimage-assisted slides. Most often, a full manual review was performed by the same CT who performed the FOV review (93.6%; 132 of 141), immediately following the initial review and using the same imaging system microscope (77.2%; 122 of 158). This suggests that CTs were assigned responsibility for particular slides and were expected to see the case to its completion. Another model would allow laboratories to have CTs performing only the FOV review on an imaging system and then passing the potentially abnormal slides to another CT for full manual review, but based on our data, this approach was not popular. Some CTs appeared to prefer to use a nonimaging microscope for full manual review because 44.3% (70 of 158) responded that they reviewed those slides immediately or at a later time using a nonimaging system microscope. Multiple responses were permitted to this question, indicating that some CTs used different methods in the same laboratory. Laboratories that allow CTs to perform a full manual review on a separate, non-ThinPrep Imaging System microscope would not be in compliance with the FDA-approved method of review and would have to separately validate that process. In some laboratories, the imaging microscopes are separate from the CT's work space, and those CTs may prefer to use the microscope in their own work space for manual review.

For the laboratories surveyed, a mean (SD) of 33.8% (28.4%) of the image-assisted Pap tests required a full manual review. That number included both abnormal Pap tests and mandatory 10% quality assurance review slides. The FocalPoint GS Imaging System received FDA premarket approval to allow 25% of imaged conventional or SurePath slide results to be released as normal without further human review; of the remaining 75%, at least 15% must receive full manual review for quality control purposes. (21) There are no national or federal guidelines established for the number of ThinPrep Imaging System slides that must receive full manual review if they are otherwise interpreted as negative on initial FOV review. However, negative imaged slides would still be subject to the mandatory CLIA requirement for 10% prospective rescreening of negative slides. (24) Most laboratories included imaged slides that required full manual review (92.4%; 145 of 157) and that did not require full manual review (93.6%; 147 of 157) in their CLIA-mandated 10% negative review (Table 3).

Quality Assurance

Rapid prescreening has not taken hold in gynecologic cytology in the United States. Only 3.5% (21 of 594) of those surveyed used rapid prescreening, and slightly more respondents (11.2%; 66 of 587) rapidly rescreened Pap tests, but most laboratories did neither. Rapid prescreening (25) and rapid rescreening (26) have been proposed as cost-effective processes that could replace the 10% random review of cases negative for intraepithelial lesion or malignancy and prove more effective at detecting missed HSIL. The 120 laboratories that used imaging systems for quality assurance purposes were inclined to use them for retrospective review of gynecologic cytology slides to detect imaging (89.2%; 107 of 120), human locator (85%; 102 of 120), and human interpretive (74.2%; 89 of 120) errors. Imaging errors occur when atypical cells are present on the slide but are not presented in the FOVs. Human locator errors occur when cells are presented in the FOV but are not recognized and marked as significant by the CT. Human interpretive errors occur when the atypical cells are present in the FOV, recognized, and marked but are interpreted as not significant. More than one-half of laboratories (58.8%; 100 of 170) used the imaging system, when applicable, on cases selected for retrospective review to determine whether there were abnormal cells in the FOV. This is a powerful and effective method of screening for instrument error and can help cytology professionals identify the human-instrument interfaces that are most prone to error. For instance, there may be abnormal cells, such as koilocytes, that fall between the FOVs. Reviewing negative Pap tests in cases with a CIN 1 biopsy result, using the imaging instrument to demonstrate selected FOVs, may reveal that the most diagnostically significant cells were not included in the FOVs. When professionals become aware of specific instrument limitations, it alerts them to potential errors and encourages additional diligence or processes to prevent those errors. Of 157 laboratories, most included negative imaged Pap tests in the CLIA-mandated 10% negative review both when they did not require full manual review (93.6%; 147 of 157) and when they did (92.4%; 145 of 157). This probably occurs because the selection of cases for 10% prospective review must include slides selected randomly from negative cases. (24)

The mean (SD) percentage of image-assisted gynecologic slides that had full manual review was 33.8% (28.4%) for the 129 responding laboratories with imaging systems. The mean ASC/SIL ratio for image-assisted cases was 2.43, with 75% (100 of 133) of laboratories reporting ASC/SIL ratios under 2, whereas the mean (SD) for nonimage-assisted Pap test slides from 446 laboratory respondents was slightly higher (2.67 [3.74]), with 90% (401 of 446) of laboratories reporting ASC/SIL ratios below 4.1. The difference between these two means was not statistically significant (P = .60; t test). Whether slides were imaged or nonimaged did not have an effect on the mean ASC/SIL ratio for laboratories, but these results may be flawed because of the inclusion of both actual and estimated data and the inclusion of data from different imaging systems that select abnormal cells through different algorithms. Renshaw et al (27) investigated the ASC/SIL ratio as a monitor of a CT's screening sensitivity when correlated with other statistics and reported that laboratories using location-guided screening were less likely to have CTs with ASC/SIL ratios less than 1.5 (1 of 20; 5%) than those without imaging systems. The CT ASC/SIL ratios did not correlate with volume of slides, workload, or preparation type (conventional versus LBC). Our data suggest that the ASC/SIL ratio is not substantially affected by the use of imaging systems, but more-robust studies in this area would be necessary to confirm this hypothesis because of the variables discussed above.
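
For readers who monitor this metric locally, the ASC/SIL ratio is simply the number of ASC-US interpretations divided by the number of SIL (LSIL plus HSIL) interpretations over a reporting period. The short Python sketch below shows one way per-laboratory ratios could be compared between imaged and nonimaged groups with a two-sample t test, analogous to the comparison reported above; the data values are invented for illustration, and this is not the analysis code used for the survey.

    from statistics import mean
    from scipy import stats  # SciPy's two-sample (Welch) t test

    def asc_sil_ratio(n_asc: int, n_sil: int) -> float:
        """ASC-US interpretations divided by SIL (LSIL + HSIL) interpretations."""
        return n_asc / n_sil if n_sil else float("nan")

    # Hypothetical per-laboratory (ASC-US, SIL) counts for the two groups (illustrative only).
    imaged = [asc_sil_ratio(a, s) for a, s in [(60, 50), (48, 30), (90, 45), (31, 26), (44, 20)]]
    nonimaged = [asc_sil_ratio(a, s) for a, s in [(55, 50), (68, 40), (100, 40), (52, 13), (33, 30)]]

    t_stat, p_value = stats.ttest_ind(imaged, nonimaged, equal_var=False)
    print(round(mean(imaged), 2), round(mean(nonimaged), 2), round(p_value, 2))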

Reagents

Four survey questions dealt with LBC reagent-collection vials for cervical cytology relating to inventory control and to the fixative expiration date printed on each vial as provided by the company. These questions were posed because items relating to reagent storage, labeling, expiration, and lot verification have recently been added to the CAP Accreditation Program checklists. (28)

We explored whether laboratories monitored the inventory of the LBC vials sent to clinics and other facilities sending Pap test samples to the laboratory. Inventory control, as it relates to the stability of reagents, is mandated by CLIA (29) and is addressed by multiple declarative statements (COM.30300, COM.30350, COM.30400, and COM.30450) in the CAP Accreditation Program All Common Checklist. (28) These statements address reagent labeling, storage, expiration date, and new reagent-lot verification, respectively. Their intent is to ensure that laboratories have established procedures to maintain specimen integrity from the clinical services submitting specimens and to maintain the stability of the reagents necessary to perform tests within the laboratory. Most laboratories (55.4%; 298 of 538) responded that they did monitor the inventory of vials that they sent to clinics and facilities. This may be a service provided by the laboratory to ensure that clinics are appropriately stocked with necessary submission receptacles, and if laboratories included the cost of the vials in their test price, it may also be a prudent mechanism to prevent waste of resources. It would not be cost effective to allow clinics to stock dozens of unused vials that expire and are never submitted with a sample. Laboratories that did not provide LBC vials, but instead required clinics to purchase their own, would have had no incentive to monitor clinic inventory.

Three questions addressed the expiration date on the collection vials. ThinPrep vials have a shelf life of 2 years, (30) and the shelf life of BD SurePath vials is 3 years. (31) The CAP All Common Checklist, (28) statement COM.30400, requires confirmation that all reagents are used within their indicated expiration date. Most laboratories (77.3%; 408 of 528) did not record expiration dates of LBC Pap test vials that were sent to clinics, even though that would be a reasonable action to ensure that clinics did not receive expired reagent. It is not clear whether the remaining laboratories did not monitor expiration dates at all, did not send expired vials to clinics, did not record the dates, or simply did not send vials to clinics and, therefore, did not need to monitor reagent expiration dates on the vials. We did not provide a "not applicable" response for these questions. Remarkably, most laboratories (79.8%; 423 of 530) did not record the expiration dates of LBC vials that they received from clinical practices. Some of them did reject the specimen outright (39.4%; 184 of 467) if the vial had expired or provided a disclaimer about expiration in the final report (36.4%; 170 of 467), but one-quarter of laboratories (24.2%; 113 of 467) simply reported the results. This set of findings implies that laboratories were aware that a specimen was submitted in a vial that had expired but did not record the number of those vials or where they were received from, even though they may have rejected the specimen or commented on the specimen status in the report. It may be that CTs and pathologists noticed that the morphology of cells was not impaired in expired vials and, therefore, believed that it was a safe practice to report morphologic findings. Only 22.7% (120 of 528) of the laboratories recorded the expiration date of vials sent to clinical practices, and even fewer recorded the expiration date on the patient sample vials received from the clinics (20.2%; 107 of 530). This is significant because the shelf life of a vial containing a patient sample is considerably shorter than that of an unused vial. For example, the shelf life of a SurePath vial with a patient sample is 4 weeks at room temperature or 6 months at -17°C to -13°C (2°F to 8°F), (31) and it is 6 weeks from collection at room temperature for samples in ThinPrep medium. (30) Therefore, a specimen submitted in a vial with expired reagent would not be considered viable, and a patient sample whose post-collection stability window extends beyond the vial's printed expiration date would have a shorter usable interval than expected. We did not ask about specimen expiration as it relates directly to HPV testing, but that is one area where adherence to the manufacturer's expiration dates may be critical. Some studies (32-35) show that cervical samples remain viable for HPV testing beyond the manufacturer's recommendations for some preservatives and HPV testing types, but testing of expired specimens under those circumstances should be approached with caution. Castle et al (34) showed that, after several years of storage in a methanol-based preservative at ambient temperature, the nuclear detail and β-globin DNA of cervical cells deteriorated even though HPV DNA detection by Hybrid Capture 2 (Qiagen, Valencia, California) was not affected. Further studies would be helpful to provide clear guidance on specimen stability in other media. Laboratories seeking to extend specimens past recommended expiration dates should internally validate specimen viability before testing.
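
To illustrate how the reagent expiration date and the shorter post-collection stability window interact, the following is a minimal, hypothetical acceptance check written in Python using the shelf lives cited above. The stability values and the function itself are assumptions for illustration only; they should be confirmed against current manufacturer package inserts and do not represent a validated laboratory acceptance procedure.

    from datetime import date, timedelta

    # Post-collection stability windows cited above (assumptions to be confirmed
    # against the current manufacturer package inserts).
    POST_COLLECTION_STABILITY = {
        "ThinPrep": timedelta(weeks=6),                  # 6 weeks at room temperature
        "SurePath_room_temperature": timedelta(weeks=4),
        "SurePath_cold_storage": timedelta(days=182),    # about 6 months at the cited cold-storage range
    }

    def specimen_acceptable(medium: str, vial_expiration: date,
                            collected: date, received: date) -> bool:
        """Reject if the vial reagent had expired at collection or the stability window has lapsed."""
        if collected > vial_expiration:
            return False  # sample was collected in a vial with expired reagent
        return received - collected <= POST_COLLECTION_STABILITY[medium]

    # Hypothetical example: ThinPrep sample collected December 1, received January 5.
    print(specimen_acceptable("ThinPrep", vial_expiration=date(2011, 6, 1),
                              collected=date(2010, 12, 1), received=date(2011, 1, 5)))  # True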

Ancillary Testing

Ancillary testing, primarily testing for high-risk HPV, has become common practice, with 88.6% (504 of 569) of laboratories receiving requests for ancillary tests on LBC Pap test vials. The most common additional out-of-the-vial tests offered were for Chlamydia trachomatis (42%; 215 of 511) and Neisseria gonorrhoeae (40%; 202 of 511).

Only a few laboratories in our study performed testing for herpes, bacterial vaginosis, cystic fibrosis, hepatitis C virus, human immunodeficiency virus, Epstein-Barr virus, or genomic amplification of the human telomerase RNA component (TERC) gene on specimens remaining in LBC vials (Table 4). TERC amplification (extra copies of chromosome arm 3q carrying additional copies of the telomerase gene) can be detected by fluorescence in situ hybridization on LBC specimens and has been associated with invasive cervical carcinoma, (36) but that test was offered by only one laboratory. These findings indicate that, as a general rule, health care providers are not requesting studies other than HPV and other sexually transmitted diseases from residual Pap test specimens.

In conclusion, 2010 practice patterns in gynecologic cytology among participants of the CAP Pap Proficiency Testing Program and Pap Education Program showed that laboratories using TBS 2001 for reporting results reported the presence and absence of the endocervical component on Pap tests and had adopted the term LSIL-H. Automated screening with imaging devices had not become common, but where such devices existed, most laboratories made use of them for quality-improvement purposes. Rapid prescreening and rescreening had not been significantly adopted for quality assurance in cytology in the United States. Additionally, most laboratories did not limit screening to a particular time during a work shift. Laboratories usually counted imaged Pap tests differently than manually screened Pap tests for workload capture, but confusion still existed about the proper means of reporting workload on imaged cases. In general, laboratories were not very vigilant about LBC vial expiration, either before or after specimen collection. In the past decade, there have been significant changes in cervical cancer screening and prevention practice in the United States and internationally. These data will provide a useful baseline for future assessment of practice patterns in laboratory cervical cancer screening.

References

(1.) Burks HR, Smith KM, Wentzensen N, et al. Risk of cervical intraepithelial neoplasia 2+ among women with a history of previous treatment for cervical intraepithelial neoplasia: ASCUS and LSIL Pap smears after treatment. J Low Genit Tract Dis. 2011; 15(1):11-14.

(2.) Cox JT, Schiffman M, Solomon D; ASCUS-LSIL Triage Study (ALTS) Group. Prospective follow-up suggests similar risk of subsequent cervical intraepithelial neoplasia grade 2 or 3 among women with cervical intraepithelial neoplasia grade 1 or negative colposcopy and directed biopsy. Am J Obstet Gynecol. 2003; 188(6):1406-1412.

(3.) Jones BA, Novis DA. Cervical biopsy-cytology correlation: a College of American Pathologists Q-Probes study of 22 439 correlations in 348 laboratories. Arch Pathol Lab Med. 1996; 120(6):523-531.

(4.) Castle PE, Cox JT, Schiffman M, Wheeler CM, Solomon D. Factors influencing histologic confirmation of high-grade squamous intraepithelial lesion cytology. Obstet Gynecol. 2008; 112(3):637-645.

(5.) Evans MF, Adamson CS, Papillo JL, St John TL, Leiman G, Cooper K. Distribution of human papillomavirus types in ThinPrep Papanicolaou tests classified according to the Bethesda 2001 terminology and correlations with patient age and biopsy outcomes. Cancer. 2006; 106(5):1054-1064.

(6.) Nogara PR, Manfroni LA, da Silva MC, Consolaro ME. The "see and treat" strategy for identifying cytologic high-grade precancerous cervical lesions among low-income Brazilian women. Int J Gynaecol Obstet. 2012; 118(2):103-106.

(7.) Nasser SM, Cibas ES, Crum CP, Faquin WC. The significance of the Papanicolaou smear diagnosis of low-grade squamous intraepithelial lesion, cannot exclude high-grade squamous intraepithelial lesion. Cancer. 2003; 99(5):272-276.

(8.) Elsheikh TM, Kirkpatrick JL, Wu HH. The significance of "low-grade squamous intraepithelial lesion, cannot exclude high-grade squamous intraepithelial lesion" as a distinct squamous abnormality category in Papanicolaou tests. Cancer. 2006; 108(5):277-281.

(9.) Owens CL, Moats DR, Burroughs FH, Gustafson KS. "Low-grade squamous intraepithelial lesion, cannot exclude high-grade squamous intraepithelial lesion" is a distinct cytologic category: histologic outcomes and HPV prevalence. Am J Clin Pathol. 2007; 128(3):398-403.

(10.) Shidham VB, Kumar N, Narayan R, Brotzman GL. Should LSIL with ASC-H (LSIL-H) in cervical smears be an independent category?: a study on SurePath specimens with review of literature. Cytojournal. 2007; 4:7.

(11.) Ince U, Aydin O, Peker O. Clinical importance of "low-grade squamous intraepithelial lesion, cannot exclude high-grade squamous intraepithelial lesion (LSIL-H)" terminology for cervical smears: 5-year analysis of the positive predictive value of LSIL-H compared with ASC-H, LSIL, and HSIL in the detection of high-grade cervical lesions with a review of the literature. Gynecol Oncol. 2011; 121(1):152-156.

(12.) Zhou H, Schwartz MR, Coffey D, Smith D, Mody DR, Ge Y. Should LSIL-H be a distinct cytology category?: a study on the frequency and distribution of 40 human papillomavirus genotypes in 808 women. Cancer Cytopathol. 2012; 120(6):373-379.

(13.) Massad S, Einstein MH, Huh WK, et al; 2012 ASCCP Consensus Guidelines Conference. 2012 Updated consensus guidelines for the management of abnormal cervical cancer screening tests and cancer precursors. J Low Genit Tract Dis. 2013; 17(5)(suppl 1):S1-S27.

(14.) Stoler M, Bergeron C, Colgan TJ, et al. Squamous cell tumours and precursors. In: Kurman RJ, Carcangiu ML, Herrington CS, Young RH, eds. WHO Classification of Tumours of Female Reproductive Organs. 4th ed. Lyon, France: IARC Press; 2014:172-181. World Health Organization Classification of Tumours; vol 6.

(15.) Solomon D, Nayar R, eds. The Bethesda System for Reporting Cervical Cytology: Definitions, Criteria, and Explanatory Notes. 2nd ed. New York, NY: Springer-Verlag; 2004.

(16.) Levi AW, Galullo P, Gordy K, et al. Increasing cytotechnologist workload above 100 slides per day using the BD FocalPoint GS Imaging System negatively affects screening performance. Am J Clin Pathol. 2012; 138(6):811-815.

(17.) Elsheikh TM, Kirkpatrick JL, Cooper MK, Johnson ML, Hawkins AP, Renshaw AA. Increasing cytotechnologist workload above 100 slides per day using the ThinPrep imaging system leads to significant reductions in screening accuracy. Cancer Cytopathol. 2010; 118(2):75-82.

(18.) Renshaw AA, Elsheikh TM. Predicting screening sensitivity from workload in gynecologic cytology: a review. Diagn Cytopathol. 2011; 39(11):832-836.

(19.) Ellis K, Renshaw AA, Dudding N. Individual estimated sensitivity and workload for manual screening of SurePath gynecologic cytology. Diagn Cytopathol. 2012; 40(2):95-97.

(20.) Elsheikh TM, Kirkpatrick JL, Fischer D, Herbert KD, Renshaw AA. Does the time of day or weekday affect screening accuracy?: a pilot correlation study with cytotechnologist workload and abnormal rate detection using the ThinPrep Imaging System. Cancer Cytopathol. 2010; 118(1):41-46.

(21.) US Food and Drug Administration. How laboratories can safely calculate workload for FDA-approved semi-automated gynecologic cytology screening devices. Medical Device-Tips and Articles on Device Safety resources Web page. http://www.fda.gov/MedicalDevices/Safety/AlertsandNotices/TipsandArticlesonDeviceSafety/ucm220292.htm. Published July 27, 2010. Accessed December 28, 2012.

(22.) US Food and Drug Administration. Summary of safety and effectiveness data. FDA premarket approval for the ThinPrep Imaging System. FDA/CEDR resources Web page. http://www.accessdata.fda.gov/cdrh_docs/pdf2/P020002b.pdf. Published June 6, 2003. Accessed December 18, 2012.

(23.) US Food and Drug Administration. Summary of safety and effectiveness data. FDA premarket approval for the BD FocalPoint GS Imaging System. FDA/CEDR resources Web page. http://www.accessdata.fda.gov/cdrh_docs/pdf/P950009S008b.pdf. Published December 3, 2008. Accessed December 18, 2012.

(24.) Department of Health and Human Services, Health Care Financing Administration. Clinical Laboratory Improvement Amendments of 1988--standard: cytology: final rule. Fed Regist. 1992; 57(40):7146. Codified at 42 CFR [section]493.1274(c)(1)(i)(C)(ii).

(25.) Djemli A, Khetani K, Auger M. Rapid prescreening of Papanicolaou smears: a practical and efficient quality control strategy. Cancer. 2006; 108(1):21-26.

(26.) Lee BC, Lam SY, Walker T. Comparison of false negative rates between 100% rapid review and 10% random full rescreening as internal quality control methods in cervical cytology screening. Acta Cytol. 2009; 53(3):271-276.

(27.) Renshaw AA, Auger M, Birdsong G, et al. ASC/SIL ratio for cytotechnologists: a survey of its utility in clinical practice. Diagn Cytopathol. 2010; 38(3):180-183.

(28.) College of American Pathologists. Master All Common Checklist: CAP Accreditation Program. CAP Reference, Resources and Publications Web page. http://www.cap.org/apps/cap.portal?_nfpb=true&OnlineChecklistController_8_6_actionOverride=%2Fportlets%2FOnlineChecklist%2FprepareChecklist&_windowLabel=OnlineChecklistController_8_6&_pageLabel=customChecklists_page. Published September 25, 2012. Accessed December 31, 2012.

(29.) Department of Health and Human Services, Health Care Financing Administration. Clinical Laboratory Improvement Amendments of 1988--standard: test systems, equipment, instruments, reagents, materials, and supplies: final rule. Fed Regist. 1992; 57(40):7146. Codified at 42 CFR [section]493.1252(b).

(30.) ThinPrep 2000 System [product insert]. Marlborough, MA: Hologic, Inc; 2011.

(31.) SurePath Collection [product insert]. Burlington, NC: TriPath Imaging, Inc; 2011.

(32.) Hardie A, Moore C, Patnick J, et al. High-risk HPV detection in specimens collected in SurePath preservative fluid: comparison of ambient and refrigerated storage. Cytopathology. 2009; 20(4):235-241.

(33.) Dixon EP, Lenz KL, Doobay H, Brown CA, Malinowski DP, Fischer TJ. Recovery of DNA from BD SurePath cytology specimens and compatibility with the Roche AMPLICOR Human Papillomavirus (HPV) test. J Clin Virol. 2010; 48(1):31-35.

(34.) Castle PE, Solomon D, Hildesheim A, et al. Stability of archived liquid-based cervical cytologic specimens. Cancer. 2003; 99(2):89-96.

(35.) Campos EA, Simoes JA, Rabelo-Santos SH, et al. Recovery of DNA for the detection and genotyping of human papillomavirus from clinical cervical specimens stored for up to 2 years in a universal collection medium with denaturing reagent. J Virol Methods. 2008; 147(2):333-337.

(36.) Heselmeyer-Haddad K, Sommerfeld K, White NM, et al. Genomic amplification of the human telomerase gene (TERC) in pap smears predicts the development of cervical cancer. Am J Pathol. 2005; 166(4):1229-1238.

Barbara A. Crothers, DO; Teresa M. Darragh, MD; Rosemary H. Tambouret, MD; Ritu Nayar, MD; Guliz A. Barkan, MD; Chengquan Zhao, MD; Christine Noga Booth, MD; Vijayalakshmi Padmanabhan, MD; Z. Laura Tabatabai, MD; Rhona J. Souers, MS; Nicole Thomas, MPH, CT(ASCP); David C. Wilbur, MD; Ann T. Moriarty, MD

Accepted for publication April 30, 2015.

Published as an Early Online Release June 5, 2015.

From the Department of Pathology and Area Laboratory Services, Walter Reed National Military Medical Center, Bethesda, Maryland (Dr Crothers); the Pathology Cytology Laboratory, Mount Zion Medical Center, University of California, San Francisco (Dr Darragh); the Department of Pathology, Massachusetts General Hospital, Boston (Drs Tambouret and Wilbur); the Department of Cytopathology, Northwestern University Medical School, Chicago, Illinois (Dr Nayar); the Department of Pathology, Loyola University Medical Center, Maywood, Illinois (Dr Barkan); the Department of Pathology, Magee-Women's Hospital, Pittsburgh, Pennsylvania (Dr Zhao); the Department of Anatomic Pathology, Cleveland Clinic Foundation, Cleveland, Ohio (Dr Booth); the Pathology Department, Dartmouth Hitchcock Medical Center, Lebanon, New Hampshire (Dr Padmanabhan); the Department of Pathology, University of California and the Veteran's Administration Medical Center, San Francisco (Dr Tabatabai); the Department of Biostatistics (Ms Souers) and the Cytopathology Committee (Ms Thomas), College of American Pathologists, Northfield, Illinois; and the Department of Pathology, AmeriPath Indiana, Indianapolis (Dr Moriarty).

The authors have no relevant financial interest in the products or companies described in this article.

The opinions or assertions contained herein are the private views of the authors and do not reflect the official policy of the Department of the Army, the Department of Defense, or the US government.

Reprints: Barbara A. Crothers, DO, Department of Pathology and Area Laboratory Services, Walter Reed National Military Medical Center, 8901 Wisconsin Ave, Bethesda, MD 20889-5600 (e-mail: Barbara.a.crothers.mil@mail.mil).
Table 1. Responses to Reporting Practices in Gynecologic Cytology

Question                                     Response
                                             No. (%)

How do you report Papanicolaou tests with
obvious LSIL and cells suspicious for
HSIL? (a) (n = 576)

  LSIL, cannot exclude HSIL                 466 (80.9)
  LSIL and ASC-H                             56 (9.7)
  HSIL                                       55 (9.5)
  LSIL                                       49 (8.5)
  ASC-H                                      33 (5.7)

Does your laboratory report endocervical
cells/transformation zone sampling?
(n = 591)

  Report presence and absence               445 (75.3)
  Report only absence                       111 (18.8)
  Do not report                              28 (4.7)
  Report only presence                       7 (1.2)

Annual Tests With Absence of Endocervical Cells/Transformation
Zone Sampling, %

No.   Mean   Minimum   Maximum   10th Percentile   25th Percentile   50th Percentile   75th Percentile   90th Percentile
336   15.5      0         92            4                 6                13                20                30

Abbreviations: ASC-H, atypical squamous cells, cannot exclude
high-grade squamous intraepithelial lesion; HSIL, high-grade squamous
intraepithelial lesion; LSIL, low-grade squamous intraepithelial
lesion.

(a) Multiple responses were allowed for this question.

Table 2. Responses to Gynecologic Cytology Workload Practices

Question                                            Response, No. (%)

Does your laboratory limit gynecologic cytology
screening to: (n = 573)

  We do not limit screening within the work shift      544 (94.9)

  First part of the work shift                          23 (4.0)

  Second part of the work shift                          6 (1.0)

Do you keep workload records that distinguish
between image-assisted slides and nonimage-
assisted slides? (n = 582)

  We do not use an image-assisted screening            394 (67.7)
    instrument

  Yes                                                  117 (20.1)

  No                                                    71 (12.2)

For image-assisted slides, do you count slides
that require full manual review differently than
those not requiring full manual review? (n = 121)

  Yes                                                  108 (89.3)

  No                                                    13 (10.7)

For calculating workload, what value do you give
image-assisted slides (not requiring full manual
review)? (n = 140)

  0.5 (1/2 slide)                                      104 (74.3)

  1.0                                                   30 (21.4)

  1.5                                                    1 (0.7)

  Other                                                  5 (3.6)

When a full manual review is required for an
image-assisted gynecologic cytology slide, the
manual review is usually done by: (n = 141)

  The same cytotechnologist who performed the FOV      132 (93.6)
    review
  A different cytotechnologist                           4 (2.8)

  A supervisory level cytotechnologist (>3 y             4 (2.8)
    experience)
  A pathologist                                          1 (0.7)

If a full manual review is required for an image-
assisted gynecologic cytology slide, when is it
reviewed? (a) (n = 158)

  Immediately following the initial FOV with the       122 (77.2)
    imaging system microscope
  Immediately following the FOV with a nonimaging       44 (27.8)
    system microscope
  At a later time with a nonimaging system              26 (16.5)
    microscope
  At a later time on the imaging system                  5 (3.2)
    microscope

Gynecologic Cytology Slides Screened/Hour By Cytotechnologist

No.   Mean   Minimum   Maximum   10th Percentile   25th Percentile   50th Percentile   75th Percentile   90th Percentile
531    8.9      0         68            5                 7                 9                10                12

Abbreviation: FOV, field of view.

(a) Multiple responses were allowed for this question.

Table 3. Quality Assurance Practices Related to Imaging Systems

Image-assisted gynecologic cytology slides requiring full manual review, %

No.   Mean   Minimum   Maximum   10th Percentile   25th Percentile   50th Percentile   75th Percentile   90th Percentile
129   33.8      0        100           10                15                25                40               100

ASC/SIL ratio for image-assisted gynecologic cytology slides

No.   Mean   Minimum   Maximum   10th Percentile   25th Percentile   50th Percentile   75th Percentile   90th Percentile
133   2.43      0       44.0          1.0               1.2               1.6               2.0               3.0

ASC/SIL ratio for nonimage-assisted gynecologic cytology slides

No.   Mean   Minimum   Maximum   10th Percentile   25th Percentile   50th Percentile   75th Percentile   90th Percentile
446   2.67      0       60.0          0.6               1.2               1.7               2.5               4.1

Question                                             Response,
                                                     No. (%)

For quality assurance purposes, do you
  retrospectively review slides on the imaging
  system to detect the following types of
  errors? (a) (n = 120)
  Imager (cells not in FOV)                          107 (89.2)
  Human locator (in FOV, not marked)                 102 (85.0)
  Human interpretative (marked, not significant)      89 (74.2)
For the CLIA-mandated 10% negative review, do you
  include imaged cases that are negative and (a)
  (n = 157)
  Did not need full manual review?                   147 (93.6)
  Required full manual review?                       145 (92.4)
Do you look at retrospective cases with the imager
  to determine if there were abnormal cells in the
  FOV? (n = 170)
  Yes, if applicable                                 100 (58.8)
  No                                                  70 (41.2)

Abbreviations: ASC/SIL, atypical squamous cells/squamous
intraepithelial lesion; CLIA, Clinical Laboratory Improvement
Amendments of 1988; FOV, field of view.

(a) Multiple responses were allowed for these questions.

Table 4. Responses to Ancillary Testing

Question                                           Response,
                                                    No. (%)

Where is high-risk HPV testing performed?
  (n = 573)
  Sent out to a reference/referral laboratory      321 (56.0)
  Within the institution, not in the cytology
    laboratory                                     168 (29.3)
  In the cytology laboratory                        47 (8.2)
  Not performed                                     37 (6.5)
Other than HPV testing, what additional
  nonmorphologic tests does your laboratory
  perform from a liquid-based cytology vial? (a)
  (n = 511)
  No additional tests offered                      295 (57.7)
  Chlamydia trachomatis                            215 (42.1)
  Neisseria gonorrhoeae                             202 (39.5)
  Herpes simplex virus                              39 (7.6)
  Bacterial vaginosis                               21 (4.1)
  Cystic fibrosis                                   14 (2.7)
  Hepatitis C virus                                 8 (1.6)
  Human immunodeficiency virus                      7 (1.4)
  Epstein-Barr virus                                 3 (0.6)
  TERC (gain of 3q) analysis                        1 (0.2)

Abbreviations: HPV, human papillomavirus; TERC, telomerase
ribonucleic acid component.

(a) Multiple responses were allowed for this question.