
Pathologic diagnostic correlation with breast imaging findings: a College of American Pathologists Q-Probes study of 48 institutions.

Pathologists and radiologists are integral members of the multidisciplinary team necessary for optimal management of patients with breast carcinoma. Because an increasing number of image-guided biopsies are performed for abnormalities detected by breast imaging, cooperation between radiologists and pathologists in establishing concordance, or correlation, between radiologic and pathologic findings is essential for appropriate patient management. To this end, radiologists use the Breast Imaging Reporting and Data System (BIRADS) to categorize breast imaging abnormalities and to standardize radiology reports. (1,2) The criteria for performing an image-guided biopsy of a breast lesion are based on the likelihood that the lesion is malignant; lesions categorized as suspicious abnormalities (BIRADS category 4) or highly suggestive of malignancy (BIRADS category 5) are most often biopsied. However, there continues to be some practice variation in the interpretation of radiologic findings. (3-5) Adding to this variability, breast imaging and biopsy are also performed by nonradiologists (eg, surgeons). Such variability in practice may affect the pathologist's ability to correlate pathologic and radiologic findings.

Correlation, or concordance, is present when the pathologic findings provide an acceptable explanation of the breast-imaging features. (1) Discordance, or lack of correlation, is present when the histologic findings do not provide an acceptable or satisfactory explanation for the breast-imaging features. Imaging-pathologic discordance for stereotactic or ultrasound-guided biopsies has been reported to range from approximately 1% to 8% and is slightly greater for magnetic resonance imaging-guided biopsies. (1,6,7) Given that up to 24% of discordant cases may harbor carcinoma, (1) the pathologic findings must adequately explain the radiologic impressions to ensure appropriate sampling and to prevent delay in diagnosis and treatment. The objectives of this study were to (1) evaluate the rates of radiologic-pathologic correlation in breast needle core biopsies, (2) evaluate laboratory and radiology practices associated with greater correlation rates, and (3) determine the rates at which a lack of radiologic-pathologic correlation is documented in pathology reports.

DESIGN

The study was offered and conducted as a College of American Pathologists voluntary Q-Probes program, the basic mechanism of which has been previously described. (8) Participants in this study, conducted in the fall of 2010, retrospectively reviewed 30 consecutive, initial, diagnostic needle core biopsy cases performed for abnormal radiologic findings. If 12 months of accessioned cases were reviewed without identifying 30 qualifying cases, participants stopped the retrospective review and included all cases identified. For each case or specimen, the participants provided detailed information about the radiologic and pathologic findings. Cases with no prebiopsy breast imaging, cases in which the tissue specimens were processed at another institution, surgical excision specimens (incisional biopsy, excisional biopsy/lumpectomy, and mastectomy), cases with previous breast carcinoma, and fine-needle aspiration biopsies were excluded from the study. Correlation, in this study, was defined as pathologic findings providing an acceptable and reasonable explanation for the radiologic findings, for example, calcifications identified in pathology for biopsies performed for radiologic calcifications. The participants were asked to determine, based on their judgment, whether the pathologic finding of each case provided an acceptable explanation for the radiologic finding. We also independently evaluated correlation based on the specific reported radiologic and pathologic findings. In that evaluation, we expanded our definition of correlation to include cases with in situ or invasive ductal lesions without calcifications for biopsies performed either for radiologic calcifications or for masses. The rate of radiologic-pathologic correlation of breast needle core biopsies was therefore determined based on (1) participating pathologists' judgments, and (2) reported radiologic and pathologic findings, judged independently. We also tested these rates for associations with institutional demographic and practice variables and aggregate biopsy case characteristics.
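
The case-level rule applied in our independent evaluation can be illustrated with a brief sketch. The following Python snippet is only a minimal illustration of the expanded definition described above, assuming a simplified, hypothetical set of finding labels and a hypothetical helper function; it is not the coding scheme actually used in the study.

# Minimal sketch of the expanded correlation rule described above.
# The finding labels and the helper function are hypothetical and are
# used only to illustrate the logic, not the study's actual code set.

CALCIFICATION_FINDINGS = {
    "calcifications", "new calcifications",
    "calcifications with a specific pattern",
}
MASS_FINDINGS = {
    "mass", "mass with smooth contours", "spiculated mass", "distortion",
}
# In situ or invasive ductal lesions are accepted as an explanation of
# either calcifications or a mass, even without histologic calcifications.
DUCTAL_LESIONS = {"ductal carcinoma in situ", "invasive carcinoma"}
MASS_FORMING_BENIGN = {
    "fibroadenoma", "papillary lesion", "radial scar", "fibroepithelial lesion",
}


def correlates(radiologic_finding, pathologic_findings):
    """Return True if the pathologic findings acceptably explain the imaging finding."""
    if pathologic_findings & DUCTAL_LESIONS:
        return True
    if radiologic_finding in CALCIFICATION_FINDINGS:
        # Biopsies performed for calcifications should show calcifications.
        return "calcifications" in pathologic_findings
    if radiologic_finding in MASS_FINDINGS:
        # A mass-forming lesion provides an acceptable explanation of a mass.
        return bool(pathologic_findings & MASS_FORMING_BENIGN)
    return False


# A biopsy performed for calcifications that shows only benign tissue
# without calcifications would be flagged as lacking correlation.
print(correlates("calcifications", {"benign breast tissue"}))  # False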

Statistical Analysis

Individual associations between the 2 correlation rates and the demographic and practice variables were investigated with Kruskal-Wallis tests for discrete-valued, independent variables and with regression analysis for the continuous independent variables.

Variables with associations (P < .10) were then included in a forward-selection, multivariate-regression model. A P value of < .05 was considered significant. In addition, the case-level radiologic-pathologic correlation was tested for association with case-specific characteristics by the Pearson [chi square] test.

All analyses were performed with SAS 9.2 statistical software (SAS Institute Inc, Cary, North Carolina).
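
For illustration, the univariate tests described above could be approximated outside of SAS as follows. This Python/SciPy sketch uses invented institution-level rates and biopsy volumes for the Kruskal-Wallis and regression examples (those numbers are assumptions, not study data); the 2 x 2 table for the chi-square example uses the conference-discussion counts reported in Table 1. It is offered only as a rough analogue of the analysis, not the authors' SAS code.

# Rough Python/SciPy analogue of the univariate tests described above.
# The institution-level rates and biopsy volumes below are invented for
# demonstration; only the 2 x 2 table reflects counts reported in Table 1.
import numpy as np
from scipy import stats

# Kruskal-Wallis test for a discrete-valued independent variable, eg,
# institution-level correlation rates grouped by presence of a
# multidisciplinary breast conference (toy numbers).
rates_with_conference = np.array([96.7, 98.3, 100.0, 93.3, 97.0])
rates_without_conference = np.array([90.0, 93.2, 88.5, 95.0])
h_stat, p_kruskal = stats.kruskal(rates_with_conference, rates_without_conference)

# Regression analysis for a continuous independent variable, eg, annual
# breast biopsy volume versus correlation rate (toy numbers).
volumes = np.array([120, 250, 400, 600, 900])
rates = np.array([92.0, 94.5, 96.0, 97.5, 98.0])
slope, intercept, r_value, p_regression, stderr = stats.linregress(volumes, rates)

# Pearson chi-square test for a case-level association: correlated vs not,
# by whether the case was discussed at a multidisciplinary conference
# (counts from Table 1: 351/357 discussed vs 565/606 not discussed).
table = np.array([[351, 6],
                  [565, 41]])
chi2, p_chi2, dof, expected = stats.chi2_contingency(table, correction=False)

print(f"Kruskal-Wallis P = {p_kruskal:.3f}")
print(f"Regression P = {p_regression:.3f}")
print(f"Chi-square P = {p_chi2:.4f}")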

RESULTS

Participants from 48 institutions submitted data for this study. In aggregate, radiologic-pathologic correlation was found in 94.9% (1328 of 1399) of the cases reviewed, based on the participants' judgment. There were no significant differences in correlation rates based on whether surgeons or radiologists performed the biopsy, whether cores with calcifications were identified in any way, or whether radiologic reports and images were reviewed before verification of pathology reports. However, correlation rates differed significantly by whether cases were discussed at interdepartmental, multidisciplinary conferences (P < .001) and by biopsy type (P = .03; see Table 1).

Twenty-one percent (292 of 1391) of the breast needle core biopsies were performed by surgeons. The BIRADS score was not provided in 59% of the cases; when provided, 79.5% (449 of 565) and 12.9% (73 of 565) of the biopsies were performed for BIRADS categories 4 and 5, respectively. Cores with calcifications were identified in some way by the radiologists in 66% (386 of 585) of the cases (Table 2). Table 3 highlights the demographics of the participating institutions.

Of the 47 participants answering, 28 (59.6%) reported having the same turnaround time (TAT) expectation for breast and nonbreast needle biopsy specimens; 54.3% (25 of 46) of the participants had an expected TAT of 2 working days for needle core breast biopsies, whereas 43.5% (20 of 46) had an expected TAT of 1 working day. Among the participating institutions, 74.5% (35 of 47) did not have a designated breast pathologist, 89.4% (42 of 47) had an interdepartmental or multidisciplinary breast conference, and 34.8% (16 of 46) indicated they had a formal mechanism or system in place in the pathology department to ensure radiologic-pathologic correlation. For cases performed for radiologic calcifications with no calcifications identified on histology, 36.2% (17 of 47) of the participants x-rayed the paraffin blocks and cut additional sections only if calcifications were present on the specimen x-ray images. Table 4 highlights other laboratory practices related to needle core breast biopsy cases in the participating institutions.

The correlation rates were not significantly different based on whether the institutions used digital mammography, film-based mammography, or both. Similarly, there was no significant difference in correlation rates based on whether the institution had one or more designated breast pathologists. Having a formal system in place in the pathology department to ensure correlation did not appear to be associated with increased correlation; about two-thirds of the participants did not have such a system in place (see Table 5). For cases determined to have no correlation, the pathology reports indicated the lack of correlation in 53.6% (37 of 69) of the instances and indicated additional steps taken to evaluate the specimen in 53.7% (36 of 67) of the cases (Table 2). The radiologic-pathologic correlation rates based on participating pathologists' judgments and the correlation rates judged independently from the reported radiologic and pathologic findings showed similar levels of significance (Tables 1 and 5). Table 6 highlights the distributions of surgical pathology cases and staffing volumes.

COMMENT

The study evaluated the rate of radiologic-pathologic correlation of specific radiologic lesions (calcifications, masses, or distortions) and observed a high correlation rate between radiologic and pathologic findings overall. Some practices that might be expected to increase correlation rates, such as separating cores with calcifications, reviewing radiologic reports or images before signing out cases, and having a designated breast pathologist(s), were not significantly associated with greater correlation rates in this study. One practice that was significantly associated with a greater correlation rate was having a breast biopsy multidisciplinary conference. Although not all breast biopsy cases can be discussed at multidisciplinary or interdepartmental conferences, the presence of such a conference may alert the pathologist to the need for correlation because of the possibility that the case might be discussed.

Pathologists generally do not need extensive knowledge of breast radiologic imaging to accurately interpret breast biopsies; however, some knowledge of radiologic imaging may help in correlating the pathologic findings with the radiologic findings. In formal interdepartmental or multidisciplinary conferences, the pathologist is exposed to the radiologic findings, the management options, and the sampling difficulties encountered by the radiologists. Pathologists who attend these conferences gain experience and knowledge that help ensure pathologic correlation with radiographic findings.

Being reasonably proficient in the interpretation of breast lesions may not be synonymous with having the designation of breast pathologist. Only a few institutions (25.5%; 12 of 47) in our study reported having a designated breast pathologist. Although that may be due to the participants' practice characteristics, we did not observe a significant difference in the rate of correlation between institutions with or without a designated breast pathologist. Some investigators believe that expert breast pathology assessment is necessary for adequate management of breast carcinoma, (9,10) but others observed no statistical difference in the discrimination scores or the degree of inconsistency between general pathologists and pathologists with this special interest. (11) Many hospitals have a policy to routinely review outside pathology materials before treatment of patients in their institutions (12,13) to ensure diagnostic accuracy and complete pathology reports. We must, however, emphasize that our study was not designed to evaluate pathologic diagnostic accuracy or differences in the diagnostic opinions between designated breast pathologists and nondesignated breast pathologists, nor was it designed to evaluate intradepartmental or extradepartmental consultation practices for borderline or difficult breast needle core biopsy cases. However, possible reasons for the lack of association between having a designated breast pathologist and the correlation rates may include (1) nondesignated breast pathologists being reasonably proficient in interpreting breast lesions, (2) pathologists with interests in breast pathology not having the designation of a breast pathologist, (3) internal consultations among pathologists taking place before a case is verified (as an internal quality assurance measure), and (4) extradepartmental consultations with an expert being sought for difficult cases. Azam and Nakhleh (14) observed that breast specimens were among the specimens most commonly sent for extradepartmental expert consultation. Nakhleh and colleagues (15) also demonstrated that breast biopsies were reviewed internally before sign-out at a greater frequency than other specimens.

The lack of a significant difference in the correlation rate observed for needle core biopsies performed by surgeons versus radiologists, although perhaps unexpected, is not particularly surprising. The American College of Radiology requires that certain minimum qualifications be met by radiologists and nonradiologists (eg, surgeons) who perform stereotactic breast biopsies or ultrasound-guided breast biopsies. (16,17) Therefore, any qualified physician performing image-guided breast biopsies should be certified and reasonably proficient.

Although some believe that radiologic-pathologic correlation rates for needle core biopsies performed for calcifications are greatly enhanced by separating cores with calcifications from those without calcifications, (18) others believe the practice is not particularly useful to pathologists. (19) Separating cores with calcifications appears to be a common practice because 66% (385 of 585) of the cases in this study were reported to have cores with calcifications identified in some way by the radiologist. However, the correlation rates showed no association with whether cores with calcifications were separated. This study was not designed to assess whether this practice improves diagnostic accuracy or reduces the time spent examining specimens; therefore, we cannot comment specifically on the utility of the practice. Nevertheless, whether or not cores with calcifications are separated from those without, the pathologist is expected to carefully examine all of the submitted cores. This may explain the observed lack of difference in correlation with this practice.

The practice of reviewing radiologic reports and images is not common among pathologists. Participants indicated that, when available, the radiologic images and reports were actually reviewed before verification of the pathology reports in 22.3% (313 of 1401) and 47.9% (669 of 1398) of the cases, respectively. Although reviewing the radiologic images or reports is a good practice, there was no indication in this study that the practice significantly improves correlation rates. This may further highlight that a vast knowledge of breast radiology is not necessary to accurately interpret breast core biopsies.

Similarly, no significant difference was found in the correlation rate for digital versus film-based mammography. One might expect digital imaging to perform better, given that digital mammograms are reportedly easier to manipulate for further evaluation, thereby reducing the follow-up procedures and repeat images needed, (20) and digital images are easier to share for consultation. (20) However, there are divergent views on which method has better cancer detection rates. (21-23) Furthermore, some investigators have noted that digital mammography may lead to overtreatment because of greater cancer detection rates (24) and that digital mammograms may actually take longer to read (25) than film-based mammograms. Our study did not evaluate the accuracy of cancer detection with digital versus film-based mammography. Nevertheless, accurate interpretation of breast images is largely dependent on the expertise and experience of the radiologists or surgeons interpreting the images, regardless of the type of mammography used.

Most of the laboratories (65.2%; 30 of 46) had no formal system in place to ensure radiologic-pathologic correlation in the pathology department; when such a mechanism was in place (34.8%; 16 of 46 institutions), the most frequent mechanism was to discuss the incongruent case with the radiologist responsible for the biopsy (56.3%; 9 of 16) (Table 4). Surprisingly, there appeared to be no significant difference in correlation rates based on whether such a system was in place. It is not clear why the lack of a formal system for correlation did not affect the correlation rate; perhaps it is because radiologists perform the correlation, given that the Mammography Quality Standards Act (26) requires them to have a formal system in place. Currently, there is no comparable regulatory requirement for pathologists, and awareness of the system in radiology may be influencing pathologists' behavior regarding active participation in radiologic-pathologic correlation. However, not all needle core breast biopsies are performed by radiologists, not all breast biopsy pathology reports reach the radiologist(s) for correlation with the imaging findings, and perhaps not all radiologists perform radiologic-pathologic correlation. Therefore, it is good practice for pathologists to evaluate breast biopsy specimens for correlation with the stated breast-imaging findings. Of note, for cases the pathologists determined to have no correlation, the pathology report indicated the lack of correlation in 53.6% (37 of 69) of the instances and indicated additional steps taken to evaluate the specimen in 53.7% (36 of 67) of the cases (Table 2). We consider documenting a lack of correlation, and the additional steps taken to address it, in the pathology report to be a good practice. Such documentation may also be useful for risk management if there is a significant delay in diagnosing a malignancy because of inadequate radiologic sampling of the lesion.

This study has some inherent limitations because the data are self-reported. We also did not ask for follow-up information to evaluate whether follow-up breast imaging or surgical excision confirmed the participants' judgment of correlation; as such, we are unable to determine false-negative and false-positive correlation rates. We attempted to mitigate this by also evaluating correlation rates based on specific pathologic findings that may correlate with specific radiologic findings (Table 7); however, the lack of follow-up information limits the conclusions that may be drawn from the data. The sample size may be another limitation of this study. Lastly, most of the data (76.1%; 35 of 46) came from laboratories in institutions with 300 or fewer beds, which may skew the data somewhat toward smaller institutions. In spite of these limitations, we believe the data provide a glimpse into the practice patterns of pathologists from multiple institutions regarding radiologic-pathologic correlation in breast biopsies.

References

(1.) Liberman L, Menell J. Breast imaging reporting and data system (BI-RADS). Radiol Clin North Am. 2002; 40(3):409-430.

(2.) Lazarus E, Mainiero MB, Koelliker SL, Livingston LS. BI-RADS lexicon for US and mammography: interobserver variability and positive predictive value. Radiology. 2006; 239(2):385-391.

(3.) Lehman CD, Rutter CM, Eby PR, White E, Buist DS, Taplin SH. Lesion and patient characteristics associated with malignancy after a probably benign finding on community practice mammography. AJR Am J Roentgenol. 2008; 190(2):511-515.

(4.) Lee HJ, Kim EK, Kim SM, et al. Observer variability of breast imaging reporting and data system (BI-RADS) for breast ultrasound. Eur J Radiol. 2008; 65(2):293-298.

(5.) Kerlikowske K, Grady D, Barclay J, et al. Variability and accuracy in mammographic interpretation using the American College of Radiology breast imaging reporting and data system. J Natl Cancer Inst. 1998; 90(23):1801-1809.

(6.) Lee J-M, Kaplan JB, Murray MP, et al. Imaging-histologic discordance at MRI-guided 9-gauge vacuum-assisted breast biopsy. AJR Am J Roentgenol. 2007; 189(4):852-859.

(7.) Philpotts LE, Hooley RJ, Lee CH. Comparison of automated versus vacuum-assisted biopsy methods for sonographically guided core biopsy of the breast. AJR Am J Roentgenol. 2003; 180(2):347-351.

(8.) Bacher P, Howanitz P. Using Q-Probes to improve the quality of laboratory medicine: a quality improvement program of the College of American Pathologists. Qual Assur Health Care. 1991; 3(3):167-177.

(9.) Rakovitch E, Mihai A, Pignol J-P, et al. Is expert pathology assessment necessary for the management of ductal carcinoma in situ? Breast Cancer Res Treat. 2004; 87(3):265-272.

(10.) Staradub VL, Messenger KA, Hao N, Wiley EL, Morrow M. Changes in breast cancer therapy because of pathology second opinions. Ann Surg Oncol. 2002; 9(10):982-987.

(11.) Parham DM, Anderson N, Buley I, et al. Experts and performance in histopathology--a study in breast pathology. Pathol Res Pract. 2010; 206(11):749-752.

(12.) Kronz JD, Westra WH, Epstein JI. Mandatory second opinion surgical pathology at a large referral center. Cancer. 1999; 86(11):2426-2435.

(13.) Manion E, Cohen MB, Weydert J. Mandatory second opinion in surgical pathology referral material: clinical consequences of major disagreements. Am J Surg Pathol. 2008; 32(5):732-737.

(14.) Azam M, Nakhleh RE. Surgical pathology extradepartmental consultation practices: a College of American Pathologists Q-Probes study of 2746 consultations from 180 laboratories. Arch Pathol Lab Med. 2002; 126(4):405-412.

(15.) Nakhleh RE, Bekeris LG, Souers RJ, Meier FA, Tworek JA. Surgical pathology case reviews before sign-out: a College of American Pathologists Q-Probes study of 45 laboratories. Arch Pathol Lab Med. 2010; 134(5):740-743.

(16.) American College of Radiology. Stereotactic breast biopsy accreditation program requirements. http://www.acr.org/accreditation/stereotactic/stereotactic_breast_reqs.aspx. Accessed June 15, 2011.

(17.) American College of Radiology. Ultrasound accreditation program requirements. http://www.acr.org/accreditation/breast/breast_ultrasound_reqs.aspx. Accessed June 15, 2011.

(18.) Margolin FR, Kaufman L, Jacobs RP, Denny SR, Schrumpf JD. Stereotactic core breast biopsy of malignant calcifications: diagnostic yield of cores with and without calcifications on specimen radiographs. Radiology. 2004; 233(1):251-254.

(19.) Easley S, Abdul-Karim FW, Klein N, Wang N. Segregation of radiographic calcifications in stereotactic core biopsies of breast: is it necessary? Breast J. 2007; 13(5):486-489.

(20.) National Cancer Institute. National Cancer Institute fact sheet: mammograms. http://www.cancer.gov/cancertopics/factsheet/Detection/mammograms#r1. Accessed June 15, 2011.

(21.) Pisano ED, Gastonis C, Hendrick E, et al. Diagnostic performance of digital versus film mammography for breast-cancer screening. N Engl J Med. 2005; 353(17):1773-1783.

(22.) Skaane P, Skjennald A. Screen-film mammography versus full-field digital mammography with soft-copy reading: randomized trial in a population-based screening program--the Oslo II study. Radiology. 2004; 232(1):197-204.

(23.) Skaane P, Hofvind S, Skjennald A. Randomized trial of screen-film versus full-field digital mammography with soft-copy reading in population-based screening program: follow-up and final results of Oslo II study. Radiology. 2007; 244(3):708-717.

(24.) Feeley L, Kiernan D, Mooney T, et al. Digital mammography in a screening programme and its implication for pathology: a comparative study. J Clin Pathol. 2011; 64(3):215-219.

(25.) Haygood TM, Wang J, Atkinson EN, et al. Timed efficiency of interpretation of digital and film-screen screening mammograms. AJR Am J Roentgenol. 2009; 192(1):216-220.

(26.) US Department of Health and Human Services, Food and Drug Administration. Radiation-emitting products: Mammography Quality Standards Act regulations. http://www.fda.gov/Radiation-EmittingProducts/MammographyQualityStandardsActandProgram/Regulations/ucm110906.htm. Accessed June 15, 2011. 42 USC 263b. Fed Regist. 1997; 62(208):55852-55994. To be codified at 21 CFR 16 and 900.

Michael O. Idowu, MD, MPH; Linday Bonner Hardy, MD; Rhona J. Souers, MS; Raouf E. Nakhleh, MD

Accepted for publication July 8, 2011.

From the Department of Pathology, Virginia Commonwealth University Health System, Richmond (Dr Idowu); the Department of Pathology, Beth Israel Deaconess Medical Center, Cambridge, Massachusetts (Dr Hardy); the Department of Statistics/Biostatistics, College of American Pathologists, Northfield, Illinois (Ms Souers); and the Department of Pathology, Mayo Clinic, Jacksonville, Florida (Dr Nakhleh).

The authors have no relevant financial interest in the products or companies described in this article.

Reprints: Michael O. Idowu, MD, MPH, Department of Pathology, Virginia Commonwealth University Health System, 6th Floor Gateway/ MCV Campus, 1200 E Marshall St, PO Box 980662, Richmond, VA 23298 (e-mail: midowu@vcu.edu).
Table 1. Analysis of Aggregate Correlation Rates by Biopsy Case
Characteristics

Classification Correlation Rate Based on
 Individual Judgment

 Cases, Cases P Value
 No. (%) Correlated,
 No. (%)

Who performed the biopsy? n = 1391 n = 1320 (94.9) .57
 Radiologist 1099 (79.0) 1041 (94.7)
 Surgeon 292 (21) 279 (95.5)
Biopsy type n = 1178 n = 1121 (95.2) .03
 Stereotactic 613 (52.0) 593 (96.7)
 Ultrasound-guided 546 (46.3) 510 (93.4)
 MRI-guided 19 (1.6) 18 (94.7)
Main radiologic finding n = 1311 n = 1248 (95.2) .03
 Mass, not otherwise 477 (36.4) 442 (92.7)
 specified
 Calcifications, not
 otherwise specified 301 (23.0) 295 (98.0)
 Calcifications with
 a specific pattern 163 (12.4) 160 (98.2)
 and/or distribution
 Mass with smooth contours 106 (8.1) 100 (94.3)
 Spiculated mass 106 (8.1) 100 (94.3)
 New calcifications 63 (4.8) 60 (95.2)
 Calcifications
 associated with a mass 59 (4.5) 56 (94.9)
 Distortion 36 (2.7) 35 (97.2)
If the main radiologic
 finding was
 calcifications,
 were cores with
 calcifications
 identified in n = 583 n = 567 (97.3) .44
 any way?
 Yes 384 (65.9) 372 (96.9)
 No 199 (34.1) 195 (98.0)
Radiologic images reviewed
 before pathology report
 verification n = 1396 n = 1324 (94.8) .59
 Yes 311 (22.3) 297 (95.5)
 No 1085 (77.7) 1027 (94.7)
Radiologic reports reviewed
 before pathology report
 verification n = 1393 n = 1323 (94.8) .40
 Yes 665 (47.7) 635 (95.5)
 No 728 (52.3) 688 (94.5)
Case discussed at
 interdepartmental,
 multidisciplinary breast
 conference n = 963 n = 916 (95.1) <.001
 Yes 357 (37.1) 351 (98.3)
 No 606 (62.9) 565 (93.2)

Classification Correlation Rate Based on
 Specific Findings

 Cases, No. (%) Cases P
 Correlated, Value
 No. (%)

Who performed the biopsy? n = 1272 (100) n = 1043 (82.0) .63
 Radiologist 1009 (79.3) 830 (82.3)
 Surgeon 263 (20.7) 213 (81.0)
Biopsy type n = 1096 (100) n = 908 (82.8) <.001
 Stereotactic 591 (53.9) 518 (87.6)
 Ultrasound-guided 493 (45.0) 386 (78.3)
 MRI-guided 12 (1.1) 4 (33.3)
Main radiologic finding n = 1278 (100) n = 1049 (82.1) <.001
 Mass, not otherwise 462 (36.2) 336 (72.7)
 specified
 Calcifications, not
 otherwise specified 295 (23.1) 268 (90.8)
 Calcifications with
 a specific pattern 159 (12.4) 151 (95.0)
 and/or distribution
 Mass with smooth contours 103 (8.1) 75 (72.8)
 Spiculated mass 102 (8.0) 94 (92.2)
 New calcifications 65 (5.1) 59 (90.8)
 Calcifications
 associated with a mass 57 (4.5) 44 (77.2)
 Distortion 35 (2.7) 22 (62.9)
If the main radiologic
 finding was
 calcifications,
 were cores with
 calcifications
 identified in n = 562 n = 509 (90.6) .76
 any way?
 Yes 371 (66.0) 335 (90.3)
 No 191 (34.0) 174 (91.1)
Radiologic images reviewed
 before pathology report
 verification n = 1275 n = 1046 (82.0) .98
 Yes 300 (23.5) 246 (82.0)
 No 975 (76.5) 800 (82.1)
Radiologic reports reviewed
 before pathology report
 verification n = 1273 n = 1044 (82.0) .15
 Yes 639 (50.2) 534 (83.6)
 No 634 (49.8) 510 (80.4)
Case discussed at
 interdepartmental,
 multidisciplinary breast
 conference n = 893 n = 730 (81.7) <.001
 Yes 334 (37.4) 304 (91.0)
 No 559 (62.6) 426 (76.2)

Abbreviation: MRI, magnetic resonance imaging.

Table 2. Radiologic Findings

Classification No. (%)

Clinical history includes
radiologic finding, n = 1401

 Yes 1313 (93.7)
 No 88 (6.3)

Clinical history readily available
electronically, n = 1398

 Yes 1123 (80.3)
 No 275 (19.7)

Breast imaging films available
for review, n = 1401

 Yes 1029 (73.4)
 No 372 (26.6)

Radiologic image(s) reviewed before
pathology report verification, n = 1401

 Yes 313 (22.3)
 No 1088 (77.7)

Radiologic report(s) reviewed before
pathology report verification, n = 1398

 Yes 669 (47.9)
 No 729 (52.1)

BIRADS score, (a) n = 565

 0 13 (2.3)
 1 4 (0.7)
 2 8 (1.4)
 3 18 (3.2)
 4 449 (79.5)
 5 73 (12.9)

Biopsy performed by a radiologist
or surgeon, n = 1395

 Radiologist 1102 (79.0)
 Surgeon 293 (21.0)

Biopsy type, (b) n = 1181

 Stereotactic 614 (52.0)
 Ultrasound-guided 548 (46.4)
 MRI-guided 19 (1.6)

Main radiologic finding, n = 1401

 Mass, not otherwise specified 478 (34.1)
 Calcifications not otherwise specified 301 (21.5)
 Calcifications with a specific 163 (11.6)
 pattern and/or distribution
 Mass with smooth contours 106 (7.6)
 Spiculated mass 106 (7.6)
 New calcifications 65 (4.6)
 Calcifications associated with a mass 60 (4.3)
 Option not listed 57 (4.1)
 Distortion 36 (2.6)
 Not provided 29 (2.1)

If the main radiologic finding was
calcifications, were the cores with
calcifications identified in any
way? n = 585

 Yes 386 (66.0)
 No 199 (34.0)

For cases determined to have no
correlation, pathology report
indicated there was no correlation,
n = 69

 Yes 37 (53.6)
 No 32 (46.4)

For findings without correlation,
the pathology report indicated what
additional steps were taken in
evaluating the specimen, n = 67

 Yes 36 (53.7)
 No 31 (46.3)

Abbreviations: BIRADS, Breast Imaging Reporting
and Data System; MRI, magnetic resonance imaging.

(a) No BIRADS score was provided in 806 cases.

(b) For 206 cases, biopsy type was not provided.

Table 3. Institution Demographics

 Institutions,
Classification No. (%)

Institution type, n = 48
 Voluntary, nonprofit hospital 34 (70.8)
 Private, independent laboratory 5 (10.4)
 Proprietary hospital 2 (4.2)
 County hospital 2 (4.2)
 Governmental, nonfederal university hospital 2 (4.2)
 Group practice 1 (2.1)
 Nongovernmental, university hospital 1 (2.1)
 State acute hospital 1 (2.1)
Occupied bed size, n = 46
 0-150 20 (43.5)
 151-300 15 (32.6)
 301-450 5 (10.9)
 451-600 5 (10.9)
 >600 1 (2.2)
Institution location, n = 48
 City 19 (39.6)
 Suburban 15 (31.3)
 Rural 13 (27.1)
 Other 1 (2.1)
Government affiliation, n = 48
 Nongovernmental 43 (89.6)
 Governmental, nonfederal 5 (10.4)

Table 4. Laboratory Practices Related to Needle Core Breast
Biopsy Cases

Classification No. (%)

Institution has one or more designated
breast pathologist(s), n = 47

 Yes 12 (25.5)
 No 35 (74.5)

How does the needle core breast biopsy
specimen TAT expectation compare to TAT
expectations for other nonbreast biopsy
specimens? n = 47

 Same TAT expectation for needle core 28 (59.6)
 breast biopsy specimens and other
 nonbreast biopsy specimens

 Shorter TAT expectation for needle core 17 (36.2)
 breast biopsy specimens

 Longer TAT expectation for needle core 1 (2.1)
 breast biopsy specimens

There is no expected TAT 1 (2.1)

Expected TAT for needle core breast
biopsy specimens, n = 46

 Within 1 working d 20 (43.5)
 Within 2 working d 25 (54.3)
 Within 3 working d 1 (2.2)

How are cases with calcifications on
breast radiologic imaging but no
calcifications identified on histology
initially handled? n = 47

 X-ray the paraffin tissue block(s) and 17 (36.2)
 then cut additional sections if
 calcifications are present on the
 specimen x-ray images

 Cut deeper sections until calcifications 13 (27.7)
 are identified without tissue block(s)
 x-ray

 Level through the tissue block(s) 9 (19.1)
 without tissue block x-ray

 Cut a specific number of deeper levels 2 (4.3)
 and verify case if there is still no
 calcification

 Report the case with no 2 (4.3)
 additional workup

 Other 4 (8.5)

How are cases with mass/distortion on
radiologic imaging with no histology
correlation initially handled? n = 47

 Cut a specific number of deeper levels; 15 (31.9)
 verify case if there is still no
 correlation

 Discuss case with the radiologist 11 (23.4)

 Obtain an intradepartmental consultation 7 (14.9)
 with another pathologist on the
 diagnosis

 Verify the case with no additional 7 (14.9)
 work-up

 Include a comment in the report about 5 (10.6)
 noncongruency

 Other 2 (4.3)

Institution has interdepartmental breast
conference (eg, tumor board, radiologic
pathology conference), n = 47

 Yes 42 (89.4)
 No 5 (10.6)

There is a formal mechanism-system in
place to ensure radiologic-pathologic
correlation in the pathology department,
n = 46

 Yes 16 (34.8)
 No 30 (65.2)

Most frequent mechanism in place to
ensure radiologic-pathologic
correlation, n = 16

 Incongruent cases are discussed with the 9 (56.3)
 radiologist responsible for the biopsy

 Incongruent cases are discussed at a 3 (18.8)
 correlation conference

 All cases are discussed at a correlation 2 (12.5)
 conference

 Excision or rebiopsy is recommended for 1 (6.3)
 all incongruent cases

 Random cases are discussed at a 1 (6.3)
 correlation conference

Abbreviation: TAT, turnaround time.

Table 5. Distributions of Correlation Rates (Individual
Judgment and Specific Findings) by Laboratory Practices

 All Institutions
 Percentiles

Classification No. 10th 50th 90th

Institution has 1 or more
 designated breast
 pathologists, n = 47
 Judgment (P = .54)
 Yes 12 83.3 95.0 100.0
 No 35 86.7 96.7 100.0
 Specific pathologic findings
 (P = .20), n = 47
 Yes 12 71.4 80.4 89.7
 No 35 66.7 85.7 96.6
Institution has digital or
 film-based mammography, n = 45
 Judgment (P = .66)
 Digital mammography 25 86.7 96.7 100.0
 Film-based mammography 2 93.3 95.0 96.7
 Both digital and film-based 18 83.3 98.4 100.0
 Specific pathologic findings
 (P = .45)
 Digital mammography 25 67.9 83.3 96.2
 Film-based mammography 2 68.8 75.2 81.5
 Both digital and film-based 18 70.0 85.7 96.6
Formal mechanism/system in place
 to ensure radiologic-
 pathologic correlation
 in the pathology department,
 n = 46
 Judgment (P = .17)
 Yes 16 86.7 100.0 100.0
 No 30 85.0 96.7 100.0
 Specific pathologic findings
 (P = .27)
 Yes 16 65.5 89.5 96.7
 No 30 68.4 83.1 92.9

Table 6. Distributions of Surgical Pathology Cases and Staffing
Volumes

 All Institutions
 Percentiles
Classification Institutions
 Answering, 10th 25th
 No.

No. of surgical 47 5273 7686
 pathology cases
 accessioned in 2009
No. of breast needle 47 74 121
 core biopsies
 performed in 2009
No. of breast specimens 47 227 331
 accessioned in 2009
No. of pathologists who 47 2 3
 sign out surgical
 pathology cases
No. of pathologists who 47 2 3
 sign out breast cases

Classification All Institutions Percentiles

 50th: 75th 90th
 Median

No. of surgical 11 500 21 363 42 348
 pathology cases
 accessioned in 2009
No. of breast needle 226 500 899
 core biopsies
 performed in 2009
No. of breast specimens 599 1321 2553
 accessioned in 2009
No. of pathologists who 5 7 16
 sign out surgical
 pathology cases
No. of pathologists who 4 6 9
 sign out breast cases

Table 7. Correlation Between Pathologic and Breast Imaging
Findings Independently Based on Reported Radiologic and
Pathologic Findings

 Main Pathologic Finding Codes
Main Radiologic Finding (That Correlate With the
 Radiologic Findings)

Calcifications, not 21, 23, 24, 28, 30
otherwise specified (calcifications must be
 identified unless there
 is cancer), 32, 34

New calcifications 21, 23, 24, 28, 30
 (calcifications must be
 identified unless there
 is cancer), 32, 34

Calcifications with a specific 21, 23, 24, 28, 30
pattern and/or distribution (calcifications must be
(eg, punctate, pleomorphic, identified unless there
linear, segmental, linear is cancer), 32, 34
with linear distribution) ("linear calcifications
 with linear
 distribution" is often
 DCIS)

Calcifications associated 23, 32, 34, 37, 39, 40, 41
with a mass

Mass, not otherwise specified 34, 36, 37, 38, 39, 40,
 41, 42, 43, 44

Spiculated mass 33 (fat necrosis may
 rarely present as a
 spiculated mass), 34,
 37, 40, 41

Mass with smooth contours 34, 38, 39, 40, 41, 42,
 43, 44

Distortion 34, 37, 38, 39, 40, 41

Key: Main Pathologic Finding Codes

21 Benign breast tissue (including
 columnar cell change without
 atypia) with calcification
 (calcium oxalate/phosphate)

23 Fibroadenoma/fibroadenomatoid
 change(s); dense stromal fibrosis
 with calcifications

24 Columnar cell hyperplasia with
 atypia or FEA with calcifications

28 ADH/DCIS with calcifications

30 ALH/LCIS with calcifications

32 Invasive carcinoma

33 Other: benign

34 Other: malignant

36 Dense stromal fibrosis, PASH

37 Radial scar/complex sclerosing
 lesions

38 Fibroepithelial lesion
 (fibroadenoma, phyllodes)

39 Papillary lesions

40 In situ carcinoma (DCIS, LCIS)

41 Invasive carcinoma

42 Lymphoma

43 Sarcoma

44 Metastatic malignancy

Abbreviations: ADH, atypical ductal hyperplasia; ALH, atypical
lobular hyperplasia; DCIS, ductal carcinoma in situ; FEA, flat
epithelial atypia; LCIS, lobular carcinoma in situ; PASH,
pseudoangiomatous stromal hyperplasia.
