The Accreditation Council for Graduate Medical Education (ACGME) is responsible for the oversight of pathology residency and fellowship programs in the United States. Residents must have successfully completed their training in an ACGME-accredited or Royal College of Physicians and Surgeons of Canada-accredited residency to be eligible to take the American Board of Pathology (ABP) examinations.[1] For subspecialties like cytopathology, only ACGME-accredited fellowships are accepted. The ACGME is undergoing a major restructuring, shifting to continuous accreditation and outcomes assessment under the “Next Accreditation System” (NAS).[2] Under the NAS, the interval between site visits will be extended to 10 years for individual pathology residency and fellowship programs that are performing well. The episodic “biopsy” model of accreditation will be replaced by annual data collection, and program information forms will be replaced by self-studies. Each ACGME review committee will monitor program data and trends and will determine whether a program needs to submit additional data or take action. The annual data will include resident and faculty surveys, case log information, scholarly activity, and educational milestones data.

The Milestones project is the keystone of the NAS, requiring discernible developmental stages in attaining proficiency at established times during training. The educational milestones are based on the six ACGME competencies and are planned in a systematically progressive framework.[3] Milestones for the pathology resident are now posted in draft form on the ACGME website,[3] and those for cytopathology fellows are in draft form but not published at this time. Each milestone is arranged in numbered levels, with level 1 representing most fellows at the beginning of training, level 4 representing the target for the graduating fellow, and level 5 being aspirational or representative of a cytopathologist who has been in practice for several years. The fellow interpretive and diagnostic knowledge milestones at level 4 will most likely be worded to state that fellows accurately diagnose most cytology samples. The revised ACGME program requirements for cytopathology, effective July 1, 2013, specifically state that cytopathology fellows must demonstrate diagnostic proficiency and that they must demonstrate competence in the application of additional diagnostic adjuncts, including molecular testing.[4] Fellows must evaluate at least 2000 cytology specimens, including 500 gynecologic specimens; these may include cases interpreted by the fellows in addition to study sets and shared cases. Fellows must also demonstrate competence in performing and immediately assessing fine-needle aspiration specimens, and the requirements specifically recommend that programs monitor the degree of agreement between immediate evaluation and final diagnosis.[4] The ACGME will likely require that fellows maintain, and program directors monitor, case log data for fine-needle aspiration procedures performed by the fellow. However, additional types of logs, including case assessment and agreement logs, may be useful in evaluating competencies and milestones.
Each program will be required to establish a clinical competency committee, which must provide objective assessments of competencies based on the cytopathology fellowship milestones, and to report these data to the ACGME on a semiannual basis.

The criteria proposed by Chebib et al in this issue of Cancer Cytopathology form a solid core set suitable for use as an assessment toolbox in cytopathology education.[5] These criteria are potentially robust indicators of a fellow's proficiency, because they are based on well-defined national benchmarks and evidence-based data. Implementation of these objective assessment methods across programs could support the establishment of a database of programs' performance. In their article, the authors discuss the use of gynecologic cytology reporting rates and interobserver variability to assess and monitor cytopathology fellow performance. Specifically, the ratio of atypical squamous cells (ASC) to squamous intraepithelial lesions (SIL) (the ASC/SIL ratio) and the human papillomavirus (HPV)-positive rates in various interpretive categories were calculated and compared with faculty ratios and rates.[5] The data were collected and calculated for 5 consecutive fellows and then compared with faculty data. Although substantial agreement between fellow and faculty interpretations was demonstrated, the agreement was best for those cases interpreted as atypical squamous cells of undetermined significance (ASC-US) by the fellows. It is noteworthy that the ASC/SIL reporting rates tended to be lower for fellows than for faculty, and 33.2% of cases interpreted as negative by the fellow and ASC-US by the faculty were positive for HPV. Other potentially useful indicators of trainees' performance, also highlighted by Chebib et al, are the high risk-HPV (hr-HPV)-positive rates for cases categorized as negative for intraepithelial lesion or malignancy (the NILM/hr-HPV rate; expected, 4%-8%) and as low-grade SIL (the LSIL/hr-HPV rate; expected, >80%-90%).[5]

There are 2 potential sources of bias in the cytology workflow that could impair the objective performance assessment of fellows. One source of bias is the ability to obtain the results of HPV status at the time of rendering an interpretation. The other is the built-in influence of the cytotechnologist's initial evaluation. Efforts should be made to specifically circumvent these potential sources of bias. A likely solution is to use anatomic pathology information systems that allow the documentation of the fellow's interpretation before the case continues along its traditional screening path. In addition, these information systems have the capability to collect and analyze these statistics. The longitudinal case-based evaluation of diagnostic competency model, as described by Ducatman and Ducatman, offers a valuable method with which to circumvent the biases intrinsic to our specialty.[6] This model's objectives, among others, are to increase systems accountability, decrease differences in practice, and teach residents and faculty to use an evidence-based method of practice-based learning. The effectiveness of this model rests on several conditions, such as the resident's review of a significant number of cases, consistent and fair faculty grading of each case, and stable grading criteria.

Studies like these are critical to the literature, because they provide program directors with ideas about how best to implement the new ACGME program requirements and assess both resident and fellow milestones through the use of objective methods. Cytopathology fellowship programs may face challenges in collecting appropriate data, because some laboratory computer systems may not currently be capturing fellow interpretations or may not provide a mechanism to quantify data. Some programs may choose to develop manual logs that each fellow can maintain either continuously or at certain time points. We suggest that programs make efforts to collect and quantify some data before the semiannual evaluation of milestones. Such data will provide both fellows and program directors with objective measures of where improvements may be needed. To use the ASC/SIL ratio to monitor fellow performance effectively, the faculty members also need to have relatively stable reporting rates and high interobserver reproducibility. This may be a challenge for programs with many busy faculty members who have numerous clinical and teaching responsibilities. Case reviews by both faculty and fellows at a multiheaded microscope are useful. Cytologic-histologic correlation is another method to hone interpretive skills and improve interobserver reproducibility. Finally, blinded review of gynecologic cytology slides through both intralaboratory and interlaboratory programs can help laboratories assess the accuracy of their screening and interpretive processes.

The ASC/SIL ratio is a measure of the tendency of a cytologist to make an ambiguous diagnosis relative to a diagnosis of certainty.[7] The ASC/SIL ratio can vary according to patient population. An elevated ASC/SIL ratio may indicate a greater readiness to classify minor atypia as ASC-US. However, it also may indicate a reluctance to make a definite interpretation of low-grade or high-grade SIL. In the experience of the authors, novice cytopathologists often have higher ASC/SIL ratios than those with several years' experience. Although there is always a tendency by pathologists to call cases atypical if there is uncertainty, junior pathologists may encounter uncertainty more frequently as they attempt to protect themselves from adverse patient care outcomes and avoid false-negative cytology results. The finding by Chebib et al of lower ASC/SIL ratios for fellows compared with faculty is somewhat surprising.[5] The average fellow ASC/SIL ratio of 1.15 is lower than the average ratio observed in questionnaires compiled by the College of American Pathologists.[8] This is probably because of the bias introduced by the cytotechnologist's initial interpretation. Regrettably, adequate data were not collected to enable an assessment of the changes in interpretation from primary screener, to fellow, to faculty pathologist. A low ASC/SIL ratio also could represent the overcalling of LSIL or the undercalling of ASC-US because of the fellow's lack of sign-out responsibility. Are the fellows attempting to simulate the practice setting in which they are ultimately responsible for patient care, or are they relying on the faculty pathologist to set the appropriate cutoff of sensitivity for the laboratory?
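For programs that wish to track this indicator computationally, the calculation itself is simple. The following Python sketch shows one way to derive an observer's ASC/SIL ratio from tallies of Bethesda-category interpretations; all counts below are hypothetical and are chosen for illustration only, not drawn from the study data:

```python
def asc_sil_ratio(counts):
    """Compute the ASC/SIL ratio from a dict of interpretation counts.

    `counts` maps Bethesda categories (e.g., "ASC-US", "ASC-H",
    "LSIL", "HSIL") to the number of cases the observer assigned
    to each category over the monitoring period.
    """
    asc = counts.get("ASC-US", 0) + counts.get("ASC-H", 0)
    sil = counts.get("LSIL", 0) + counts.get("HSIL", 0)
    if sil == 0:
        raise ValueError("no SIL cases; ASC/SIL ratio is undefined")
    return asc / sil

# Hypothetical fellow and faculty tallies, for illustration only.
fellow = {"ASC-US": 90, "ASC-H": 10, "LSIL": 70, "HSIL": 17}
faculty = {"ASC-US": 130, "ASC-H": 15, "LSIL": 80, "HSIL": 20}
print(round(asc_sil_ratio(fellow), 2))   # fellow ratio
print(round(asc_sil_ratio(faculty), 2))  # faculty ratio
```

Computed over defined intervals (e.g., quarterly), such tallies would let a program director compare a fellow's ratio against the faculty baseline and against published benchmarks.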

Some studies have demonstrated a valuable correlation between the ASC/SIL ratio and the hr-HPV-positive/ASC-US rate, allowing for useful advice to individual cytologists on how to improve the quality of cytologic interpretations.[7] These 2 indicators measure distinct aspects of performance. The hr-HPV-positive/ASC-US rate is an objective measure of the risk of dysplasia. The ASCUS-LSIL Triage Study (ALTS) documented an hr-HPV-positive/ASC-US rate of 50.6% and considered this indicator a benchmark for the performance assessment of cytology laboratories.[9] The hr-HPV-positive/ASC-US rate for fellows averaged 46.4%, which is similar to the ALTS data but higher than data from many community laboratory settings, which have a larger spread in patient age. The findings in the study by Chebib et al may have been biased by data from early in the fellowship year, when fellows lack the diagnostic competence and knowledge to set an optimal negative/atypical threshold. Monitoring the ASC/SIL ratio and the hr-HPV-positive/ASC-US rate at different times during the fellowship year may help address this question.
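A program automating these monitors could likewise compute the hr-HPV-positive rate for a given interpretive category and flag it against a published benchmark. The Python sketch below is illustrative only: the case counts are hypothetical, and the 40%-60% acceptance window around the ALTS benchmark is our assumption, not an established threshold:

```python
def hpv_positive_rate(hpv_positive, total):
    """Fraction of cases in an interpretive category that test hr-HPV positive."""
    if total == 0:
        raise ValueError("no cases in this category")
    return hpv_positive / total

def within_benchmark(rate, low, high):
    """Flag whether an observed rate falls inside a benchmark range."""
    return low <= rate <= high

# Hypothetical counts: 93 hr-HPV-positive results among 200 ASC-US cases.
ascus_rate = hpv_positive_rate(93, 200)
print(within_benchmark(ascus_rate, 0.40, 0.60))  # checked against ~50.6% ALTS benchmark
```

The same two functions could be reused for the NILM/hr-HPV (expected, 4%-8%) and LSIL/hr-HPV (expected, >80%-90%) indicators by substituting the corresponding counts and ranges.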

Of greater concern is the possibility that fellows in many programs either are not able to, or choose not to, exercise a graduated responsibility role. Allen recently published a commentary on this topic; we consider this a “must-read” article for any residency or fellowship program director.[10] According to core ACGME program requirements, residents and fellows must be granted progressive responsibility and authority.[4] Cytopathology fellows should exercise graduated responsibility, including reaching an independent diagnosis, with appropriate indirect and direct supervision. If fellows do not believe that they are making the tough interpretive calls during their fellowship programs, then they will be hampered in adjusting effectively to independent practice.[10] Program directors should insist that fellows make decisions on cases and query them regarding the impact of their decisions on patient care and safety. Fellows who have a tendency to undercall cases compared with faculty either could be dangerous to the public or may not be taking graduated responsibility seriously. Thus, as faculty supervisors, we need to attempt to discern between these 2 possibilities and counsel the fellow appropriately. For cervical cytology specimens specifically, the screening interval is now less frequent, and patient safety is an important consideration. HPV cotesting provides an additional safety net when performed in appropriate age groups but will not be applicable to all patients.

High-performing individual residency and fellowship programs will have less frequent site visits under the NAS, but sponsoring institutions will have Clinical Learning Environment Review (CLER) site visits more frequently.[11] The ACGME has established the CLER program as a key component of the NAS with the goal of promoting safety and quality of care by focusing on 6 areas essential to the safety and quality of care that residents will provide throughout a lifetime of practice after the completion of their training.[12] The 6 areas of resident assessment are patient safety; quality improvement and care transitions; resident supervision; duty hour oversight; fatigue management; and professionalism. The rationale of CLER is to produce national metrics on residency and fellowship programs and institutional characteristics that will have a beneficial and valuable effect on quality of care and patient safety, not only within the educational setting but also after graduation.[2] Residents and fellows will need to be integrated into each institution's patient safety programs and quality-improvement projects. Pathologists, and cytopathologists in particular, have more experience with quality improvement than those in most other specialties. Our fellowship programs can demonstrate achievement of milestones through specific quality-improvement monitors and quality-improvement projects, such as the study by Chebib et al.[5] These efforts will help our laboratories and sponsoring institutions achieve success during the CLER visits.

All educational assessment toolboxes must fulfill the requirements of a high-quality evaluation process: reliability, validity, ease of use, reasonable resource demands, ease of interpretation, and educational impact.[13] These assessment tools should be field-test validated to account for variations in local context and implementation processes, factors that can limit accuracy and comparability. Any assessment toolbox needs to incorporate and define the specific criteria and meaningful outcomes for the achievement of competency at precise, established intervals as fellows progress through their training, following the recommendations of the Milestones project. Specific thresholds or acceptable targets should be clearly demarcated for each indicator and for each to-be-defined developmental step. Objective outcomes, with clearly defined time targets during the fellowship that indicate expected results, are needed to assess educational impact. Local and national assessment benchmarks will need to be generated through the active collaboration of members of the cytology community through their respective national and international associations. An effort should be made to generate reproducible data among the various groups of observers and institutions. This diverse information could be pooled and analyzed over time to allow for the establishment of national databases of individuals' and programs' performance. The ultimate goal is to align specific outcomes with objective assessments, incorporating meaningful feedback to draw valid inferences about the learning process. The outcome of this competency-based training process should be proficient cytopathologists who are well equipped for independent practice, protecting the health of the public by ensuring quality and reducing medical error.[14]

FUNDING SUPPORT


No specific funding was disclosed.

CONFLICT OF INTEREST DISCLOSURES


Dianne Davey is an unpaid volunteer member of the Pathology Residency Review Committee for the Accreditation Council for Graduate Medical Education (ACGME).

REFERENCES

  1. American Board of Pathology. Requirements for subspecialty certification. American Board of Pathology website. Available at: http://www.abpath.org/BofISubspecialtyCert.htm. Accessed May 1, 2013.
  2. Nasca TJ, Philibert I, Brigham T, Flynn TC. The Next GME Accreditation System—rationale and benefits. N Engl J Med. 2012;366:1051-1056.
  3. Naritoku WY, Alexander CB, Bennett BD, et al. The Pathology Milestone Project, April 2013 Draft. Accreditation Council for Graduate Medical Education Next Accreditation System website. Available at: http://www.acgme-nas.org/milestones.html. Accessed May 1, 2013.
  4. Accreditation Council for Graduate Medical Education (ACGME). ACGME Program Requirements for Graduate Medical Education in Cytopathology, effective July 1, 2013. Available at: http://www.acgme.org/acgmeweb/ProgramandInstitutionalGuidelines/Hospital-BasedAccreditation/Pathology.aspx. Accessed May 1, 2013.
  5. Chebib I, Rao R, Wilbur D, Tambouret R. Using the ASC:SIL ratio, human papillomavirus, and interobserver variability to assess and monitor cytopathology fellow training performance. Cancer Cytopathol. 2013;119:638-643.
  6. Ducatman BS, Ducatman AM. Longitudinal case-based evaluation of diagnostic competency among pathology residents: a statistical approach. Arch Pathol Lab Med. 2006;130:188-193.
  7. Cibas ES, Zou KH, Crum C, Kuo F. Using the rate of positive high-risk HPV test results for ASC-US together with the ASC-US/SIL ratio in evaluating the performance of cytopathologists. Am J Clin Pathol. 2008;129:97-101.
  8. Eversole GM, Moriarty AT, Schwartz MR, et al. Practices of participants in the College of American Pathologists interlaboratory comparison program in cervicovaginal cytology, 2006. Arch Pathol Lab Med. 2010;134:331-335.
  9. Solomon D, Schiffman M, Tarone R. Comparison of 3 management strategies for patients with atypical squamous cells of undetermined significance: baseline results from a randomized trial. J Natl Cancer Inst. 2001;93:293-299.
  10. Allen TC. Graduated responsibility for pathology residents: no time for half measures. Arch Pathol Lab Med. 2013;137:457-460.
  11. Weiss KB, Bagian JP, Nasca TJ. The clinical learning environment: the foundation of graduate medical education. JAMA. 2013;309:1687-1688.
  12. Weiss KB, Wagner R, Nasca T. Development, testing, and implementation of the ACGME Clinical Learning Environment Review (CLER) program. J Grad Med Educ. 2012;4:396-398.
  13. Swing SR, Clyman SG, Holmboe ES, Williams RG. Advancing resident assessment in graduate medical education. J Grad Med Educ. 2009;1:278-286.
  14. Brennan TA. Physicians' professional responsibility to improve the quality of care. Acad Med. 2002;77:973-980.