Address correspondence to Lisa G. Criscione-Schreiber, MD, Box 3490 DUMC, Durham, NC 27710. E-mail: email@example.com.
Accreditation Council for Graduate Medical Education program requirements mandate that rheumatology training programs have written goals, objectives, and performance evaluations for each learning activity. Since learning activities are similar across rheumatology programs, we aimed to create competency-based goals and objectives (CBGO) and evaluations that would be generalizable nationally.
Through an established collaboration of the directors of the 4 rheumatology training programs in North Carolina and South Carolina, we composed CBGO and evaluations for each learning activity in rheumatology training programs. CBGO and linked evaluations were written using appropriate verbs based on Bloom's taxonomy. Draft documents were peer reviewed by faculty at the 4 institutions and by members of the American College of Rheumatology (ACR) Clinician Scholar Educator Group.
We completed templates of CBGO for core and elective rotations and conferences. Templates detail progressive fellow performance improvement appropriate to educational level. Specific CBGO are mirrored in learning activity evaluations. Templates are easily modified to fit individual program attributes, have been successfully implemented by our 4 programs, and have proven their value in 4 residency review committee reviews.
We propose adoption of these template CBGO by the ACR, with access available to all rheumatology training programs. Evaluation forms that exactly reflect stated objectives ensure that trainees are assessed using standardized measures and that trainees are aware of the learning expectations. The objectives mirrored in the evaluations closely align with the proposed milestones for internal medicine training, and will therefore be a useful starting point for creating these milestones in rheumatology.
Since initiation of the Outcome Project, the Accreditation Council for Graduate Medical Education (ACGME) has required competency-based goals and objectives (CBGO) that “document progressive fellow performance improvement appropriate to educational level” for each educational activity. Further, programs must “provide objective assessments of competence in patient care, medical knowledge, practice-based learning and improvement, interpersonal and communication skills, professionalism, and systems-based practice,” the 6 domains of clinical competency introduced in 1999.
The American College of Rheumatology (ACR) Committee on Training and Workforce developed a competency-based curriculum in 2006, which remains posted online. This useful curriculum outlines the specific knowledge areas trainees are expected to achieve to gain competency in rheumatology by graduation from a 2-year ACGME-accredited program. The current ACR curriculum was useful and informative for phase 1 of the Outcome Project, in which competencies were initially defined. However, it is not broken down by learning activity and is not easily modifiable by programs to meet the requirement of documenting increasing capabilities as trainees complete required and elective learning activities. Phase 2 of the Outcome Project mandates robust evaluation systems to ensure that competencies are being met. Phases 3 and 4 of the Outcome Project shift the focus to the reporting of aggregate program outcomes, which requires documentation that certain competencies, or milestones, are achieved by each trainee before program graduation.
Since every rheumatology training program shares the common goal of training physicians to become competent to practice rheumatology independently and without supervision, a common set of goals and objectives with linked evaluations will be a helpful addition to current educational resources. There is a precedent for this practice in other fields, with several published common curricula, goals, and objectives [3-9]. Some of these curricula are limited to particular fields or practices within a general residency [3, 4]. The Academic Pediatric Association has been a leader in creating collaborative educational guidelines for pediatric residency training.
In 2004, the Carolinas Fellows Collaborative (CFC) was formed through the efforts of rheumatology program directors from Duke University, the Medical University of South Carolina, the University of North Carolina at Chapel Hill, and Wake Forest University. The collaboration was initially started to centralize provision of an intensive introductory rheumatology education experience for all of the rheumatology fellows in the 2 states, and has grown to include 2 educational conferences annually since the 2005–2006 academic year: the summer introductory course and a topic-focused, more advanced winter educational meeting. Given our established collaboration, when the 4 programs were approaching ACGME residency review committee (RRC) reviews, we undertook to collaboratively author CBGO with linked evaluations for each educational activity in the rheumatology training programs. Our rationale for initiating this project was that learning activities are generally similar across rheumatology training programs. We believed that producing goals and objectives that were comprehensive, peer reviewed, and modifiable would benefit all national rheumatology training programs.
Significance & Innovations
Competency-based goals and objectives are an Accreditation Council for Graduate Medical Education requirement for every accredited rheumatology training program. These goals and objectives will be shared and can be used by every accredited rheumatology program to improve education.
These documents have been in use for 3 academic years at 4 institutions and are easily modifiable to meet individual program needs.
Collaboratively writing these goals and objectives was an innovative project of the Carolinas Fellows Collaborative that fills a national educational need.
These goals and objectives will form a useful framework for writing rheumatology milestones, which will take place over the next year.
MATERIALS AND METHODS
At the time we initiated this project in 2009, the authors had a collective 34 years of experience in a program director or assistant program director role (range 2–14 years). All of the authors had prior experience in developing rheumatology curricular goals and objectives on the local and/or national levels for learners at the medical student, internal medicine resident, and/or rheumatology fellow levels. The diversity of our individual experiences and self-reported strengths, as well as unique institutional clinical rotations, contributed to the creation of documents that span a wide breadth of potential learning activities in a rheumatology training program.
Composing goals and objectives
The composition of CBGO was accomplished in 3 separate retreats. Prior to the first of 3 face-to-face meetings, we devised by e-mail a list of common activities for which we would write goals and objectives. Each of the 4 authors arrived at the first meeting having independently composed goals and objectives for 3 or 4 unique learning activities without the use of any templates aside from the 6 core competencies (patient care, medical knowledge, practice-based learning and improvement, interpersonal and communication skills, professionalism, and systems-based practice) and other available references for writing curricula [2, 10]. Our method most resembled a modified Delphi method, since we discussed ideas and arrived at internal consensus for each goal or objective that was written. We began with the most ubiquitous and important activity, the rheumatology continuity clinic. Projecting the prewritten document onto a large screen as a starting point, we collaboratively determined a format and discussed and edited the goals and objectives in real time, using action verbs based on Bloom's original taxonomy and its revisions to classify levels of intellectual behavior important in learning. Three separate meetings were required to complete CBGO for all learning activities. At the first retreat, we decided that evaluations would use the exact goals and objectives for each learning activity. To further emphasize their importance, the learning objectives were written as evaluable knowledge, skills, and behaviors. Rather than the traditional 9-point ACGME grading scale, we decided on a 3-point scale describing whether the learner met the stated objectives (met an insufficient number, met a sufficient number, or met all or nearly all) for an individual learning activity. We chose this scale for simplicity, agreeing that the core purpose of the rating scale is to assess whether the trainee met competency or not.
This 3-point scale answers the competency question while leaving room to recognize truly outstanding performance. Using this scale, the evaluation instrument for each learning activity asks for a score for each of the 6 core competencies, and the identification of those learning objectives requiring more attention by the learner.
We compiled the goals and objectives into a single comprehensive document, which was first reviewed by faculty at the 4 institutions. Once revisions were made, the summary document was submitted to 6 members of the Clinician Scholar Educator Group of the ACR, who responded to our request to the group seeking volunteers for outside peer review. The reviewers were rheumatology educators outside of North Carolina and South Carolina, 4 of whom were also either program directors or associate program directors of their respective fellowship training programs.
Implementation and continuous quality improvement
Following edits based on comments from the external peer reviewers, our 4 programs implemented the full set of revised documents for the academic year 2009–2010. Due to expanding curricular offerings, additional learning activity documents, each with unique CBGO and linked evaluations, were created. These additional documents were discussed, edited, and agreed upon by all 4 authors by e-mail without additional outside review beyond ad hoc review from faculty at the 4 institutions.
During 3 face-to-face meetings, including 2 working retreats, followed by work during the implementation phase, we collaboratively composed goals and objectives and their linked evaluations for a total of 23 learning activities (the complete goals and objectives document is found in the Appendix, available online at http://rheumatology.medicine.duke.edu/files/documents/Competency_Based_Goals_and_Objectives_for_Rheumatology_Training_Programs.pdf) (Table 1). Each document begins with a rotation director, location, level of supervision (direct, indirect, or oversight), and overarching goals for the activity during years 1 and 2 (if relevant). Goals and objectives documents are organized by competency, and include specific objectives under each of the 6 ACGME core competencies (Table 2 and Figure 1). Each learning activity has a linked evaluation (Figure 2). Additionally, we included a table at the end of each CBGO document listing how the objectives of that competency are met, whether through supervised clinical experience, didactics, self-directed learning, or demonstrations. Consistent with its intended use, we took the table provided by the previously published ACR curriculum and customized it for each learning activity; programs that use these CBGO can include and modify this table locally. Finally, each document includes a summary of possible tools used for assessment in that learning activity, which builds upon tools listed in the ACR curriculum and the ACGME/American Board of Medical Specialties Toolbox of Assessment Methods; programs may modify this listing as appropriate for their institution.
Table 1. Learning activities for which goals and objectives were written at the initial 2 CFC retreatsa
Table 2. Learning objectives over the 2-year training perioda
aFor objectives that change over time with progressive experience, wording can be easily changed using principles of Bloom's taxonomy for levels of intellectual behavior. To show progression, the 2-year training period is broken into four 6-month segments, with differences in brackets. For all learning objectives, these post–learning activity evaluations are only one of many ways to assess mastery of objectives.
[Read about] the biochemical mechanisms of action, conditions in which they may be useful, and the potential side effects of the pharmacologic agents used to treat outpatient musculoskeletal and rheumatic diseases
[Describe] the biochemical mechanisms of action, conditions in which they may be useful, and the potential side effects of the pharmacologic agents used to treat outpatient musculoskeletal and rheumatic diseases
[Illustrate the similarities and differences between] the biochemical mechanisms of action, conditions in which they may be useful, and the potential side effects of the pharmacologic agents used to treat outpatient musculoskeletal and rheumatic diseases
[Teach] the biochemical mechanisms of action, conditions in which they may be useful, and the potential side effects of the pharmacologic agents used to treat outpatient musculoskeletal and rheumatic diseases
The draft set of goals and objectives documents and accompanying evaluation forms were reviewed by 6 members of the ACR Clinician Scholar Educator Group. A prominent concern cited was the need to modify the goals and objectives for core rotations to demonstrate progressive responsibility over time, which our initial documents did not do. Therefore, to better meet the ACGME requirement that CBGO document progressive achievement of competence, for any learning activity spanning 2 years (e.g., rheumatology continuity clinic), the document was modified to reflect more advanced objectives as the fellow progresses through training. After this modification, documents were circulated, reviewed, and edited as above.
The peer review also provided suggestions for additional activities to include and for increasing the specificity of some goals and objectives (e.g., making the objectives of the private practice clinic learning activity more distinct from those of the adult outpatient continuity clinic), and identified a few specific objectives that had been left out, which were added.
The 4 CFC rheumatology programs have now used these CBGO documents for 3 academic years. We have each locally modified goals, objectives, and evaluations to fit individual training programs. We added a line to indicate supervision for each activity (modified locally), which clarifies progressive responsibility. The summary goals and objectives have now been used in successful ACGME reaccreditation reviews of each of the 4 programs, with one program receiving an ACGME letter of commendation in its review. The evaluations are equally simple to revise, since they precisely reflect the learning objectives under each competency; therefore, fellows are rated in each competency as a whole, and then according to whether specific objectives are met. Over 3 years, our consensus is that these CBGO and linked evaluations, used as formative evaluation tools, have enhanced fellow education by providing fellows and faculty very clear performance expectations and feedback. The clear guidelines on which to assess fellow performance led to more meaningful feedback on opportunities for trainee growth. Over the course of a 2-year training period, progress in meeting individual objectives, which are modified to reflect increased cognitive skills over time, can be monitored using these evaluations. For example, for the 2-year longitudinal rheumatology outpatient continuity clinic, individual goals can be modified by training period (Table 2) to require demonstration of skill mastery over time. If evaluations become more challenging at 6-month intervals, a fellow may “exceed expectations” initially by performing comprehensive rheumatologic physical examinations, but if 6 months later that fellow routinely requires assistance to detect subtle synovitis, the fellow would no longer be meeting performance expectations. For trainees who do not meet expectations, remedial activities can be identified to ensure that all of the objectives are met over time.
Even among our 4 institutions, implementation of these CBGO and linked evaluations has varied. For example, 2 institutions use E*Value for managing program data, including evaluations, 1 uses MedHub, and 1 uses New Innovations. Therefore, the appearance of the CBGO and evaluations varies by institution, as does the method for communicating the CBGO to trainees and faculty, within the rules that both trainees and faculty must receive a copy of the goals and objectives for each learning activity. Each institution has been able to easily modify the CBGO and evaluations to fit local needs and capabilities. Faculty education regarding the content of the CBGO documents and the fellow evaluation process also varies by institution, but generally occurs through local faculty development initiatives led by the program director.
Most previously published compilations of goals and objectives were composed by national consensus committees. To our knowledge, the goals and objectives we created collaboratively mark the first attempt by a small group of program directors to independently create goals and objectives and evaluations that are modifiable and available to all training programs in a subspecialty training field.
We propose these goals and objectives for use by any rheumatology fellowship training program to promote competency-based rheumatology training. Rheumatology training program directors can easily modify these documents to reflect progressive expectations in meeting the 6 general competencies in ways that reflect the individual training environment. Additionally, we have provided a template for converting these CBGO into linked evaluations to be used as end of rotation performance evaluations. These documents have been peer reviewed and have additionally met the requirements for reaccreditation of each of the 4 CFC programs by the ACGME RRC.
There is a precedent in other fields for national databases of goals and objectives to be made available for modification and use by individual programs [4-7]. Although many of these summary documents were composed by national specialty and subspecialty education committees, several published goals and objectives have a narrow focus, such as cardiothoracic radiology within a diagnostic radiology residency, and some originate in individual institutions or groups of institutions [8, 9].
The overall goal of the ACGME Outcome Project launched in 1999 was to improve how residents deliver quality medical care. Phase 1 of the project introduced the 6 general competency domains. Phase 2 of the project focused on how programs evaluate their residents. Currently in phase 3, the focus has shifted to using aggregate data on performance to improve programs [14, 15]. The ACR curriculum was developed and published by the ACR Committee on Training and Workforce in 2006, near the end of phase 2, to help programs meet the ACGME requirements. Despite its widespread adoption by US rheumatology fellowship programs, this curriculum does not provide CBGO at the level of individual learning activities.
The institutions of the CFC have successfully implemented these detailed learning activity goals and objectives and evaluations since July 2009. We have individually modified the individual rotation documents to fit each institution's program. Progressive competency over time can be described in these documents by modifying specific objectives using verbs from Bloom's taxonomy to reflect progressive levels of intellectual behavior. These progressive expectations can then easily be transferred into evaluation forms that ask evaluators to document whether each trainee has met, exceeded, or not yet met each individual objective.
Although these CBGO and linked evaluations have clarified expectations and improved our ability to identify trainees early when they are not meeting expectations, many opportunities still remain. First, these CBGO do not include all of the disease-specific care and knowledge competencies as outlined well in the 2006 ACR core curriculum. Including them would make end-of-rotation evaluations unwieldy, and such detailed knowledge can and should be assessed using other evaluative tools, such as the in-training examination administered annually by the National Board of Medical Examiners, procedure logs, 360-degree evaluations, rheumatology objective structured clinical examinations, and other assessment tools, as appropriate and as determined by conditions at individual institutions. Second, for our rotation evaluations, the supervising physician determines whether the trainee has completely met the objectives because these CBGO are not yet in “milestone” format. A goal of the milestones project will be to define the objectives, or milestones, specifically enough that determining whether a given milestone has been achieved becomes straightforward, i.e., faculty from different institutions observing the same trainee behavior should have enough guidance to agree on whether a milestone has been achieved. Additionally, the provided evaluations are formative, are completed at the conclusion of a learning activity, and represent only one of the ways in which trainees are evaluated. By their nature, learning activity performance evaluations are subjective. Nonetheless, use of the 3-point rating system has allowed our training programs to help faculty be clear about whether trainees have met expectations in each competency for each learning activity.
The 3-point scale has made it easier to identify trainees who are not meeting expectations, giving us an opportunity to delve deeper into how and why expectations have not been met and then, if necessary, to create remediation plans to address unmet competencies. At this time, performance expectations for each learning activity are set by individual programs, but an assessment of “not meeting expectations” on any element of an evaluation form indicates an opportunity for discussion among the trainee, the involved faculty member, and the training program director to address how to tailor the fellow's personal educational development to ensure that objectives are satisfactorily met. Identification of superior performance on this 3-point scale, although not as important as differentiating between meeting and not meeting expectations, helps guide training program directors when completing American Board of Internal Medicine (ABIM) evaluations at the end of each year, which are presented on the traditional 9-point scale. Ultimately, these evaluation forms are modifiable such that decisions on the rating scale can easily be made at an institutional level when implementing these CBGO and evaluations. Finally, the question remains regarding how to test whether these CBGO and evaluations ultimately lead to our trainees providing better clinical care. Ultimately, clear objectives and evaluations may help guide us down a path toward being able to measure patient outcomes.
Our detailed goals and objectives may serve as a foundation for the creation of developmental milestones within the new ACGME accreditation system (next accreditation system). The ACGME Outcome Project is now moving to assessing outcomes, with phase 3 being use of aggregate performance data for curriculum reform. Aggregate performance data involve determining and reporting whether our trainees achieve competence. Through these evaluations, we have been able to easily compile aggregate data regarding how many trainees have been assessed as achieving competence in the many learning activities offered within our training programs. Trainee achievement of competence is best determined through robust, multifaceted evaluations that include assessment of whether specific objectives were met. Such objectives, when written with specific behavioral anchors, can also be termed milestones. The ABIM with the ACGME developed milestones for training in internal medicine. The initial framework and process description were published in 2009, and were posted for public comment on the ABIM web site in 2012. Of interest, the proposed internal medicine milestones are divided into several content and practice areas much like the 6 competencies, with proposed intervals of training at which each milestone is expected to be met. Careful construction and implementation of training milestones in rheumatology will help programs assess whether all trainees meet the required milestones to achieve competence for independent, unsupervised rheumatology practice; these milestones will be the same across all programs. On detailed review of the proposed ABIM milestones, many of the objectives we composed for these CBGO documents are, with slightly different wording but similar intent, included in the ABIM-proposed milestones.
Therefore, the next step for our project, after implementation of the ABIM milestones, will be to further modify these objectives by national consensus into well-defined milestones for evaluation purposes. Using the CBGO for all rheumatology learning activities included here, we are well positioned to further modify the ABIM milestones to compose our own rheumatology training milestones for use throughout phase 4 of the ACGME Outcome Project.
All authors were involved in drafting the article or revising it critically for important intellectual content, and all authors approved the final version to be published. Dr. Criscione-Schreiber had full access to all of the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.
Study conception and design. Criscione-Schreiber, Bolster, Jonas, O'Rourke.
Acquisition of data. Criscione-Schreiber, Bolster, Jonas, O'Rourke.
Analysis and interpretation of data. Criscione-Schreiber, Bolster, Jonas, O'Rourke.
ROLE OF THE STUDY SPONSOR
Abbott Laboratories had no role in the study design, data collection, data analysis, writing of the manuscript, or approval of the content of the submitted manuscript. Publication of this article was not contingent on the approval of Abbott Laboratories.
The authors acknowledge and thank the Clinician Scholar Educator Group of the ACR for their review and thoughtful comments on the initial draft of the CBGO. Members of the Clinician Scholar Educator Group include Drs. Michael Battistone, Winn Chatham, Chris Collins, Paulette Hahn, Sharon Kolasinski, and Deana Lazaro.