ACADEMIC EMERGENCY MEDICINE 2011; 18:S110–S120 © 2011 by the Society for Academic Emergency Medicine
Objectives: The development of robust Accreditation Council for Graduate Medical Education (ACGME) systems-based practice (SBP) training and validated evaluation tools has been generally challenging for emergency medicine (EM) residency programs. The purpose of this paper is to report the results of a consensus workgroup session of the 2010 Council of Emergency Medicine Residency Directors (CORD) Academic Assembly with the following objectives: 1) to discuss current and preferred local and regional methods for teaching and assessing SBP and 2) to develop consensus within the CORD community using the modified Delphi method with respect to EM-specific SBP domains and link these domains to specific SBP educational and evaluative methods.
Methods: Consensus was developed using a modified Delphi method. Previously described taxonomy generation methodology was used to create a SBP taxonomy of EM domain-specific knowledge, skills, and attitudes (KSA). The steps in the process consisted of: 1) an 11-question preconference survey, 2) a vetting process conducted at the 2010 CORD Academic Assembly, and 3) the development and ranking of domain-specific SBP educational activities and evaluation criteria for the specialty of EM.
Results: Rank-order lists were created for preferred SBP education and evaluation methods. Expert modeling, informal small group discussion, and formal small group activities were considered the optimal methods for teaching SBP. Kruskal-Wallis testing revealed that these top three items were rated significantly higher than self-directed learning projects and lectures (p = 0.0317). Post hoc permutation testing showed that expert modeling was rated significantly higher than formal small group activity (adjusted p = 0.028). Direct observation methods were preferred for evaluation. Multiple barriers to training and evaluation were identified. We developed a consensus taxonomy of domains felt to be most essential and reflective of the practice of EM: multitasking, disposition, and patient safety. Learning formats linked to the domains were created, and specific examples of local best practices were collected. Domain-specific anchors of observable actions for the three domains were created.
Conclusions: This consensus process resulted in the development of a taxonomy of EM-specific domains for teaching and observable tasks for evaluating SBP. The concept of SBP is interlinked with the other general competencies and difficult to separate. Rather than develop specific SBP evaluation tools to measure the competency directly, SBP competency evaluation should be considered one element of a coordinated effort to teach and evaluate the six ACGME general competencies.
In the 1980s, the United States Department of Education mandated greater use of outcomes in the accreditation process in health professions, education, and business.1 Because the U.S. system of medical education depends heavily on public funding, medical educators have a responsibility to demonstrate competency-based educational outcomes as evidence of their “responsible stewardship in preparing competent physicians to meet the health care needs of the public that supports their efforts.”1
The general competencies were identified by the Accreditation Council for Graduate Medical Education (ACGME) via thorough research and collaboration between January 1998 and February 1999.1 The six general competencies defined the knowledge, skills, and attitudes (KSA) considered essential for all graduates of graduate medical education (GME) programs to obtain prior to entering clinical practice.
The ACGME recognized the critical need for direct attention to navigating the system itself as a key component to successful and effective patient care and defined the sixth general competency of systems-based practice (SBP) as “manifested by actions that demonstrate an awareness of and responsiveness to the larger context and system of health care and the ability to effectively call on system resources to provide care that is of optimal value.”2 Residents are expected to work effectively in various health care delivery settings and systems relevant to their clinical specialty, coordinate patient care within the health care system relevant to their clinical specialty, incorporate considerations of cost awareness and risk–benefit analysis in patient and/or population-based care as appropriate, advocate for quality patient care and optimal patient care systems, work in interprofessional teams to enhance patient safety and improve patient care quality, and participate in identifying system errors and implementing potential systems solutions.3
In 2002, emergency medicine (EM) defined each of the six general competencies more specifically for the training and practice environments that are unique to the specialty. The EM-specific domains of SBP defined in 2002 were cost-appropriate care, resources, delivery systems, and patient advocacy.4 Additional work published in the EM literature has further defined methods for measuring, instructing, and quantitatively evaluating the EM-specific SBP competency,5–11 as well as the general competencies as a whole.12–14
A recent systematic review of the literature evaluating whether the competencies can be measured in a valid and reliable way concluded that SBP and practice-based learning and improvement “are viewed by many authors as representing aspects of health systems and teams rather than those of particular individuals. Thus, it is possible that environmental variables may exert significant influence on the behaviors of trainees surrounding these competences.”15 The authors’ main conclusion was that the attention paid to the general competencies “… has already led to some of their intended benefits.” However, they noted limitations in that “… the literature to date has not yielded any method that can assess the ACGME general competencies as independent constructs” and concluded that, while the role of the general competencies is to guide assessment strategies, they exist in a realm outside of direct measurement.15
Other specialties have attempted to address the limitations of faculty time by using multiple tools to evaluate the same competency, such as direct observation and faculty global evaluation (Dreyfus-based anchored scale), chart audits for system errors and recommendations for system improvements, identification of system errors or improvements and publication of results in Web-based format, point-of-care testing education, portfolio documentation of self-reflection, and system-based repairs, as appropriate.16–22
With this background in mind, we convened a session at the 2010 Council of Emergency Medicine Residency Directors (CORD) Academic Assembly with the following objectives: 1) discuss current and preferred methods for teaching and assessing SBP, 2) develop consensus within the CORD community using the modified Delphi method with respect to EM-specific SBP domains, and 3) identify observable behaviors within each SBP domain.
Consensus was developed using a modified Delphi method.23 The modified Delphi technique has been used as a method for establishing consensus on priorities for an organization. Because CORD is an organization representing many individual residency programs that are required to address standard program requirements determined by the ACGME Residency Review Committee for Emergency Medicine, this method was chosen to establish commonalities with respect to the SBP competency. The steps involved in the process include identification of needs, collection of rankings of relative importance of the identified needs, calculation of rank of identified needs using an importance/consensus method, feedback to the group of rankings, and development of tools or actions with the assimilated information.23 The objective of creating a SBP taxonomy of EM domain-specific KSA was modeled after the methodology described by Graham et al. in 2009.24 The process consisted of three phases (Figure 1).
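The rank-calculation step of the modified Delphi process described above (collect importance ratings, compute a rank order, and feed it back to the group) can be sketched in a few lines. The items and ratings below are illustrative stand-ins, not the workshop's actual data:

```python
# Illustrative sketch of the modified Delphi rank-aggregation step:
# panelists rate each item's importance on a 1-5 scale, items are
# ranked by mean rating, and the rank order is fed back to the group.
# The items and ratings here are hypothetical, not the workshop's data.
from statistics import mean

ratings = {
    "expert modeling": [5, 4, 4, 5],
    "small group discussion": [4, 4, 3, 4],
    "lecture": [2, 3, 2, 2],
}

# Rank items by mean rating, highest first.
rank_order = sorted(ratings, key=lambda item: mean(ratings[item]), reverse=True)
for rank, item in enumerate(rank_order, start=1):
    print(f"{rank}. {item} ({mean(ratings[item]):.1f})")
```

In practice each Delphi round would repeat this aggregation after feeding the previous round's rank order back to the panel.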
Phase 1: Preconference Survey
An 11-question preconference survey (see Data Supplement S1, available as supporting information in the online version of this paper) was sent out to members of the CORD e-mail list with questions designed to define current practices with respect to SBP education and evaluation. Multiple reminders were sent out to the list requesting programs to submit responses. The respondents were provided a ranking scale for each question, with some of the questions allowing for open-ended responses. The responses were collected and summarized via SurveyMonkey (http://www.surveymonkey.com). In the classic modified Delphi method, face-to-face individual or small group interviews are traditionally used for information gathering. However, we felt that a survey format would reach a broader section of the CORD membership and provide a more representative set of responses.
Phase 2: Vetting Process
A vetting process was conducted during a discussion forum at the 2010 CORD Academic Assembly. Participants in the discussion forum were self-selected attendees. Survey responses were appraised for relevancy and discussed by the audience, and additional responses to the survey questions were obtained. The participants were asked to provide specific examples of SBP activities that they considered their most effectively implemented training methods. All of the examples, both from the group discussion and from the survey, were grouped by the most representative learning format (lecture, small group activity, “expert” modeling, self-directed/facilitator-guided learning projects).
We next focused on developing specific evaluation criteria that the group felt best represented SBP-related competencies. The participants were asked to provide examples of KSA behaviors or tasks that are observable and could differentiate levels of proficiency. From this discussion, an inventory of possible evaluation criteria was generated. The responses were then grouped into behavioral themes or domains, as has been previously described.24
Phase 3: EM-specific SBP Taxonomy Development
The development and vetting of domain-specific educational methods and domain-specific evaluation criteria were reviewed at the conference follow-up session. The development of the SBP taxonomy was conducted as follows: 1) The group reviewed the learning formats and the domains that were generated on the previous day. They then prioritized the domains they felt were most reflective of achievement of competency in SBP for EM residents. 2) Once these domains were vetted and prioritized by the group, the authors linked the domains back to the educational learning formats based on which domain each learning format addressed. 3) From all of these inputs, we defined our consensus EM-specific definitions of SBP and propose new domain-based educational and evaluation criteria.
SBP Survey Results
Sixty of a possible 135 programs responded to the survey (44%). Response rates to survey questions varied between 68 and 100%. Respondent demographics are detailed in Table 1. There was a representative mixture of 3- and 4-year programs, as well as combined programs. University- and community-based programs were most represented. Most programs had either 30 to 39 (n = 14) or 40 to 49 (n = 12) residents. Most programs had one program director; one program had two. Most programs had two associate/assistant program directors and one program coordinator.
Table 2 lists the rank order of preferences for SBP education and evaluation (questions 1 and 2). These questions were based on a five-point Likert scale. For question 1, a score of “5” represented the most often used and “1” the least often used educational method in the respondent’s program. For question 2, a score of “5” represented the theoretically most optimal and “1” the least optimal educational method. The responses are ordered by average score for each teaching method. Kruskal-Wallis testing revealed that the top three items (expert modeling, informal small group discussion, and formal small group activity) were rated significantly higher than self-directed learning projects and lectures (p = 0.0317). Post hoc permutation testing showed that expert modeling was rated significantly higher than formal small group activity (adjusted p = 0.028). Expert modeling was felt to be the most effective method for teaching SBP and was the most commonly cited method currently in use by respondents. Lectures were felt to be the least useful and were the least commonly used.
Table 2. Rank Order of Preferences for SBP Education Methods (Questions 1 and 2)

Question 1: What methods do you currently use to teach SBP in your residency program? Rank order (average rating score):
1. Expert modeling (3.7)
2. Informal small group discussion (3.6)
3. Formal small group activity (3.3)
4. Self-directed learning projects (3.1)
5. Lecture (3.1)

Question 2: In your experience, which educational modalities teach SBP most optimally? Rank order (average rating score):
1. Expert modeling (4.0)
2. Informal small group discussion (3.4)
3. Self-directed learning projects (3.3)
4. Formal small group activity (3.2)
5. Lecture (2.1)
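The omnibus and post hoc comparisons reported for the Table 2 ratings can be sketched as follows. The Likert ratings here are hypothetical stand-ins (only the method labels come from the survey), and the permutation test on rating means is a simplified illustration of the post hoc approach rather than the authors' exact procedure:

```python
# Sketch of the omnibus (Kruskal-Wallis) and post hoc (permutation)
# analyses described in the text. Ratings are illustrative stand-ins,
# not the actual survey responses.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical 5-point Likert ratings for each teaching method.
ratings = {
    "expert modeling": [5, 4, 4, 5, 3, 4],
    "informal small group": [4, 3, 4, 4, 3, 4],
    "formal small group": [3, 3, 4, 3, 3, 4],
    "self-directed projects": [3, 3, 3, 4, 2, 3],
    "lecture": [2, 2, 3, 2, 2, 3],
}

# Omnibus Kruskal-Wallis test across all methods.
h_stat, p_omnibus = stats.kruskal(*ratings.values())

def perm_test(a, b, n_perm=10_000):
    """One-sided permutation test on the difference in mean ratings."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    observed = a.mean() - b.mean()
    pooled = np.concatenate([a, b])
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        if pooled[: len(a)].mean() - pooled[len(a):].mean() >= observed:
            count += 1
    return count / n_perm

# Pairwise post hoc comparison, e.g. expert modeling vs. formal
# small group activity (the pair reported as significant in the text).
p_pair = perm_test(ratings["expert modeling"], ratings["formal small group"])
print(f"Kruskal-Wallis H={h_stat:.2f}, p={p_omnibus:.4f}; pairwise p={p_pair:.4f}")
```

With multiple pairwise comparisons, the pairwise p-values would additionally need a multiplicity adjustment, consistent with the adjusted p-value reported in the text.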
Table 3 lists the rank order of responses to current and preferred methods for SBP evaluation. Evaluations using 360-degree assessment were the most commonly cited annual assessment method; procedure, operative, or case logs were the most cited quarterly method; and simulations and models were the most common monthly method. Program directors responded that the methods they would ideally like to use for SBP evaluation were standardized oral examinations annually; simulations and models and/or procedure, operative, or case logs quarterly; and simulations and models monthly. We created this rank order to guide the discussion, not to determine statistical differences between methods, as that was beyond the scope of the discussion.
Table 3. Rank Order of Current and Preferred SBP Evaluation Methods

Question 3: Which methods do you currently use to evaluate SBP in your residency program and with what frequency? Rank order (percentage response):

Annually:
1. 360-degree evaluation (52%)
2. Standardized oral examination (48%)
3. Checklist evaluation of live or recorded performance (i.e., SDOT) and portfolio (28%)
4. Written examination (MCQ) (26%)
5. Record review (18%)

Quarterly:
1. Procedure, operative, or case logs (36%)
2. Simulations and models (32%)
3. Global rating of live or recorded performance (28%)
4. Record review (27%)
5. Standardized oral examination (20%)

Monthly:
1. Simulations and models (26%)
2. Global rating of live or recorded performance (26%)
3. Procedure, operative, or case logs (24%)
4. Patient surveys (17%)
5. Checklist evaluation of live or recorded performance (i.e., SDOT) and written examination (MCQ) (11%)

Assessment methods not currently used:
1. Standardized patient examination (80%)
2. Chart-stimulated recall oral examination (77%)
3. Objective structured clinical examination (76%)
4. Patient surveys (55%)
5. Written examination (MCQ) (52%)
6. Portfolios (44%)
7. Record review (36%)
8. Global rating of live or recorded performance (38%)
9. Standardized oral examination and checklist evaluation of live or recorded performance (i.e., SDOT) (26%)
10. Simulations and models (26%)
11. 360-degree evaluations (22%)
12. Procedure, operative, or case logs (18%)

Question 4: Which methods would you ideally like to use to evaluate SBP in your residency program and with what frequency? Rank order (percentage response):

Annually:
1. Standardized oral examination (43%)
2. Written examination (MCQ) (36%)
3. Standardized patient examination (33%)
4. Objective structured clinical examination (32%)
5. Global rating of live or recorded performance (26%)

Quarterly:
1. Simulations and models, and procedure, operative, or case logs (43%)
2. 360-degree evaluations (42%)
3. Record review (41%)
4. Portfolios (39%)
5. Patient surveys (33%)
6. Checklist evaluation of live or recorded performance (i.e., SDOT) (29%)

Monthly:
1. Simulations and models (26%)
2. Patient surveys (24%)
3. Checklist evaluation of live or recorded performance (i.e., SDOT) (22%)
4. 360-degree evaluations (20%)
5. Global rating of live or recorded performance (17%)

Assessment methods that respondents would not use:
1. Chart-stimulated recall oral examination (56%)
2. Standardized patient examination (52%)
3. Objective structured clinical examination (44%)
4. Written examination (MCQ) (43%)
5. Portfolios (29%)
6. Record review (29%)
7. Global rating of live or recorded performance (24%)
8. Standardized oral examination (21%)
9. Patient surveys and procedure, operative, or case logs (19%)
10. Checklist evaluation of live or recorded performance (i.e., SDOT) and 360-degree evaluations (15%)
11. Simulations and models (12%)
With respect to open-ended responses, additional educational modalities currently being used for SBP education included the Vanderbilt Healthcare Matrix, combined department conferences focused on a problematic topic (e.g., “Coumadin rechecks after antibiotics started in the ED”), and monthly performance improvement meetings. One program responded that it had “no formalized curriculum.”
Open-ended responses to evaluation methods included biannual EM1 resident assessments of EM3 residents, biannual EM3 resident assessments of EM1 residents, and use of the “Vanderbilt Matrix with all follow-up conferences that are placed in the residents’ portfolios.”
For the open-ended questions on barriers to teaching and evaluating SBP, many factors were cited. The most commonly cited barriers to education included a lack of conceptual understanding of SBP and of how to teach it. Representative responses included “It means different things to different staff,” “It’s too broad a topic area,” and “Intangible category for many.” Lack of time and faculty were prominently noted barriers as well. Many felt that for the education to be effective and timely, significantly more faculty time and participation would be required.
There was no clear survey consensus as to whether most faculty understand the concepts representing the current definition of SBP for EM. Over 80% of respondents felt that the concepts of resources, providers, delivery systems, cost-appropriate care, and patient advocacy were easy to understand, but fewer felt that they were easy to teach (45%–69%). Even fewer felt that these concepts were easy to evaluate (30%–45%).
Consensus Workshop Results
Approximately 75 attendees, primarily program directors and associate/assistant program directors, took part in the consensus workshop. The survey results were reviewed, and there was general agreement with the ranking of responses listed in Tables 2 and 3.
Example educational activities relevant to the practice of EM were obtained from the workshop participants and categorized under the five most frequently cited educational learning formats: expert modeling, formal small group activity, formal lectures, self-directed learning projects, and facilitator-guided rotations. Contrary to the survey results, in which lecture format was rank ordered lowest in current use, lectures were frequently used as a method for teaching SBP by the workshop participants (Table 4).
Table 4. Educational Learning Formats and Example SBP Activities (MT = multitasking; D = disposition; PS = patient safety)

1. “Expert” modeling (MT, D, PS)
• Real-time role modeling of behavior by faculty and senior residents
• Sign-out rounds participation
• Intern apprenticeship model of institutional-specific SBP training by senior residents
• Case manager shadowing
• Translator shadowing

2. Formal small group activity (MT, D, PS)
• Multidisciplinary team simulation (e.g., TeamSTEPPS®)
• “SBP meeting”: group problem solving within an organization, collaborative practice
• Homeless clinic participation
• Detox clinic participation
• Discussion, simulation, oral boards, etc.

3. Formal lectures (D, PS)
• Morbidity and mortality conference
• Monthly didactics from other key stakeholders in patient care: continuity of care providers, case managers, domestic violence counselors, or similar
• Institutional curriculum series on SBP issues: public health, health reform, payor models
• Multidisciplinary joint practice conferences
• Didactic lecture series focusing on patient disposition
• Intern-focused lecture curriculum
• Senior resident-focused lecture curriculum
• EM core content practice variability (inter- and intrahospital system)

4. Self-directed learning projects (D, PS)
• Online educational modules
• Mock patient exercises (interns role-playing patients and being admitted through their hospital system)
• Home care arrangement assignments
• Longitudinal care programs between patients and residents (modeled after programs such as the Harvard Medical School–Cambridge integrated clerkship and the UCSF PISCES programs)
• Vanderbilt Healthcare Matrix
• Patient logs, portfolios, and other projects with guided self-reflection

5. Facilitator-guided rotations (D, PS)
• Administrative teaching rotation
• Systematic follow-up exercises related to quality and safety (patient logs, radiography overreads, positive cultures)
• Hospital committee participation
Many specific behaviors and skills considered to be most essential and reflective of the practice of EM were elicited from the group. These included tasks such as accurate and timely charting, appropriate patient prioritization and timely reassessment, efficiency relative to peer group, resource utilization, flexible thinking, and sign-out quality. Based on these responses, three representative themes or domains emerged during the group discussion: multitasking, disposition, and patient safety. We then linked the three domains to the five educational learning formats based on their suitability for teaching each specific domain (Table 4). Given the overall process and context under which these domains emerged, we believe that they represent the consensus taxonomy of EM-specific SBP competencies (Table 5).
Table 5. Domain-specific Anchors of Observable Actions

1. Multitasking
• Charting: is the resident’s charting accurate, complete, and timely?
• What is the resident’s number of patients seen per hour and length of stay data relative to peer group?
• Does the resident practice patient follow-up and reassessment during ED evaluation?
• Prioritization: does the resident appropriately prioritize tasks?

2. Disposition
• Does the resident make timely dispositions?
• Does the resident employ patient-specific resource utilization?

3. Patient safety
• Does the resident perform appropriate consultation (neither too cavalier nor “shotgun”)?
• Is the resident flexible with respect to resource utilization depending on ED volume?
• Does the resident consistently provide complete and relevant sign-outs?
A summary of the domain-specific educational and evaluation tools was presented at a follow-up session on the second day of the conference. A final round of review from attendees yielded general agreement related to the overall results from the survey and the discussion. From all these responses, we generated the 2010 CORD Consensus Workgroup recommendations for EM-specific domain-based training and evaluation criteria.
SBP Educational Activities
“Expert modeling,” i.e., real-time role modeling of practice and behavior by attending physicians or senior residents, was felt to have the most significant effect on how residents develop their SBP KSAs, specifically with respect to efficiency, managing patient load, surge situations, and managing situational contingencies (such as time-of-day/weekend/holiday resource availability and extenuating social circumstances on a case-by-case basis). Sign-out rounds were posited as a practical opportunity for robust “teachable moments” regarding patient disposition and patient safety. EM educators should find ways to maximize this method by formalizing role-modeling activities during clinical shifts or creating after-the-shift debriefings to promote reflection while the relevant experiences are fresh in the learner’s memory. All three domains (multitasking, disposition, and patient safety) can be addressed with this method.
Formal small group activities were considered effective approaches for addressing any of the three learning domains. Multidisciplinary team simulation (e.g., TeamSTEPPS) was considered a high-yield learning intervention, as simulation scenarios can be tailored to focus on many aspects of SBP, including multitasking, coordination of patient care, disposition, patient advocacy and safety, interprofessional teamwork, error identification, and problem-solving related to resource utilization.10,25
The concept of a “SBP meeting” was discussed whereby groups (intra- or interdisciplinary) would work together to solve specific problems within an organization. An example of this would be a problem-solving session involving emergency physicians and cardiologists for the purpose of decreasing admission rates for low-risk chest pain patients. Resident participation in homeless shelter clinics, detox clinics, and other similar social services were felt to be of significant value for enhancing awareness of population-based care issues.
Formal lectures were felt to best address disposition and patient safety domains. Multidisciplinary content was considered essential. Examples included morbidity and mortality conference; monthly didactics from other key stakeholders in patient care, such as continuity of care providers, case managers, domestic violence counselors, or similar; institutional curriculum series on national SBP issues related to public health, health reform, and payor models; multidisciplinary joint practice conferences; and lecture series focusing specifically on patient disposition. Exposure to intra- and interhospital SBP practice variability is a high priority for EM residency program directors. Many expressed that real-world SBP differences between hospitals within a multi-site training program were diverse enough to warrant special emphasis.
A final area of emphasis was the development of postgraduate year (PGY)-specific SBP curricular content. Program directors recognize that SBP issues for interns differ from those of intermediate- and senior-level residents and that training level-specific topics would be more readily assimilated. For example, graduating residents entering the professional job market will need to be able to adapt their practices to a wide variety of local and regional SBP characteristics. Self-directed learning projects were considered effective methods for educating residents on the disposition and patient safety SBP domains. The suggestions provided by program directors were characteristic of portfolios and other projects with guided self-reflection. These included activities such as online educational modules, mock patient exercises (in which interns engaged in role-playing as patients and experienced the admission process through their hospital systems), home care arrangement assignments, longitudinal care programs between patients and residents (such as those modeled after the Harvard Medical School–Cambridge integrated clerkship and the UCSF PISCES programs), and other formally described conceptual frameworks such as the Vanderbilt Healthcare Matrix.26
Facilitator-guided rotations were felt to best address disposition and patient safety domains. Suggested programs included administrative teaching rotations, systematic follow-up exercises related to quality and safety (patient logs, radiography overreads, positive cultures), and directed hospital committee participation with predefined learning objectives including requiring formal write-up of the experience.
Another rationale for SBP education is that the general competencies are integral to the American Board of Emergency Medicine (ABEM) Emergency Medicine Continuous Certification program. Residents should be taught that these principles are not just a requirement of residency training, but an expectation for lifelong learning and self-assessment as an ABEM diplomate.27
A wide variety of barriers to more formalized SBP training were cited. The most commonly listed were lack of faculty familiarity with SBP terminology as defined by the ACGME, time constraints, monetary limitations, and lack of faculty protected time. These issues are among the most frustrating to program directors, as their resolutions are generally not under the program director’s direct control. Bower et al.28 have described another barrier observed in other disciplines where maintenance of certification (MOC) programs already exist: “There is little attraction to competency-based educational activities despite their requirement for MOC. The apparent disparity between the instructional methods a learner prefers and those that are the most effective in changing physician behavior may represent a barrier to participating in more innovative CME offerings and instructional methods.” This observation should be considered when developing resident SBP educational content and may have implications for how ABEM develops SBP-related MOC content for its EM diplomates.
SBP Evaluation Domains
The responses to SBP evaluation strategies revealed a disconnect between satisfaction with teaching SBP and the sense of burden associated with evaluating it. The difficulty centered on constructing an evaluation process considered meaningful by faculty evaluators and the residents being evaluated, rather than creating another form to be “filled out.”
The general consensus was that faculty development would provide high-yield benefit with respect to improving participation in and quality of SBP evaluation. Our survey showed that program directors felt that many faculty knew qualitatively what practice elements constituted SBP, but were not comfortable or knowledgeable about the terminology used to define or evaluate it. It was felt that clinical faculty would be more inclined to teach and provide formative SBP feedback if they knew what they were looking for and were given the proper tools to teach it during clinical encounters. Explicit items determined by the group to be necessary in this process included knowledge of 1) EM SBP domain-specific actions, 2) specific PGY-level expectations, 3) EM SBP KSA, 4) how to link the actions to the domains, and 5) the tools that facilitate evaluation via descriptive anchors. These anchors would then provide the framework and language to facilitate real-time formative feedback.
Another aspect discussed was the variability of national, regional, and interinstitutional SBP practices. The consensus was that it is important to teach the local practice patterns, acknowledging that SBP may even vary significantly between the training sites within an individual teaching program.
The group expressed a need for research-driven validation of measurement instruments. Examples suggested included shift cards, level-based assessments focusing on KSAs that are appropriate for current level of training, peer evaluations, portfolios, and tracking criteria used in community practice (such as length of stay, patients seen per hour, charting, and billing). It was acknowledged that the last example was subject to significant factors outside the personal control of the resident.
The group felt that readily observable and specific actions anchored to each domain (Table 5) would be the most diagnostic and would most likely facilitate real-time formative feedback and subsequently summative feedback. While development of a standardized evaluation tool incorporating these anchors for use by all programs is a natural next step, doing so was beyond the scope of this workshop. Program directors can use these anchors to devise their own educational and evaluation modalities that fit within their local practices.
Multitasking actions that the group considered important centered on the following clinical questions: 1) Are the resident’s charts accurate, complete, and timely? Does the resident’s charting reflect a good balance of timeliness and completeness (i.e., not spending too much time charting on one patient to the detriment of patient throughput, not spending extraordinary time after a shift finishing charts, and documenting events accurately and in a manner that clearly conveys medical decision-making)? 2) How do the resident’s patients-seen-per-hour and length-of-stay data compare with those of the peer group? Can the resident handle a load similar to that of his or her peers given the same ED census, or do the other residents or attending physicians need to compensate consistently? 3) Does the resident practice patient follow-up and reassessment during the ED evaluation? Does the resident reevaluate a patient autonomously after providing care and, if so, how effective and timely are the reassessments? 4) Does the resident appropriately prioritize tasks? Does the resident consistently prioritize critical actions correctly? Is the resident able to prioritize and follow through on multiple tasks for multiple patients at a level commensurate with peers? Does the resident prioritize patient care above charting?
These areas were felt to be the most important discriminators of advancing skill across PGY levels. The group recognized that patients seen per hour and length of stay are, to an extent, beyond the resident’s control. However, the discussion revealed substantial agreement among program directors that distinct levels of proficiency are readily observable among residents of the same year working in the same department, based on their ability to navigate patients through the health system and to provide efficient autonomous care.
Disposition actions felt to be most germane to ED SBP competence included: 1) Does the resident make timely dispositions? 2) Does the resident employ patient-specific resource utilization? One of the hallmark competencies of an emergency physician is the ability to make dispositions in a reasonable time frame, both to manage patient flow and to fulfill the responsibility of providing care to patients in the waiting room. Resident qualities that program directors identified as discriminatory of this skill included proficiency in calling consultants and primary care providers (sometimes with incomplete information) to admit or discharge patients, and the ability to make dispositions without performing serial workups.
With respect to resource utilization, EM program directors felt that an indicator of advancing competence is a resident’s ability to be flexible and creative in making disposition arrangements in the face of significant external limitations (weekends, holidays, or nights; limited availability of consultants or medical testing; patient socioeconomic constraints; patient living circumstances and follow-up ability; etc.).
With respect to the patient safety SBP competency, the most often cited questions that program directors felt were most diagnostic included: 1) Does the resident obtain appropriate consultation (neither too cavalier nor “shotgun”)? 2) Is the resident flexible with respect to resource utilization depending on ED volume? and 3) Does the resident consistently provide complete and relevant sign-outs? Most program directors felt that the residents’ development of clinical reasoning and acumen coincides with the ability to articulate their rationale for obtaining consultation, to discern those clinical situations that require emergent consultation because of the likelihood of significant sequelae, and to know which consultant is the most appropriate for a clinical situation. Another aspect noted by program directors was a resident’s ability to manage resources flexibly based on ED volume. Does the resident know how to “shift gears” when it is busy? Is the resident aware of when the nursing team members and staff are overloaded? Can the resident direct team members effectively to perform patient care in a way that provides optimal care to the sickest patients and minimizes latent threats to patient safety? Can the resident discern between essential and more elective testing and communicate to the inpatient team which tests or consultations need to be performed in the ED and which can be deferred until the patient is on an inpatient unit? Does the resident understand the concept of patient surge and how to ration resources under this condition? Last, there was general consensus that a resident’s SBP proficiency could be objectively observed during the process of sign-out and patient handoffs at change of shift. Can the resident succinctly sign out every patient for whom he or she is responsible? Does the resident clearly convey the critical management steps and disposition for each patient? Can the resident provide appropriate contingency plans for patients who are in the middle of their evaluations?
Summative and formative evaluation methods using these domain-specific anchors were discussed. For summative processes, integration of these SBP anchors into established global evaluation tools was felt to be the most feasible way of implementation. The primary summative value expressed by program directors was in identifying outliers within their residency classes. For formative processes, competency-based shift cards with the SBP anchors were felt to be most feasible for implementation. This method incorporates the preferred method of a Global Rating of Live Performance. An important factor would be establishing expectations that the resident provide the shift card to the supervising attending physician(s) after each shift and that attendings provide contemporaneous feedback using the anchors provided on the shift card as a guide. The group also felt that this could be used as part of remediation for outliers.
Lurie et al.15 have previously questioned whether valid and reliable psychometric tools can be developed to assess the ACGME competencies as individual constructs. Our findings support this assertion: at times, consensus was difficult to develop on the specific items to be assessed for each of the SBP domains. Consequently, we felt that development of a specific evaluation “tool” based on the domains we generated was beyond the scope and time constraints imposed by the conference format. While we were able to tease out the evaluation domains, regional and local variability in systems issues made it difficult to reach consensus on an assessment tool that would accurately capture the true measure of skill acquisition by all residents in all programs. Unless and until a specific evaluation tool is developed, individual residency programs should determine how best to measure the domain criteria, develop their own minimum standards, and allocate time and resources to these activities.
There were several limitations to our process. The preconference survey has its own inherent limitations. First, the response rate was 44%, leaving a relatively large pool of nonresponders, although this rate exceeds that of most surveys distributed via the CORD e-mail listserv. Furthermore, because the surveys were anonymous, we know nothing about the nonresponders, which could have introduced bias. Second, the survey was created by the authors and was not prospectively validated as an instrument for capturing all of the relevant issues related to EM-specific SBP. Most of the survey questions had closed-ended response items, which potentially created artificial constructs that limited respondents’ ability to answer the questions precisely and, in turn, may have influenced the responses generated during the consensus session. Additionally, the survey was designed to develop a ranking system for the educational and evaluation modalities that would serve as the framework and basis for Phase 2 of the project; therefore, no attempt was made to generate an instrument that lent itself to precise statistical analysis.
We did not determine the proportion of overlap between the survey respondents and the conference attendees. If all of the attendees had been survey respondents, we would not have captured the views of any nonresponders. However, conference attendees reported using lectures more frequently than survey respondents did, so we conclude that the overlap was not complete. Furthermore, attendance at the CORD Academic Assembly has traditionally been nationally representative, and it was so at this conference, so we do not think this affected the generalizability of the results.
Another limitation is that the consensus conference was held over 2 days, and there may not have been perfect overlap between the content discussed on Days 1 and 2. Just as there is a substantial overlap in the content and evaluation methods for all six ACGME general competencies, the taxonomy developed for SBP has qualities that overlap with many of the other competencies and, therefore, SBP (and the other general competencies, as well) should not be thought of as a construct in isolation. For example, elements of the “patient safety” domain also require proficiency in patient care, medical knowledge, and professionalism.
We conducted a consensus session at the 2010 CORD Academic Assembly designed to define broadly accepted EM-specific SBP competency skills using a modified Delphi method. From this process, we established several concepts that will be useful for EM residency program directors when teaching or evaluating the residents in the SBP general competency. First, we refined the initial 2002 EM SBP competency definitions and devised a taxonomy of EM-specific domains (disposition, multitasking, and patient safety). These reflect the core SBP KSAs that EM program directors felt were critical for graduates to understand and perform well. Second, we solicited currently successful educational and evaluative tools from a large group of program directors and developed a framework to link them to the defined domains.
Model educational programs were provided as examples of how to achieve the learning goals. Specific observable tasks that can be molded into evaluative anchors for each of the domains were identified, giving program directors a way to help their faculty better understand the competency and to facilitate the provision of constructive feedback. The results of this process allowed us to clarify the definitions of the EM-specific domains and observable tasks and to define best practices for teaching SBP, with respect to both resident education and faculty development, and for evaluating it.
In 2009, Lurie et al.15 evaluated the available published ACGME general competency studies to determine whether they could be measured in a valid and reliable way. They determined that “The peer-reviewed literature provides no evidence that current measurement tools can assess the competencies independently of one another.” They further concluded that “Because further efforts are unlikely to be successful, [they] recommend using the competencies to guide and coordinate specific evaluation efforts, rather than attempting to develop instruments to measure the competencies directly.”15 Rather than provide specific assessment tools for SBP evaluation, our process is aligned with that of Lurie et al. and has resulted in a consensus methodology to provide EM residency program directors with the framework to educate and evaluate SBP in a meaningful way within the context of all six general competencies.
During the initial development of the general competencies, the ACGME recognized the critical need for physicians to be able to navigate the complex and dynamic medical system itself as a key component of successful and effective patient care. Thus it is important to remember that the goal of teaching and evaluating SBP is not to generate additional paperwork and invest limited educational resources in order to measure potentially artificial or nonapplicable proxies simply to satisfy a requirement, even one provided by the ACGME. Rather, EM residency programs have a responsibility to demonstrate to the ACGME and the general public that our residents “get it and can do the job.”