Assessing Systems-based Practice

Authors


  • The list of breakout session participants can be found as the appendix of a related article on page 1486.

  • This paper reports on a workshop session of the 2012 Academic Emergency Medicine consensus conference, “Education Research in Emergency Medicine: Opportunities, Challenges, and Strategies for Success,” May 9, 2012, Chicago, IL.

  • The authors have no relevant financial information or potential conflicts of interest to disclose.

Address for correspondence and reprints: Esther H. Chen, MD; e-mail: esther.chen@emergency.ucsf.edu.

Abstract

The conceptual definition of systems-based practice (SBP) does not readily translate into directly observable actions or behaviors that can be easily assessed. At the Academic Emergency Medicine consensus conference on education research in emergency medicine (EM), a breakout group presented a review of the literature on existing assessment tools for SBP, discussed the recommendations for research tool development during breakout sessions, and developed a research agenda based on this discussion.

The inclusion of systems-based practice (SBP) as one of the Accreditation Council for Graduate Medical Education (ACGME) core competencies highlights the influence of the health care system on a physician's ability to provide competent and effective patient care.[1] Physicians must be able to collaborate with other members of the health care team, consider costs when weighing risks and benefits, improve system performance by identifying system errors and implementing potential solutions, and continue to advocate for quality patient care. This set of skills is also described in the CanMEDS Physician Competency Framework as elements of the physician role as a collaborator, manager, and health advocate.[2]

Despite a recent shift in the ACGME assessment system to the Next Accreditation System, the framework for assessing milestones within the competencies still requires specific tools to assess SBP.[3] Although the conceptual definition of SBP does not easily translate into observable actions or behaviors, the Council of Emergency Medicine Residency Directors (CORD) developed emergency medicine (EM)-specific evaluation domains for SBP.[4] These include specific actions and behaviors that are directly observable and that facilitate real-time and summative feedback. However, this set of observable behaviors overlooked some pertinent areas, such as situational awareness and participation in systems improvement. Similarly, using the CanMEDS roles of collaborator, manager, and health advocate, Graham et al. also developed a comprehensive list of observable actions that readily translate into assessments.[5]

This article summarizes the authors' review of the current assessment tools for the SBP competency used in EM and non-EM residencies, both within and outside the United States. As a result of several small group discussions during the breakout session on the assessment of observable learner performance at the 2012 Academic Emergency Medicine (AEM) consensus conference on education research in EM, we developed a research plan for assessment tool development.

Process and Consensus

We reviewed the literature on SBP assessment, searching the MEDLINE and PubMed databases by combining the terms “systems based” with “evaluat* OR assess*” and “competenc*” as well as combining “simulation” or “portfolio” with “systems based practice,” yielding 156 and 309 references, respectively. An additional search was performed in the MedEdPORTAL database (https://www.mededportal.org) with the terms “systems based practice” and “assessment,” which resulted in 47 resources. The bibliographies of all relevant articles were reviewed for additional citations.

The most prevalent studies in the current education literature describe innovative approaches to teaching SBP and assessment of the learners' knowledge based on those educational interventions. Some examples include resident involvement in quality improvement projects, implementation of an SBP curriculum (e.g., managed care, root cause analysis, the economic and business elements of the health care system), and resident participation in a patient panel conference where patients discuss their experiences with the health care system.[6-10] Other educational innovations included round-table discussions about nonclinical skills (e.g., prioritization, efficiency, accountability), an interspecialty airway course, and a Health Advocacy Day that showcases community health advocates.[11] There are fewer studies of competency assessments that either reflect a learner's ability and skill or measure actual patient outcomes. These assessment instruments are described in the following section.

Situational Awareness and Teamwork Scoring Systems

Undergraduate and graduate medical educators frequently use low- and high-fidelity mannequins or standardized patients to assess teamwork and communication skills.[12-14] This format incorporates direct observation of the learner by faculty using standardized checklists with scenario-specific SBP competency criteria or standardized instruments (e.g., Behaviorally Anchored Rating Scale, Teamwork and Patient Safety Attitudes Questionnaire, Perceived Stress Scale, Anaesthetists' Non-Technical Skills System, and the Ottawa Crisis Resource Management Global Rating Scale), followed by debriefing and feedback.[12-18] One study showed that simulated teamwork training as an educational strategy improved the quality of team behavior (as observed in the actual clinical setting) when compared to simply working together as a team in the clinical area without the simulation.[12] Otherwise, there is little evidence that simulation training can directly improve a resident's clinical performance.

Direct Observation Assessment Tools

Perhaps the best representation of resident behavior is direct observation of residents providing clinical care in their practice environment, known as workplace-based assessment. Some educators will argue that this is the ideal method of assessing competency because it provides the context of professional practice rather than a simulated or standardized encounter.[19, 20] A systematic review of direct observation assessment tools showed that the mini-clinical evaluation exercise (mini-CEX) had the strongest validity evidence of all the instruments.[21, 22] A widely used instrument in EM is the Standardized Direct Observation Assessment Tool (SDOT), which provides an objective, efficient way to assess learners in the clinical environment.[23, 24] Although this tool has good inter-rater reliability when used by faculty from different institutions, validity evidence is lacking.[24, 25]

Direct observation of residents in simulated environments can also be used to assess behavior. Simulated cases using standardized patients, formally known as the objective structured clinical examination (OSCE), can create authentic interactions for the learner. Performance in OSCEs has been shown to correlate with future clinical performance in medical students.[26] A study of first-year EM residents, however, showed no correlation between the SBP competency scores on the SDOT for five OSCE cases and the cumulative composite SBP score on the ACGME global resident competency form completed by faculty for each resident during the EM rotation over 18 months.[27, 28] The low reliability of global evaluations, rather than a failure to measure SBP in the OSCE, may account for this finding. Other studies using OSCEs have demonstrated reliable measurement of clinical skills across all the ACGME general competencies and the CanMEDS physician competencies.[29, 30] An internal medicine program developed a 12-station Objective Structured System-Interaction Examination (OSSIE), with situations involving patient handoffs, complicated discharges, consultations, cost-effective diagnostics, evidence-based health promotion, and informed consent.[31] This work provides more evidence that simulation may be a valid approach to assessing SBP.[32] While it is difficult to draw definitive conclusions from these studies, they highlight the need for more research on the efficacy of simulation in assessing clinical performance, given the time and personnel resources involved.

Ratings and Survey-Based Instruments

Multisource feedback and ratings of residents offer different perspectives into a learner's interactions with “the village” (the integral players in the health care system) and theoretically can provide a more comprehensive picture of resident performance.[33] Nursing evaluation of residents, patient surveys, chart self-audits, and program director evaluation of learning portfolios were all used in one internal medicine program to assess the SBP competency. On self-assessment, residents reported an improvement in their ability to access and utilize resources, providers, and systems to provide optimal patient care.[33]

While they do not solely assess the SBP competency, global rating scales are widely used to evaluate EM residents and therefore are included here. Eight EM residency programs participated in a study to develop, implement, and obtain evidence to support the validity of a global rating tool.[34, 35] This instrument had acceptable reliability statistics and demonstrated the progressive acquisition of the SBP competency across three years of residency training.

Quality Improvement Projects

Several medical disciplines teach and assess SBP through the development of and participation in quality improvement projects.[6, 36-39] Some studies use objective quality measures or process outcomes for competency assessment, such as a decrease in laboratory fees for pediatric inpatients by altering the ordering system or the development of clinical guidelines to improve clinical care in an intensive care unit.[6, 36] Others use resident self-assessment or a faculty member's global assessment of a group's performance, rather than an individual's performance.[37-39]

One particular study is worth highlighting for its innovative assessment process. Although not an SBP-specific evaluation tool, the health care matrix is a diagnostic tool that incorporates all of the ACGME competencies and the Institute of Medicine aims for improvement.[40] In this study, each internal medicine resident used the matrix to analyze the care provided for one patient and presented this self-analysis to peers, who provided feedback. The matrix assists the resident in identifying the systems issues that prevented the patient from receiving optimal care and in generating ideas for quality improvement projects. It provides an analytical framework that enables the resident to critically reflect on his or her own performance using the competency language, while also identifying ways to improve care based on evidence. A critical step in this process is a faculty facilitator or “expert” who understands the case and the systems issues and can provide formative feedback on the work.

Portfolio

The portfolio can be used for formative and summative assessment of the competencies, although its reliability as an evaluation tool improves with regular feedback from a mentor and well-designed entries that address specific competencies and encourage reflection.[41, 42] Since portfolios are based on actual work, they are considered a more accurate representation of learner performance, and therefore a more valid measure of the competencies, than other assessment tools. Following several proposals for implementation into the EM curriculum, the EM literature currently reports individual resident and faculty reflections on cases that address systems issues (e.g., multicasualty incident, resource utilization, and medical error).[43-45] Achievement of the competency is illustrated by evidence of self-assessment (inherent in the reflection) as well as by the subjective assessment from a faculty member who provides feedback on the entry.

Other programs have coupled portfolio entries with a particular activity that focuses on systems issues. For example, after presenting in morbidity and mortality conference, surgery residents must also complete a form in their portfolio that describes the factors that contributed to the complication and/or mortality, the opportunities for systems improvement, and the specific plan for improvement.[46] The resident then receives feedback during the group discussion and from the surgery residency director who regularly reviews the portfolio entries. In a psychiatry program where portfolios were well integrated into the resident evaluation system, the residents were required to submit their best work that demonstrated acquisition of 13 essential psychiatric skills, 10 of which met the SBP competency.[47] Two faculty members evaluated each portfolio and assessed whether the residents satisfied the competency. Portfolios have also been linked to an active learning experience in which a resident assumes the role of a parent faced with complex life situations while the resident's colleagues (acting as the physicians) prioritize the problems and access community resources to address them.[48] The resident then documents the scenario experiences in his or her portfolio to demonstrate competency.

Finally, the portfolio can be used to track a resident's progress on a quality improvement project and provide a summative assessment of learner performance. At one institution, a multidisciplinary group of radiology and EM faculty and residents worked together to improve the efficiency of the radiology process starting from ordering a diagnostic study to receiving the final reading.[49] A needs assessment was performed, cases were discussed jointly during conferences, and suggestions for improvement were made. The outcomes achieved by this project included systems changes to improve communication between the two departments, decreased patient transport time, and improved understanding of the difficulties in providing clinical services in both departments. Participation in this problem-solving project was documented by self-reflection entries in the resident portfolio that was regularly reviewed by the program directors. This “plan-do-study-act” approach to systems improvement taught SBP concepts, improved the system with measurable clinical outcomes, and reflected a learner's performance.[50]

Although we have described several assessment methods for SBP, if we use the hierarchy of competencies described by Miller[51] as the conceptual framework by which we evaluate these instruments, we find several limitations. First and foremost, many SBP studies are focused on effective instructional methods or program environments, rather than determining the ability of an individual learner.[33, 46, 48, 52] We need to link assessment to the skills of individual learners when performing or applying the knowledge in clinical practice, which is the highest level of competency in Miller's schema. This perspective supports the use of workplace-based assessments. Therefore, a good assessment process (Figure 1) should be context-specific (i.e., using a sample of cases rather than a single case); require sophisticated judgment by physicians to account for context rather than checklists; occur in the real workplace rather than a standardized, controlled environment; and use a rating scale with constructs of developing clinical sophistication and independence rather than conventional gradations of performance (e.g., unsatisfactory to superior, below expectations to above expectations).[53] More research on developing scales using anchors with observable behaviors to demonstrate progressive clinical independence is necessary for this assessment process. This measurement and achievement of developing behaviors, or milestones, is now a key component of ACGME's Next Accreditation System.[3]

Figure 1.

The assessment of competency in medical education. The arrow indicates the goal of a good assessment process that should move toward the highest level of competency, as assessed by physician judgment based on direct observation of the learner in the actual workplace.

A second limitation of the current evaluation instruments is that only a few have shown that the scores are valid in EM. Data obtained from tools that have not undergone rigorous testing may not actually measure the constructs of interest.[23, 54] We are only assuming that instruments that have been tested elsewhere may be generalizable to the EM residency setting.

Finally, some educators believe that each competency should not be assessed independently of the other competencies because these behaviors cannot be isolated and must be evaluated in context with interpersonal and communication skills and practice-based learning and improvement.[4, 55] For example, residents who are good at managing a team during resuscitations of acutely ill patients are good communicators, demonstrate good patient care and problem-solving skills, and understand how to use resources effectively. However, it is unclear whether a more global view adequately evaluates all the content domains of the SBP competency.

Recommendations

  1. Situational awareness and teamwork scoring systems: Currently, only a few EM studies of simulation use rating systems other than checklists of critical actions to assess resident performance.

     Proposed research agenda:

     1. Collect evidence for the validity of the various teamwork and situational awareness rating scales using simulation in EM education.
     2. Demonstrate an association between performance in simulation and patient care.

  2. Direct observation assessment tools: Of the many available direct observation assessment tools, the SDOT is the most widely used in EM.

     Proposed research agenda:

     1. Refine the current direct observation assessment tools (e.g., mini-CEX, SDOT) using scales with progressively developing observable behaviors or milestones.
     2. Determine evidence of validity for these assessments, including the optimal number of observations and the association with other performance outcomes.

  3. Ratings and survey-based instruments: Currently, one global rating scale has validity evidence for the assessment of EM resident performance. Patient surveys and nursing surveys used to provide feedback on resident behavior have not been rigorously tested.

     Proposed research agenda:

     1. Develop forms to be used by multiple evaluators (multisource feedback) to measure EM resident performance in the SBP domains using scales with progressively developing observable behaviors or milestones.
     2. Determine evidence of validity for the effect of multisource feedback on resident performance.

  4. Quality improvement projects: Currently, self-assessment, global assessment by faculty, and achievement of the clinical outcome of a quality improvement project are used to assess resident competency.

     Proposed research agenda:

     1. Develop structured self-assessment tools that describe residents with differing levels of engagement, ability, and skill in quality improvement.
     2. Develop tools for assessing quality improvement projects (e.g., a checklist for chart review) using objective quality measures for the SBP competency.

  5. Portfolio: Portfolios are learner-generated documentation of residency-specific activities that reflect specific aspects of the ACGME competencies.

     Proposed research agenda:

     1. Develop self-assessment tools that encourage informed self-assessment and reflection on specific SBP domains and behaviors.[56]
     2. Develop structured scoring rubrics for faculty to use in assessing portfolio reflections.

Summary

Systems-based practice is a complex concept that highlights the influence of the health care system on a physician's ability to provide good patient care and requires the physician to be a good manager, collaborator, and patient advocate. While it may be challenging to assess SBP independently of the other competencies, a comprehensive assessment should be multimodal and include direct observation by expert clinicians in the actual workplace.
