Simulation in Graduate Medical Education 2008: A Review for Emergency Medicine

Authors

  • Steve McLaughlin MD,
  • Michael T. Fitch MD, PhD,
  • Deepi G. Goyal MD,
  • Emily Hayden MD,
  • Christine Yang Kauh MD,
  • Torrey A. Laack MD,
  • Thomas Nowicki MD,
  • Yasuharu Okuda MD,
  • Ken Palm MD,
  • Charles N. Pozner MD,
  • John Vozenilek MD,
  • Ernest Wang MD,
  • James A. Gordon MD, MPA,
  • on behalf of the SAEM Technology in Medical Education Committee and the Simulation Interest Group

From the Department of Emergency Medicine, University of New Mexico (SM), Albuquerque, NM; the Department of Emergency Medicine, Wake Forest University (MTF), Winston-Salem, NC; the Department of Emergency Medicine, Mayo Clinic Rochester (DGG, TAL), Rochester, MN; Gilbert Program in Medical Simulation, Harvard Medical School and the Department of Emergency Medicine, Massachusetts General Hospital (EH, JAG), Boston, MA; New York Methodist Hospital (CYK), Brooklyn, NY; the Integrated Residency in Emergency Medicine at the University of Connecticut (TN), Farmington, CT; the Department of Emergency Medicine, Mount Sinai School of Medicine (YO), New York, NY; the Department of Emergency Medicine, Vanderbilt University Medical Center (KP), Nashville, TN; the STRATUS Center for Medical Simulation, Brigham and Women’s Hospital (Emergency Medicine), Harvard Medical School (CNP), Boston, MA; the Northwestern McGaw Simulation Network, Medical Education, Feinberg School of Medicine, Northwestern University, Division of Emergency Medicine, Evanston Northwestern Healthcare (JV), Evanston, IL; and the Division of Emergency Medicine, McGaw Medical Center of Northwestern University, Evanston Hospital (EW), Evanston, IL.

Address for correspondence and reprints: Steve McLaughlin, MD; e-mail: smclaughlin@salud.unm.edu.

Abstract

Health care simulation includes a variety of educational techniques used to complement actual patient experiences with realistic yet artificial exercises. This field is rapidly growing and is widely used in emergency medicine (EM) graduate medical education (GME) programs. We describe the state of simulation in EM resident education, including its role in learning and assessment. The use of medical simulation in GME is increasing for a number of reasons, including the limitations of the 80-hour resident work week, patient dissatisfaction with being “practiced on,” a greater emphasis on patient safety, and the importance of early acquisition of complex clinical skills. Simulation-based assessment (SBA) is advancing to the point where it can revolutionize the way clinical competence is assessed in residency training programs. This article also discusses the design of simulation centers and the resources available for developing simulation programs in graduate EM education. The level of interest in these resources is evident from the numerous national EM organizations with internal working groups focused on simulation. In the future, the health care system will likely follow the example of the airline industry, nuclear power plants, and the military, making rigorous simulation-based training and evaluation a routine part of education and practice.

The field of health care simulation is rapidly growing, and these techniques are widely used in emergency medicine (EM) graduate medical education (GME) programs. We describe the current state of simulation in EM resident education, including its role in learning and assessment. We focus on four main areas: simulation as a teaching tool, its role in assessment, faculty development, and guidance on developing a simulation program. We conclude with a list of resources and thoughts on the future of health care simulation.

History

Simulation is a technique used in health care education to replace or amplify real patient experiences with contrived scenarios designed to replicate real clinical encounters. These experiential learning sessions are designed to evoke or replicate substantial aspects of the real world in a fully interactive manner.1 In 1959, Marx et al.2 described the first MEDLINE-referenced use of a cardiovascular simulator for the evaluation of prosthetic aortic valves. The first mannequin-based systems were developed by Denson and Abrahamson at USC3 in the 1960s, and later by Gaba and DeAnda at Stanford.4 Gaba and DeAnda described an anesthesia simulator that recreated the operating room environment for physician training and research.4 Using this model, anesthesiology educators pioneered the use of immersive simulation in GME. In the 20 years since Gaba and DeAnda’s original work, the technique of simulation has grown to encompass a variety of tools for providing an augmented learning experience (Table 1).

Table 1. Types of Simulation Technology

Mannequin-based simulators: Mannequin-based, or “high-fidelity,” simulators use sophisticated computer-driven electronic and pneumatic mannequins to provide health care professionals with realistic patients that breathe, respond to drugs, talk, and have vital sign outputs into the clinical monitoring equipment in the treatment room.

Partial or complex task trainers: Provide a highly realistic yet focused experience for the learner and are designed for a specific procedure, such as central line placement, bronchoscopy, or airway management.

Screen-based computer simulators: Programs that run on personal computers or the Internet that allow learners to work through cases using clinical knowledge and critical decision-making skills.

Standardized patients: “Actors” specifically trained to present their medical histories, simulate physical symptoms, and portray emotions as specified by each case.

VR: VR is a simulated, immersive environment created by a combination of computer-based images and interface devices. A VR environment may include visual stimuli, sound, motion, and smell.

VR = virtual reality.

Some of the initial published descriptions of the use of simulation in EM education include a description of team training principles,5 a discussion of human responses to the simulation environment,6 and a description of a simulation-based medical education service.7 There has been tremendous growth in simulation within EM since 2003, highlighted by the creation of interest groups and committees focused on simulation within the major EM organizations. The Society for Simulation in Healthcare (SSH) was established in January 2004 as an umbrella organization for all specialties. Its mission is “. . . (to) lead in facilitating excellence in (multispecialty) health care education, practice, and research through simulation modalities.”8 SSH has an international, multispecialty membership of approximately 2,000. The leadership and membership of SSH include numerous emergency physicians (EPs).

Recent Changes in GME

Initiated in 1999, the Accreditation Council for Graduate Medical Education (ACGME) Outcomes Project was the product of a collaborative effort between the ACGME and the American Board of Medical Specialties (ABMS). To ensure the quality of GME, one of its primary goals was to place “increasing emphasis on educational outcomes in the accreditation of residency education programs.”9 The project is rooted in six general core competencies that can be applied across the spectrum of medical specialties. Each specialty and individual program is permitted to tailor the competencies to the learning requirements of its respective practice environment. There are four phases to this project, with a general timeline of implementation. Residency programs are now in Phase 3, which includes the full integration of the competencies and their assessment into training programs. Residencies are charged with employing resident performance data as the basis for promotion and with using external measures to ensure that residents are performing at levels consistent with their educational objectives. Simulation is listed by the ACGME as a key assessment tool for a number of the core competencies.10

Simulation-Based Teaching in GME

The use of medical simulation in GME is increasing in part because of limitations of the 80-hour resident work week, patient dissatisfaction regarding being “practiced on,” a greater emphasis on patient safety, and the importance of early acquisition of complex skills before actual operative or procedural practice. While a few programs have transformed their residency curriculum to fully integrate medical simulation,11 most have employed simulation less comprehensively.

Over the past decade, there has been a major paradigm shift in the format of medical teaching, as exemplified by the incorporation of problem-based learning within our medical schools.12 GME has also shifted, out of necessity, from traditional apprenticeship to more directed clinical skills training.13 Medical information is growing exponentially, yet the length of medical training remains static. Furthermore, with work hour restrictions, it is unrealistic to expect that residents will have broad exposure to all necessary clinical entities before graduation. Simulation can bridge the gap between the classroom and bedside patient care by deliberately ensuring that all trainees are exposed to core clinical problems.

Using simulation, the teacher can apply many of the principles of adult learning by creating a risk-free environment for residents to learn practical material that is relevant to their day-to-day patient encounters.14 Instead of sitting in a lecture imagining how to intubate a patient, or learning on the job in the emergency department (ED) with a rapidly desaturating patient with chronic obstructive pulmonary disease, the learner can be taught the indications for airway management, practice procedural techniques, and receive immediate feedback, all in a controlled setting. This mode of teaching allows trainees to integrate and apply knowledge for clinical decision-making in immersive environments.

Simulation has been successfully used with emergency medical services (EMS) providers,15 nurses,16 medical students,17,18,19 resident physicians,19 practicing physicians,21 and even high school and college students.22 The anticipated learning goals for the exercise, the expectations for learner performance, and the subject matter for the simulation all depend on the intended audience. Once the target audience has been identified, the main subject matter and the learning goals for the session should be determined. These may center around clinical management principles, such as an approach to airway management, or a specific disease process like the management of congestive heart failure or acute myocardial infarction. Other objectives may include crisis resource management and team training,23,24 or basic science principles that underlie a clinical scenario.18,25 Secondary goals may include illustration of the clinical reasoning process, teamwork and communication skills, review of clinical algorithms such as advanced cardiac life support (ACLS),26 or demonstration of clinical skills and procedures.

Simulation allows learners the opportunity to practice critical, time-sensitive skills without risk to patient or learner. Patients do not want their own clinical care to be used as an opportunity for skill training. A patient preference survey showed that if given the choice, only 42% would let a medical student perform their first venipuncture on them, and only 7% would allow a first-time lumbar puncture. Approximately 50% of patients would never let a medical student perform a lumbar puncture, central line placement, or intubation on them at all.27 Skills such as ACLS can be taught to a trainee away from the distractions of the clinical environment and can allow time for rehearsal before application to a patient encounter. Instructional science research has shown that to ensure the acquisition and maintenance of skills at an expert level, the learner must engage in deliberate practice of the desired educational outcome.28 Deliberate practice must consist of repetition and feedback in a structured environment with a rigorous skills assessment, where the learning objective is appropriate for the level of the learner.29,30 Simulation is an ideal modality for deliberate practice in a wide variety of clinical scenarios, with opportunities for debriefing after each scenario.26 Debriefing sessions are important for the success of simulation experiences and may be most effective when presented in a structured format that addresses the specific aspects important for participant learning.31,32

While simulation may not be a replacement for actual patient encounters, it can offer a much more protected opportunity for review of and reflection on clinical care. Despite close supervision, some aspects of clinical care will always be carried out independently by learners. Some, such as the establishment of rapport, efficient history-taking, informed consent, and discharge instructions, are seldom observed by faculty. Simulation allows educators to observe these skills directly, and allows learners to self-reflect and receive external feedback.33

Pervasive throughout the simulation literature are surveys and feedback from learners indicating that simulation is helpful and that learners typically enjoy it as a training modality. Any teaching adjunct that can inspire and engage a student should be strongly considered as part of a teacher’s repertoire. Through an in-depth review of the simulation education research literature, Issenberg et al.34 suggest features of simulation that lead to effective learning. These features include providing feedback during the learning experience, allowing repetitive practice, providing increasing levels of difficulty, creating clinical variation, and carefully controlling the environment.34 Issenberg et al.34 also discuss the importance of integrating simulation into the overall curriculum, providing team learning opportunities, and clearly defining benchmarks and outcomes. We next present an overview of some of the evidence supporting the various simulation technologies as effective teaching methods.

Evidence

Like medicine, other industries that have utilized simulation are inherently complex and high stakes and rely on expert skill acquisition to deal reflexively with the unexpected. There is limited but growing evidence that simulation training, or improved performance on a simulator, can translate into improved overall patient care.35 Some of this limitation in evidence is due to the relatively new focus on simulation as an educational tool. In addition, many of the important questions in simulation-based education and assessment cannot be answered by traditional biomedical randomized controlled trials. Poor patient outcomes are relatively rare and rest on such an inherently complex and individualized set of factors that power and feasibility issues often preclude a definitive study. Novel research techniques, including those developed and used extensively in the social and educational sciences, will need to inform simulation research when important questions cannot be answered using the routine biomedical paradigm.

Many studies in the medical simulation literature focus on skills training. Surrogate endpoints, such as knowledge retention or subjective improvement, are used to show that simulation can be effective in teaching specific skills or protocols. Success, as rated by task performance, error reduction, reduced training time, or decreased response time, has also been used to quantify the efficacy of training. In one study, paramedic students trained in intubation on a simulator were equally able to intubate in the operating room as students trained in the operating room.36 Simulation training has also been shown to improve adherence to safety concerns in pediatric procedural sedation by nonanesthesiologists.37 One randomized controlled trial does suggest that fourth-year medical students performed better in simulated scenarios after training on the simulator than students who participated in problem-based learning sessions.38 A curriculum featuring deliberate practice dramatically increased the skills of residents in ACLS scenarios.26

Some of the most compelling analyses of real-world advantages of simulation training come from the surgical literature. As part of a “VR to OR” project, a randomized, double-blinded controlled trial found that residents trained using a moderate-fidelity virtual reality (VR) trainer for laparoscopic cholecystectomy were 29% faster in gallbladder dissection. Importantly, VR-trained subjects were nine times less likely to transiently falter and five times less likely to injure the gallbladder or burn nontarget tissue than their counterparts who received standard programmatic training. Mean errors were also six times less likely for the VR-trained group.39 Multiple studies have followed, showing that VR training improves performance in minimally invasive surgery and various endoscopic procedures. A milestone in the progress of simulation-based training was the 2004 Food and Drug Administration mandate for VR training for carotid stent placement.40 Banks et al.41 also reported improved resident performance of laparoscopic tubal ligation compared with controls when laparoscopic simulators were employed in an obstetrics/gynecology residency. Other procedural simulations used in residency training include central venous access,42 cystoscopy,43 and airway management.44 One study of 45 gastroenterology fellows examined the relationship between simulation training and clinical performance of actual colonoscopy.45 This randomized, controlled, blinded, multicenter trial compared 10 hours of colonoscopy simulator training with no training and found greater objective competency in the first 80 colonoscopy procedures on real patients. Interestingly, this increase in competency was seen only in the early phase of training, as the median number of cases to reach 90% competency was the same (160 patients) for both groups. This supports the idea that completing simulation training first may make subsequent training on real patients safer.

Crisis or crew resource management and teamwork skills have been another focus of simulation training. Many institutions are now instituting crew resource management courses or rapid response teams that train using high-fidelity simulation. Although there is little rigorous evidence, multiple survey studies report perceived improvement and successful real-life application of learned skills soon after completing the course. In two small controlled studies, team behaviors seemed to improve after simulator training.46,47

Although high-fidelity simulation in GME is becoming commonplace, its superiority for scenario training over other innovative and interactive teaching modalities, such as case-based learning,48 computer-based learning,49 or video-assisted modalities,19 is unclear. Simulation, like all educational techniques, needs to be matched to an appropriate set of learning objectives. Clearly there is a need for further research to validate the utility of simulation and its correlation with performance in a clinical setting and the overall care of the patient.

Examples in EM

A survey conducted in 2002–2003 revealed that 60% of EM training programs had either “no formal curriculum” or only the “initial development” of a simulation curriculum.20 In addition, less than half of the ACGME-approved EM residency programs possessed a high-fidelity mannequin-based simulation training center, and only 18% of programs with institutional simulation training centers used them to train EM residents.20 Over the past 5 years, however, the use of simulation training in EM has grown tremendously, with over 80% of residency programs now using mannequin-based simulation.49a

Of the programs offering simulation-based teaching, most have added select simulation modalities to their existing curriculum. A few programs have redesigned their educational curriculum to fully incorporate medical simulation. Binstadt et al.11 describe a revamped EM curriculum utilizing the full spectrum of simulation-based teaching: computer-based cases, actors, high-fidelity mannequins, and an advanced skills laboratory with a variety of task trainers. Traditional lectures and seminar-based teaching are reserved for content better suited to that modality. McLaughlin et al.50 describe a 3-year curriculum that involves 15 simulated patient encounters of graduated complexity. The ACGME core competencies are incorporated into the cases, with formative evaluations of the learners. Another simulation-based curriculum specifically addresses the systems-based practice ACGME core competency.51 In contrast to the prior examples, the EM residency at the Mayo Clinic has transitioned 20% of the core curriculum to simulation-based teaching without segregating junior and senior residents for the cases or debriefing sessions.52

Caring for multiple patients simultaneously is essential to the practice of EM and represents a particularly high-risk environment. Simulation scenarios with two or more simultaneous patients are being used to develop multitasking, crew resource management, and decision-making skills without risk to actual patients.53 High-fidelity simulation has even been used to replicate patient encounters during morbidity and mortality conferences.54 Internationally, simulation has been used to challenge the traditional techniques of medical education on an even larger scale. Ziv et al.55 describe a model for cultural change in medical education using simulation-based teaching.

Matching Learning Objectives to the Educational Approach

High-fidelity simulation-based teaching has been compared to a theatrical production, with the need for actors, props, scripts, and people to work behind the scenes.56 It can be very resource-intensive, which is often viewed as its primary limitation. However, simple or low-tech simulation approaches can often be just as effective.57 Binstadt et al.11 describe a “clinical performance pyramid” based on a hierarchical approach to teaching. At its base is knowledge, from which higher-level decision-making develops. Decision-making guides the choice of appropriate actions leading to a need for procedural competence. Finally, the highest level of competence is achieved when one functions successfully as a member of the health care team.11 Simulation may be an inefficient method for teaching simple facts in isolation, but is quite an effective tool for anchoring knowledge within the context of higher-level skills like decision-making, procedural skills, and teamwork training.56 Screen-based computer simulations58 or VR systems59 can effectively complement high-fidelity simulation. The goal of the educator is to choose the optimal educational method for the learning objectives they are trying to achieve. As training programs incorporate simulation-based education into their curriculum,5,24,50 they are working to define which of the ACGME core competencies are most amenable to simulation-based education.

“Patient care,” for example, is highly compatible with high-fidelity mannequin-based simulation, which offers a robust platform for performing a history and physical exam, creating a differential diagnosis, and initiating therapy. Crisis or crew resource management and teamwork skills can be learned and assessed through simulated patient encounters. “Communication and interpersonal skills” can be practiced through cases that incorporate elements, such as end-of-life discussions with family or dynamic team resuscitations. If learners are allowed to access a realistic practice environment as part of the simulated exercise, then they can also develop the competencies of “practice-based learning” and “systems-based practice.” Such resources may include the imagined ability to send a patient to the cardiac lab or real-time access to on-line medical support. Multiple patient scenarios can also present challenges in resource or systems management. Similarly, repetitive scenarios can allow instructors to assess students’ ability to incorporate previously learned knowledge. While “medical knowledge” has been historically learned through reading and attending formal lectures, anchoring this basic knowledge into medical decision-making may be enhanced through simulation. The ability to engage the student in complex dialogue allows for communication encounters similar to those developed as part of standardized actor-patient exercises. Therefore, even “professionalism” can be taught effectively using simulation.

One obvious and widely accepted use of medical simulation is in the area of procedural skill development. Public demand for patient safety is helping to drive the concept that health care providers should be competent with invasive procedural skills before live patient care. Task training with specialized units can allow procedures to be repetitively performed with no patient risk. Detailed instruction can occur throughout the initial learning process, allowing the trainee to explore each step and its potential complications. Repetition, linked with corrective feedback from an expert, can solidify a student’s competence. There are many task trainers being produced for specific procedures, such as ultrasound-guided central venous catheter insertion, airway management techniques, lumbar puncture, and vaginal delivery. As with carotid stent placement, it may soon become standard practice to require the demonstration of procedural competence and prove skill maintenance on a task trainer to acquire credentialing for actual patient care.

Simulation-Based Assessment in GME

Simulation-based assessment (SBA) has been used in health care since the introduction of the Objective Structured Clinical Examination (OSCE) with simulated actor-patients in the early 1980s. High-fidelity mannequin-based simulation now offers the potential to assess learners on more diverse and complex aspects of clinical care. There are numerous potential applications for SBA in EM. As the ACGME Outcomes Project moves into its third phase, full integration, SBA offers promise as a tool for objectively assessing some of the competencies that are more difficult to evaluate via traditional means.10 As alluded to earlier, a module that requires a resident to discuss advance directives with the family member of a critically ill patient would allow assessment of professionalism, interpersonal and communication skills, systems-based practice, and patient care. As part of a crisis resource management course, Gisondi et al.60 assessed the management of ethical dilemmas by trainees. Their performance assessment tool was able to discriminate between experienced and inexperienced residents using several elements of professionalism. Although staffing of EDs by faculty offers an opportunity to observe behaviors more closely than in many other specialties, important aspects of care are not routinely observed. SBA offers the added benefit of allowing the learner to develop and implement his or her own plans without the need for faculty intervention to ensure real-time patient safety.

Evidence

Simulation-based assessment using modern tools and techniques has the potential to revolutionize the manner in which competence is assessed and may serve as a critical tool to accomplish the long-term objectives of the ACGME Outcomes Project. Although the studies validating assessment tools for use in SBA in EM are increasing,60–63 the majority of the work to date has addressed graduate and undergraduate learners in anesthesiology as well as the surgical and procedural specialties.60,61,63–74 To move forward, much more work must be done to validate these tools for EM. Simulation training has been demonstrated to be a useful tool for skill assessment and training. A recent study of physicians in a pediatric training program found that high-fidelity medical simulation can assess a resident’s ability to manage a pediatric airway.75 Assessment of skill development for managing shoulder dystocia found that training with mannequins improved physician and midwife management of simulated dystocia.76,77 Interestingly, both traditional low-fidelity training and computerized high-fidelity simulation were effective in producing some improvement. The challenge remains to determine whether or not these increases in simulated performance translate into improvements in real patient outcomes. Teams of resident physicians from multiple specialties (medicine, surgery, and anesthesiology) were assessed in another study for their ability to follow practice guidelines in the management of sepsis.78 This retrospective review of videotapes identified adequate and inadequate levels of performance in the simulated scenario based on established guidelines. The use of these consensus practice guidelines as a benchmark for performance assessment demonstrates one approach to standardized assessment when customized metrics have not been developed or validated.

At a minimum, simulation assessments should reliably be able to discriminate between novice and experienced clinicians. Evaluation tools previously developed for EM oral examinations appear to retain the ability to discriminate among skill levels when used in a simulator-based testing environment.62 A study of crisis resource management in critically ill patients assessed 60 residents using a novel rating scale and found significant differences between first-year and third-year residents.23 Another study of 54 residents in a pediatric training program found that simulation can reliably measure and discriminate competence.64 Expert consensus was used to develop and validate four simulation case scenarios, and three of the four cases demonstrated statistically and educationally significant differences between first- and second-year residents. In a study of 44 EM residents who were tested on a patient care competency using time-based goals for decision-making, significant differences were found between novice and experienced resident physicians.61

An alternative approach to evaluating clinical competency was demonstrated in a study of internal medicine residents learning ACLS, who were required to achieve mastery of the skill set.26 The investigators assessed second-year residents’ proficiency in ACLS scenarios and required those who did not achieve competency after initial training to continue with additional practice until competency was achieved. While residents required differing amounts of training time to achieve acceptable results, all 41 residents ultimately met the mastery learning goals. For essential components of clinical patient care, such as resuscitation algorithms, these assessment methods may be a step toward ensuring and documenting learner competence. Guidelines for training can then be geared toward outcomes (e.g., competence) rather than processes (e.g., course completion).

Critical to ensuring that assessments fairly and accurately evaluate skills are the scoring rubrics used to determine competence. Studies vary on the superiority of checklist (CL)- or global rating scale (GRS)-based scoring models.63,65,67,72,74 While CLs seem to be particularly useful for rating technical actions, GRSs seem to be better for assessing complex behaviors and decision-making.72–74 In cases where successful management hinges not only on performance of specific tasks, but also on the order in which those tasks are completed, GRSs are better able to capture the aggregate performance.79 Critical action-based global ratings, akin to those used on the American Board of Emergency Medicine Oral Certification examination, may combine aspects of both CL- and GRS-based ratings. This hybrid approach provides concrete guidance when rating complex tasks and offers specific criteria for various performance levels.68,71,73,74
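
As a purely hypothetical sketch of what such a hybrid rating form might look like, the Python example below pairs a critical-action checklist with a single anchored global rating; the item wording, scale anchors, and scoring are illustrative assumptions and are not instruments described in the cited studies.

    # Hypothetical sketch of a hybrid rating form pairing a critical-action
    # checklist with an anchored global rating; items and anchors are
    # illustrative only, not taken from the cited studies.
    from dataclasses import dataclass
    from typing import Dict

    @dataclass
    class HybridRating:
        critical_actions: Dict[str, bool]  # checklist: was each critical action performed?
        global_rating: int                 # anchored scale, e.g., 1 (unsafe) to 7 (expert)

        def checklist_score(self) -> float:
            """Return the fraction of critical actions completed."""
            return sum(self.critical_actions.values()) / len(self.critical_actions)

    rating = HybridRating(
        critical_actions={
            "obtains focused history": True,
            "places patient on cardiac monitor": True,
            "recognizes unstable rhythm": False,
            "initiates transcutaneous pacing": False,
        },
        global_rating=4,
    )
    print(f"Checklist: {rating.checklist_score():.0%}, global rating: {rating.global_rating}/7")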

Also critical to accurate assessment is the number of observations necessary to reliably rate learners. Most studies have found good correlation between independent raters assessing individual performances. Although one study suggested that at least two raters were required to provide reliable ratings, this study was also one of the few suggesting that behavioral ratings were significantly less reliable than CLs.63 This raises the possibility that raters disagreed because criteria for the various ratings were not established a priori. Most other anesthesia studies found that little was gained in terms of reliability by adding more than one rater.66,67,69,74 Most studies have found that learners may perform well on one case but poorly on others when assessed on the management of several cases.66–69,73 As noted above, internal consistency may be an unrealistic goal. Murray found that 6–8 performance samples yielded moderately reliable scores, but felt that more samples would further increase reliability.71,73 The optimal number of samples required for EM assessment is probably in the range of 6–12 cases.69 The American Board of Emergency Medicine uses a total of seven patient encounters, five single and two multiple, for its high-stakes oral exam.80 Trainees must not be penalized for poor performance on a single case, because performance in one scenario is not a good predictor of performance in another.69 This suggests that characteristics pertinent to a specific case should not be generalized to a subject’s performance at large. Unfortunately, the feasibility of SBA is inversely proportional to the number of cases, time, and resources necessary to perform the assessment. Development of highly reliable cases is difficult and time-consuming.64 It will therefore be critical to determine the minimal number of cases required to provide accurate data.
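
The article does not prescribe a particular psychometric model, but the general relationship between the number of cases and score reliability can be illustrated with the classical Spearman-Brown prophecy formula. The short Python sketch below, using a hypothetical single-case reliability of 0.30 chosen only for illustration, shows why composites of 6–12 cases are substantially more reliable than single-case assessments.

    # Illustrative sketch only: the Spearman-Brown prophecy formula projects
    # how the reliability of a composite score grows as more (roughly parallel)
    # simulation cases are added to the assessment.
    def projected_reliability(single_case_reliability: float, n_cases: int) -> float:
        """Project composite reliability across n_cases parallel cases."""
        r = single_case_reliability
        return (n_cases * r) / (1 + (n_cases - 1) * r)

    # Hypothetical single-case reliability of 0.30, chosen only for illustration.
    for n in (1, 4, 6, 8, 12):
        print(f"{n:>2} cases -> projected reliability {projected_reliability(0.30, n):.2f}")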

Matching Assessment Objectives to the Evaluation Approach

Opportunities for SBA to satisfy specific requirements of the Residency Review Committee for Emergency Medicine (RRC-EM) include but are not limited to satisfying the guidelines for patient evaluation, resuscitation, and procedural care.

Chief Complaint Competency.  By having a series of cases that demonstrate a range of pathology for a given chief complaint, it is possible to assess a resident’s ability to obtain data, develop a differential diagnosis, interpret diagnostic studies, and develop treatment plans for a range of conditions related to a given chief complaint.

Resuscitation Competency.  Simulation offers the opportunity to allow a resident to manage a resuscitation from start to finish without the need for intervention on the part of supervising faculty. This allows one to introduce assessment of critical thinking and management strategies that are typically unavailable. Simulated environments also offer an opportunity to assess the nonmedical skills required to optimally lead a resuscitation, such as prioritization, communication, and team management.

Procedural Competency.  The development of realistic procedural simulators may allow for the assessment of resident competency in commonly performed EM procedures.

Annual Competency Assessment.  By controlling the cases being assessed, programs can define expectations for learners based on their year of training to assess readiness for year-to-year progression. This can be done by defining minimal expectations for the management of a given case, based on year of training. Alternatively, one could introduce more complex problems to advanced trainees. Although SBA is potentially well suited to such high-stakes assessment, until validated, it should only be used as one tool to gauge readiness to progress and must be used in conjunction with other established modalities.

In its current form, SBA can be used effectively for formative or summative assessment. When used formatively, SBA can help provide a medium by which faculty can objectively identify areas in which a learner is particularly weak or strong. If videotaped, this can be particularly powerful by allowing the learner to self-identify behaviors and develop strategies to optimize performance. When used for formative feedback where the stakes are low and the goal is to improve performance, the content of the session is more important than the structure. In contrast, when used summatively, the logistic requirements are more stringent, particularly if stakes are high. Before SBA can be incorporated into high-stakes examinations, several criteria must be met to ensure its validity.63–66,81 Specifically, the tool must be reproducible, reliable, valid, and feasible.

Before developing an SBA, the purpose of the activity must be clearly defined.81 An assessment of minimal competency for credentialing would be expected to have tightly defined performance standards with specific competency requirements. In Israel, high-fidelity simulation has been incorporated into the anesthesiology board certification process.67 Given the purpose of such an examination, the pass/fail standard must understandably be set at the minimal bar for acceptable performance. Meanwhile, sessions designed for formative feedback may incorporate behavioral ratings indicative of a wider range of performance levels, allowing the identification of characteristics demonstrating excellence in addition to those indicative of minimal performance standards. Feedback would be aimed at remediating suboptimal behaviors and at identifying and reinforcing positive ones.

Faculty Training in Simulation-Based Education

Emergency medicine faculty should be familiar with the advantages and disadvantages of simulation, just as with any other educational tool. While some faculty may naturally transition from bedside teaching to simulator-based instruction, dedicated training in simulation as a teaching tool is becoming more common in faculty development courses.82 Expertise in this area will become even more important for the current and future generations of EM educators as simulation becomes a more prevalent part of our residency programs.

Simulation in the absence of a carefully thought-out scenario with defined educational goals is rarely as effective as a well-planned and well-executed event.34 Faculty who are interested in providing educational sessions for physicians in residency training must, therefore, be familiar with an approach to creating stimulating and appropriate experiences using high-fidelity simulation. General educational competencies such as objective writing, feedback, and assessment fully apply to simulation-based activities, but alone are not adequate preparation. Faculty who lead teaching initiatives or who are involved in simulation-based assessment will usually require additional focused training with these techniques. This training is available through a growing number of courses that are listed in the resources section of this article. The training can also be provided at the institutional level if local expertise is available. Some elements important to include in faculty education programs in simulation are detailed below.

As a foundation, simulation faculty leadership should have knowledge and skills in adult learning theory,83 objective writing,84 and curriculum design.85 They also need expertise in the clinical content area. The second level of educational expertise in simulation includes basic skills in assessment, such as knowing the advantages and disadvantages of various performance assessment tools and being able to match learner level to objectives and teaching method. Experience with standard setting and giving feedback is also important. The third level of expertise includes simulation-specific skills such as scenario design, debriefing of high-fidelity mannequin-based simulation, and some technical knowledge of simulator operation, capabilities, limitations, and programming. In the “Train the Trainer” course at the University of New Mexico (UNM), these three levels of skills are taught in four half-day sessions comprising 16 total hours of contact time. One area of particular focus is scenario design. UNM’s faculty development course86 recommends an eight-step technique for developing simulation scenarios, as well as the use of a standardized reporting format. The eight steps of simulation design are listed in Table 2; an illustrative sketch of one way to record these steps follows the table. Many EPs have also received training at the Institute for Medical Simulation, an intensive faculty development program sponsored by the Center for Medical Simulation in collaboration with the Harvard–MIT Division of Health Sciences and Technology.

Table 2.   The Eight Steps of Scenario Design (from S.A. McLaughlin)
1. OBJECTIVES: Create learning/assessment objectives.
2. LEARNERS: Incorporate background/needs of learners.
3. PATIENT: Create a patient vignette that meets the objectives and elicits the performance you want to observe.
4. FLOW: Develop flow of simulation scenario including initial parameters, planned events/transitions, and response to anticipated interventions.
5. ENVIRONMENT: Design room, props, and script and determine simulator requirements.
6. ASSESSMENT: Develop assessment tools and methods.
7. DEBRIEFING: Determine debriefing issues and anticipate opportunities for mislearning.
8. DEBUGGING: Test the scenario, equipment, learner responses, timing, assessment tools, and methods through extensive pilot testing.

In summary, training faculty to be competent users of simulation-based technology requires time and is critical to the success of the program. There are a number of resources available for faculty development in simulation. Simulation skills should be seen as a core competency for education faculty, along with the more traditional teaching techniques.

Resident Training in Simulation-Based Education

The major educational organizations in medicine have identified residents’ roles as teachers as an important part of the educational environment. The Liaison Committee on Medical Education (LCME) states that “Residents must be fully informed about the educational objectives of the clerkships and be prepared for their role as teachers and evaluators of medical students.”87 The core competencies from the ACGME list the following objective under practice-based learning and improvement: “Residents must be able to facilitate the learning of students and other health care professionals.” Finally, Morrison et al.88 state that “. . . in the GME Core Curriculum of the AAMC, residents’ teaching skills are vitally important, particularly for those residents who teach third-year medical students in the ‘core’ clinical clerkships.”88 Data indicate that residents spend 20%–25% of their time supervising, teaching, and evaluating medical students and other residents.89–92 From the perspective of students, residents play a significant role as teachers in the first clinical year,93 and students estimate that up to 30% of the teaching they receive comes from residents.94 Simulation-based teaching is becoming an integral part of the educational skill set that we expect residents to acquire during their training. As the utilization of simulation expands, some programs have begun to involve residents as simulation facilitators. The Harvard Affiliated Emergency Medicine Residency at Brigham and Women’s Hospital and Massachusetts General Hospital has incorporated simulation as a component of the senior resident “teaching” rotation. Residents work closely with simulation faculty and staff to develop training modules for rotating medical students and junior residents. This program offers a more robust simulation component for medical student education while providing theoretical, technical, and instructional experience to the residents. Residents are clearly critical members of our teaching teams, and as simulation becomes a more prevalent teaching method, they will need skills in teaching with this modality.

Fellowship Training in Medical Simulation

For residents who want to prepare for academic careers in medical simulation, fellowship training is becoming more widely available. Current simulation fellowships are 1 to 2 years in length and are based at institutions with well-developed simulation programs. The fellowship experience can focus on a variety of areas depending on the fellow’s interests, typically medical education, educational research, patient safety, program administration, or technology development. These fellowships may also include master’s degree coursework or other faculty development courses in relevant topics.

Emergency medicine is well suited for simulation fellowship opportunities. As an example, newly graduated EPs have had the opportunity to train as fellows in Harvard Medical School’s simulation program since 2003, a position administered in collaboration with the Department of Emergency Medicine at Massachusetts General Hospital. Many departments of EM already offer fellowships in medical education or research, and a simulation fellowship is a natural extension of such efforts; more such fellowships are offered every year. Because of the interdisciplinary nature of simulation, mentors and collaborators can be found across departments within an institution. Funding for the fellow is typically provided by a combination of sources, including the fellow’s part-time clinical work, departmental research and development funds, dedicated fellowship stipends, extramural grants, and institutional budgets.

Developing a Simulation Center

The number of simulation centers continues to increase, as does the body of experience on how best to develop a simulation program. At the same time, the relevant technologies have become more compact, and the increased use of wireless devices has improved usability. This combination of accumulated experience and new technology has reshaped simulation education spaces. It is essential that the simulation location and space fit the goals of the program, and it is not clear that a fixed center is the best solution for every program. Developing a high-fidelity simulation laboratory requires significant upfront expenditure on equipment; over the long term, however, even greater expenses accrue for faculty, actors, content development, technical professionals, and administrative staff.95 Several seasoned simulation educators are actively pursuing in situ simulation, using simple storage space to house their devices, which are then deployed in clinical or educational areas on demand. Running scenarios in nontraditional locations, such as the back of an ambulance or in a medical helicopter,96 may also be highly effective. Large groups can experience and benefit from simulation in a lecture hall setting with some creative modifications to cases and presentation formats, in both live18 and prerecorded54 simulation scenarios.

If we follow the best science for the effective use of simulation,19,97 our simulation events will create optimal conditions for deliberate practice and feedback. These should be the guiding principles when a simulation center is planned; the location of the center, the rooms, the layout, and the equipment follow from them, and specific local goals further fine-tune the design. Location is a paramount consideration: if the facility is not located close to where trainees and educators typically spend their time, a program may be severely limited.98

A simulation center generally has four main areas: the simulation area, the control room, the debriefing area, and an area for equipment storage. These four spaces should be laid out so that participants and those touring the facility can pass easily between rooms. Facilities with an observation area, where additional participants watch and critique an ongoing simulation, allow a large group to participate by observing a smaller group of learners at the bedside. The simulation area (or stage) requires sufficient soundproofing that ambient noise and discussion in adjacent rooms do not penetrate, while remaining acoustically friendly to the recording devices essential to debriefing sessions. A direct line of sight between the control area and the simulation area, usually through one-way glass, is recommended; even in a center equipped with video recording and playback, this line of sight helps the simulation director when the action of the participants blocks the camera’s view. The number and type of cables that pass between the control and simulation areas vary, but despite the prevalence of wireless controls and audiovisual devices, the number of cables should not be underestimated. The audiovisual support that these cables bring to the debriefing area is a nearly essential component of simulation and facilitates one of the pillars of simulation education: feedback.

An excellent resource has recently become available to those interested in planning a center. Kyle and Murray99 have published a compendium of simulation resources in their text, Clinical Simulation, Operations, Engineering, and Management, which includes the plans and schematics of several centers of various sizes. The common element remains that the best centers are designed to support the educational goals of their faculty.

Resources

Over the past several years, a number of resources have become available to help develop programs in simulation for GME. In addition, most of the major EM organizations have created internal working groups on simulation. The Society for Academic Emergency Medicine (SAEM) has a new standing committee called the SAEM Technology in Medical Education Committee, which currently focuses on simulation-related activity as directed by the SAEM board of directors. In addition, SAEM has a Simulation Interest Group, which has a member-driven agenda. Finally, SAEM provides a simulation consult service for a nominal fee, which will send experienced faculty to institutions to assist with the development of simulation programs. All of these resources can be found on the Web site for SAEM under the education section: http://www.saem.org/saemdnn/Education/Simulation/tabid/73/Default.aspx.

The Harvard–MIT affiliated Center for Medical Simulation (CMS) offers one of the most complete courses for training faculty in the use of simulation; more information can be found at http://www.harvardmedsim.org/. The CMS course is well known for its rigorous approach and for developing a solid foundation in the underlying science of simulation. The American College of Emergency Physicians (ACEP) offers both a basic and an advanced teaching fellowship for faculty who are interested in becoming better educators. The ACEP Advanced Teaching Fellowship includes one of the few hands-on faculty development courses in the country for simulation. Additional information is available from the ACEP Web site: http://www.acep.org/cme.aspx?id=22382.

The SSH, based in the United States, is the largest multispecialty organization in the world dedicated to the science of simulation. The SSH holds an annual meeting and has an active EM special interest group; more information is available at http://www.ssih.org/public/. SSH sponsors Simulation in Healthcare, the first peer-reviewed academic journal dedicated to the field, launched in January 2006 by Lippincott, Williams & Wilkins. In Europe, the Society in Europe for Simulation Applied to Medicine (SESAM) is the major simulation organization. SESAM was organized in 1994, has close ties with SSH, and shares the journal Simulation in Healthcare. Information on SESAM is available at http://www.sesam-web.org/sesam_home.html.

There are a number of excellent simulation Web sites supported by academic and private institutions around the country. Links to many of these can be accessed from the simulation section of the SAEM Web site.

The SAEM Simulation Case Bank is an excellent resource to help develop simulation scenarios for GME (http://www.emedu.org/simlibrary). Faculty are encouraged to use the case bank’s standardized case format to facilitate sharing of cases between institutions. Published cases are also available via this online resource for faculty interested in using materials that have been developed and tested at other institutions. Opportunities are also available for peer review of these types of emergency simulation materials via the AAMC’s online MedEdPORTAL (http://www.aamc.org/mededportal), which collaboratively sponsors the SAEM-AAMC Simulation Case Collection. Faculty are encouraged to post their own teaching materials via these mechanisms to disseminate their scholarly products and share simulation cases with faculty at other centers.

The Future of Medical Simulation

In the future, a safer health care system will likely follow the example of the airline industry, nuclear power plants, and the military—all making rigorous simulation-based training and evaluation a routine part of education and practice.

Imagine a health care system in which it is uncommon to practice on a patient before practicing on a simulator. Proponents will argue for such an approach as both an ethical100 and a regulatory imperative. Some accreditation bodies67 and hospital committees101 are already beginning to favor simulation-based demonstration of skill, and some insurance companies offer premium incentives for simulation training.102 That may become the norm. The government has already begun to experiment with a future in which simulation is a component of both regulatory and reimbursement considerations.40

In the future, deans and divisions of medical simulation within schools and hospitals may be much more commonplace. EM as a specialty can and should play an important role in exploring the field as a collaborative platform for integrating traditional and modern teaching techniques across health care. Imagine a learning environment where immersive simulation is easily compatible across a broad range of technologies and approaches, where costs decline and fidelity improves as industry, academia, and government all move to advance the field.

Given the excitement that surrounds medical simulation, the topic might even help catalyze innovation in the life sciences, much like the Moon shot helped invigorate the field of engineering.22 Legislative initiatives like a bill currently before the U.S. House of Representatives, “Enhancing Safety in Medicine Utilizing Leading Advanced Simulation Technologies to Improve Outcomes Now (SIMULATION) Act of 2007,”103 may assist in building federal infrastructure to help support the field.
