“Medical Simulation Bill before Congress; National Consensus Conference to Identify Key Priority Areas . . .” (Society for Academic Emergency Medicine (SAEM) press release, May 28, 2008)

Together with colleagues across health care, emergency medicine (EM) has played an important national role in the development of medical simulation as an academic discipline, a field in which realistic artificial environments are used to practice medical skills, procedures, and protocols. In fact, the field is currently the subject of a bill before Congress designed to enhance federal support for simulation initiatives nationwide.1 Across the country, the use of high-fidelity mannequin simulators among EM residency programs has increased from 29% to 85% over the past 5 years;2 entire residency curricula at some programs are now structured around simulation.3 Given this extraordinary growth, the 2008 Academic Emergency Medicine (AEM) Consensus Conference, “The Science of Simulation in Healthcare: Defining and Developing Clinical Expertise,” was organized to help define a national research agenda for maximizing effective use of simulation across undergraduate, graduate, and continuing medical education. Because EM operates at the intersection of multiple specialties, we hoped the conference would be of multidisciplinary interest.

Background

In 1999, the Institute of Medicine’s report “To Err Is Human”4 identified patient simulation as an opportunity for enhancing medical safety in the same way that flight simulation is used to enhance quality in aviation. In 2002, a Simulation Interest Group formed within the Society for Academic Emergency Medicine (SAEM) to explore EM’s use of dynamic simulation technology. At that time, the focus was primarily on the use of sophisticated robot-mannequins to train anesthesiologists in operative crisis management. An international Society for Simulation in Healthcare was formed in 2004 with input from the SAEM Interest Group leadership, including membership on the new society’s board of directors. In 2005, the SAEM Board of Directors convened a Simulation Task Force to intensify SAEM exploration in this area, which by then had attracted the attention of federal funding agencies, as well as legislative and regulatory bodies. In 2007, the Task Force became a standing committee focusing on technology in medical education.

The field of simulation within SAEM is now a unified effort of both the original interest group and the task force/committee structure, which maintains an informational website, a case bank in collaboration with the Association of American Medical Colleges (AAMC) MedEdPORTAL, a biannual newsletter, and a consultation service to assist academic development in the field. The SAEM simulation groups have worked to collaborate with other EM groups on the topic, including the American College of Emergency Physicians (ACEP), the American Academy of Emergency Medicine (AAEM), and the EM Council of Residency Directors (CORD). These efforts complement dedicated simulation initiatives that have also emerged as part of other medical and surgical specialty society agendas, including those of the American College of Surgeons and the American Society of Anesthesiologists.

Conference planning and logistics

After publishing an initial review and assessment of research opportunities in the field,5 the SAEM Simulation Task Force proposed simulation as the special topic for the annual Consensus Conference sponsored by the editors of AEM. The topic was selected in a competitive process; the project required 12 months of planning with the assistance of both an AEM Consensus Conference Planning Committee and an expert Faculty Advisory Group (see Appendix A for full listings). The end result was the full-day conference upon which this special issue of the Journal is based, held in Washington, DC, on May 28, 2008.

Approximately 325 individuals attended the event (see listing of registrants included later in this issue), and nearly 40 original manuscript submissions competed for a spot in this issue; both set records for an AEM Consensus Conference and reflected our hope that the diverse scope of EM practice would enable the work to apply broadly across specialties. Financial support for the conference was provided by the Agency for Healthcare Research and Quality (AHRQ), the Josiah Macy, Jr. Foundation, the AAMC’s MedEdPORTAL, the Risk Management Foundation of the Harvard Medical Institutions, and over 30 medical organizations and academic departments nationwide, along with unrestricted educational grants from major simulator manufacturers (see full listing in Appendix B). The program was also endorsed by the Society for Simulation in Healthcare, ACEP, and AAEM; the President of the American Board of Medical Specialties (ABMS) began the day with a welcoming keynote.

Conference structure and content

An expert panel of cognitive scientists and educators was recruited to deliver keynote addresses and help guide our deliberations (see Appendix A for full listing). The morning was dedicated to discussing how simulation can help develop expertise (“teaching” through deliberate practice sessions), and the afternoon was devoted to discussing how simulation can help define expertise (“testing” through formative and summative assessment modalities and metrics). The lunch sessions explored training and its transfer to the “real world,” focusing on simulation-based approaches to team training.

After both morning and afternoon keynotes, participants broke out into one of four 90-minute “consensus discussion groups” representing four domains of medical expertise:

  • Individual/cognitive expertise: global provider competency (Consensus Track 1);
  • Group expertise: effective teamwork and communication (Consensus Track 2);
  • Technical expertise: procedural and surgical skill (Consensus Track 3);
  • Systems expertise: effective simulation at the organizational level (Consensus Track 4).

Breakout groups ranged from approximately 20 to 130 participants depending on individual interest (see attendance detail below); attendees could participate in different tracks between the morning and afternoon sessions. Each session was audio-recorded as a post hoc aid to the session leaders in reviewing the discussion, defining consensus items, and preparing their proceedings papers and consensus topic reports (included in this Journal issue).

To facilitate consensus discussion, conference organizers and participants were also invited to post discussion items on a shared website both before and after the conference (http://www.patientsimulation.net/). The conference venue was equipped with wireless Internet access so that participants could e-mail questions and comments to the session leaders in real time and in follow-up after the session. This was one way of encouraging diverse communication in addition to routine oral discussion and debate, which ranged from “town hall”–style meetings to panel presentations with audience participation. Each consensus paper reflects significant review and consideration among individual writing teams who formed their own consensus prior to the conference (up to 10+ authors per writing group), which was then refined and complemented by the larger on-site discussion.

Plenary sessions

The conference began with special remarks by Kevin Weiss, MD, MPH, President and CEO of the American Board of Medical Specialties, who spoke on the role of simulation within the board certification process. He suggested a significant potential for simulation to impact medical training and certification across disciplines in the near future.

K. Anders Ericsson, PhD, next outlined the theory of expert performance, describing “deliberate practice” as the key ingredient for expertise development in any high-performance domain, including medicine. William McGaghie, PhD, then described how simulation equated to deliberate practice in medicine and outlined an approach to simulation-based educational research, citing a current body of work supporting the effectiveness of simulation teaching modalities.

The lunch session began with Eduardo Salas, PhD, who outlined the empirical basis for team training and described simulation-based approaches in medicine. Robert Hanscom, JD, then described the real-world application of simulation-based team training within a large health system, explaining the positive actuarial experience of a medical malpractice insurer who offers incentives for simulation training.

The afternoon session began with remarks by Jenny Rudolph, PhD, who described the role of simulation-based debriefing as a formative assessment tool. Jack Boulet, PhD, then described the promise of simulation for high-stakes summative evaluation and discussed the psychometric and logistical issues of using technology-enhanced simulation for board certification; his remarks brought us full-circle back to Dr. Weiss’s introductory comments.

In addition to the plenary papers included in this issue of the Journal, the presentations themselves can be found online at http://www.patientsimulation.net/.

Research questions and directions

The overriding question upon which the consensus groups deliberated was: “What are the most effective approaches to simulation-based education (morning sessions) and evaluation (afternoon sessions) in undergraduate, graduate, and continuing medical education?” The charge to each group was to review the current literature, to identify high-priority gaps in current knowledge, and to outline research strategies to answer the pressing questions in the field. Below is an edited version of key questions and research directions that emerged from each consensus session, along with a notation of how many individuals participated in on-site consensus building; morning (A) sessions are paired with the afternoon (B) sessions in the same expertise domain.

Consensus Track 1—Individual/Cognitive Expertise: Global Provider Competency

Track 1A—Education (William Bond, MD, Group Leader [∼130 Discussants]).  Dr. Bond’s group identified the following broad areas of hypothesis testing as research priorities in the development of individual expertise:

  1. How can simulation help identify expert behavior?
  2. Can simulation produce more competent physicians? In a shorter time frame?
  3. What is the optimal teaching and debriefing strategy for simulation cases?
  4. Can simulation be used to diagnose learning deficits and performance problems?
  5. Can simulation be effectively used as a remediation tool?
  6. Can we prove that transfer of learning to the real environment has occurred?

This group explored the role of intuitive versus analytical thought in individual cognition and discussed how the deconstruction of complex tasks can be facilitated in the simulation lab. Simulation not only provides opportunities for deliberate practice and reflection, it also allows for the presentation and study of a diversity of case material and learning problems. Dr. Bond’s team posed numerous questions for future research and provided an extensive review of what is known about the development of individual expertise. They concluded that diverse collaboration among academic communities, including medicine, cognitive psychology, and education, will be required to illuminate, refine, and test hypotheses in this area.

Track 1B—Assessment (Linda Spillane, MD, Group Leader [∼50 Discussants]).  Dr. Spillane’s breakout group considered the following questions in exploring a research agenda focused on the assessment of individual expertise:

  1. What competencies can/should be assessed using simulation-based assessment?
  2. How should we assess performance?
  3. What factors may threaten the validity of simulation-based assessment?
  4. Does performance on simulated patients accurately reflect the quality of care provided to actual patients, and how can this be assessed?
  5. How often should practicing physicians be evaluated, and does simulation-based assessment have a role in continuing assessment and credentialing for practicing physicians?

The research agenda proposed by the group seeks to prioritize the competency domains best addressed by simulation-based assessment in terms of validity, reliability, and practicality, and to explore how those domains are optimally measured. The group identified the need to develop rigorous rater training programs, explore the impact of lab fidelity on scoring systems, and identify quality-of-care benchmarks that may link simulator-based performance to real-world clinical care. The group concluded that EM is well positioned to conduct research into the use of simulation-based assessment for high-stakes testing across competency domains.

Consensus Track 2—Group Expertise: Effective Teamwork and Communication

Track 2A—Education (Rosemarie Fernandez, MD, Group Leader with Paul Phrampus, MD [∼80 Discussants]).  As a method of grounding future research in teamwork, Drs. Fernandez and Phrampus’s group proposed use of a standardized taxonomy of team competencies (see separate article in this issue). The group discussed the following key questions:

  1. What team-oriented competencies are most relevant to EM practice?
  2. How important is leadership training for EM physicians?
  3. What components are necessary for an effective team training program?
  4. What are the biggest challenges to implementing team training programs in EM?
  5. What is the optimal approach to debriefing and the provision of feedback?
  6. What kind of simulation technology is most effective in team training programs?

Future work to define and teach optimal team leadership skills in EM was considered essential. Individual initiatives should be tailored to a local needs analysis and grounded in sound instructional methodology and debriefing techniques. The fidelity of the simulated experience should be customized to each set of learners and objectives in a manner that fosters measurable improvement and transfer to the clinical environment.

Track 2B—Assessment (Marc Shapiro, MD, Group Leader [∼80 Discussants]).  Dr. Shapiro’s group discussed the following questions in considering the measurement of group expertise:

  1. How can simulation exercises diagnose team-based weaknesses and strengths?
  2. What are the critical principles for simulation-based team training that will improve clinical performance?
  3. Is there a “criterion standard” team performance metric? How good are existing metrics?
  4. How does one create simulation scenarios to measure leadership behaviors?

The group emphasized that team competencies, just like individual competencies, must be well-defined to be measured. Simulation scenarios should be designed to produce triggers for team behaviors; in turn, meaningful team performance metrics will then guide effective feedback. The group also embraced a standardized taxonomy and agreed that leadership training should be a priority area for EM-based simulation and team training.

Consensus Track 3—Technical Expertise: Procedural and Surgical Skill

Track 3A—Education (Ernest Wang, MD, Group Leader [∼60 Discussants]).  Dr. Wang posed his questions within the framework of the Core Content of EM and focused on the list of procedural skills within the scope of EM practice. His group identified the following key questions for future research in developing procedural expertise:

  1. How much training is enough?
  2. What is the ideal balance of part versus whole practice?
  3. What is the ideal balance of block practice versus distributed practice?
  4. What instructional methods will best limit skill decay for specific procedures?
  5. How often must procedures be practiced, once mastered, to limit skill decay? What is the retention interval for different procedures?
  6. Does the complexity of the procedure influence the rapidity of skill decay?
  7. Is overlearning necessary?
  8. Does proficiency on a task trainer translate to the clinical setting?
  9. Is mastery necessarily achievable in a 3- to 4-year training program, or is a minimum acceptable level of performance more realistic?

Consensus discussion focused on the simulation platforms that currently exist, and the literature available, to support various training techniques. It was acknowledged that individual procedural studies may not be generalizable, given the uniqueness of individual tasks and trainers, and that virtual reality simulation has been most successful for screen-based procedures. The group recommended a programmatic focus on inherently high-risk procedures and important low-frequency procedures, with explicit emphasis on relevant instructional theory.

Track 3B—Assessment (Richard Lammers, MD, Group Leader [∼70 Discussants]).  Dr. Lammers’s group focused on the following research questions related to the measurement of procedural expertise:

  1. What are the best methods for measuring technical performance?
  2. How should performance standards for procedural competencies be set?
  3. How can we best evaluate the quality of simulation training and assessment tools?
  4. What are the optimal conditions for learning procedural skills using simulators?
  5. How effectively are simulated procedure skills transferred to real patients?
  6. What factors influence skill retention, and what is currently known about them?

Several consensus statements were produced by the group, recommending procedure-specific evaluations and scoring protocols and appropriate methods for standard-setting and implementation. The group suggested future research into identifying factors that influence skill acquisition and transfer into the clinical environment, as well as defining learning and decay curves for individual procedures.

Consensus Track 4—Systems Expertise: Effective Simulation at the Organizational Level

Track 4A—Microsystems (Leo Kobayashi, MD, Group Leader [∼30 Discussants]).  Dr. Kobayashi’s group identified methodologies for studying EM microsystems and addressed the following questions:

  1. How should simulation be applied to improve and study EM microsystems? Is there a rational way to propose and determine which types of simulation, in what setting, at what time, for whom, and with what objectives and outcomes will prove useful for EM microsystems improvement?
  2. What research methodologies should be applied to study the use of simulation and establish its value for EM microsystems?
  3. How does one elicit microsystem processes in an integrated manner to unify the simulated care of individual patients (e.g., interactive computer-controlled mannequin) with the simulation of larger-scale systems (e.g., PC-based modeling); i.e., are microsystem and macrosystem simulations reconcilable?
  4. What are some limitations of using a systems approach with in situ simulations?

This group argued that EM systems require evaluation just as medical devices require testing, and that relevant outcome measurements can be made with both qualitative and quantitative techniques. The group provided suggestions for leveraging current techniques to improve future systems and noted the special considerations of studying complexity in EM.

Track 4B—Macrosystems (Amy Kaji, MD, PhD, Group Leader [∼20 Discussants]).  Dr. Kaji’s group discussed the use of simulation to improve our understanding of systemwide patient safety, disaster care, and communication. The following questions were posed as the group reviewed the current literature and presented recommendations for study topics:

  1. What objective methods and measures may be used to demonstrate that simulator training actually improves patient safety?
  2. How can we effectively feed back information from error reporting systems into simulation training and thereby improve patient safety?
  3. How can simulator training be used to identify disaster risk and improve disaster response?
  4. How can simulation be used to assess and enhance hospital surge capacity?
  5. What methods and outcome measures should be used to demonstrate that teamwork simulation training improves disaster response?
  6. How can the interface of systems be simulated?

The group recommended simulating key elements in clinical care to analyze systemwide safety and efficiency, using real-world quality data to inform simulation training and evaluation, and applying simulation modalities to diagnose and improve disaster response and communication.

Summary

The 2008 AEM Consensus Conference was designed to help define future directions in simulation-based research in EM and across health care. The conference examined approaches to both developing and defining expertise in four domains: individual/cognitive, group, technical, and systems. The timing of this conference in Washington, DC, with a Simulation Bill before Congress, represented a unique opportunity to help identify priority funding areas and to collaborate with colleagues across disciplines. We hope you find the proceedings of the conference, the consensus topic papers, and the original articles in this special issue of the Journal useful in helping to advance the science of simulation in health care.

Acknowledgments

We thank the Conference Planning Committee, the Faculty Advisory Group, and the AEM/SAEM home office and leadership for their support, and acknowledge the significant effort and excellent work of members of the Simulation Task Force, Interest Group, and Technology in Medical Education Committee, all of whom made this conference possible. A special thanks to Glenn Hamilton, MD, SAEM President (2005–6), who founded the Simulation Task Force and whose dedication and mentorship have guided these efforts, and to Michelle Biros, MD, David Cone, MD, and Amy Kaji, MD, PhD, who provided AEM editorial and leadership support throughout the project.

References

Appendices

Appendix A: Conference planning and leadership

Conference Co-chairs

James A. Gordon, MD, MPA (Massachusetts General Hospital/Harvard Medical School)

John A. Vozenilek, MD (Evanston Northwestern Healthcare/Northwestern University)

Conference Planning Committee

Michelle Biros, MD, MS (AEM Editor-in-Chief)

William Bond, MD (Founding Chair, SAEM Simulation Interest Group)

David Cone, MD (AEM Senior Associate Editor)

Gary Gaddis, MD, PhD (AEM Editorial Board)

Robert Gerhardt, MD, MPH (AEM Editorial Board)

James Gordon, MD, MPA (Chair, SAEM Simulation Task Force and Technology in Medical Education Committee)

Maryanne Greketis (SAEM Meetings Coordinator)

Amy Kaji, MD, PhD (AEM Editorial Board Liaison)

Steve McLaughlin, MD (Secretary, SAEM Simulation Interest Group and Chair, CORD Simulation Committee)

Barbara Mulder (SAEM Associate Executive Director)

Linda Spillane, MD (Founding Secretary, SAEM Simulation Interest Group)

John A. Vozenilek, MD (Past Chair, SAEM Simulation Interest Group)

Stephen Wall, MD, MS (AEM Editorial Board)

Ernest Wang, MD (Chair, SAEM Simulation Interest Group)

Expert Faculty Advisory Group

Consensus Topic Leaders

William Bond, MD (Lehigh Valley Hospital and Health Network)

Rosemarie Fernandez, MD (Wayne State University)

Amy Kaji, MD, PhD (Harbor-UCLA)

Leo Kobayashi, MD (Brown University)

Richard Lammers, MD (Michigan State University)

Paul Phrampus, MD (University of Pittsburgh)

Marc Shapiro, MD (Brown University)

Linda Spillane, MD (University of Rochester)

Ernest Wang, MD (Evanston Northwestern Healthcare/Northwestern University)

Keynote Speakers

Jack Boulet, PhD (Foundation for Advancement of International Medical Education and Research)

K. Anders Ericsson, PhD (Florida State University)

Robert Hanscom, JD (Harvard Risk Management Foundation/RMF-CRICO)

William McGaghie, PhD (Northwestern University)

Jenny Rudolph, PhD (Harvard/Massachusetts General Hospital)

Eduardo Salas, PhD (University of Central Florida)

Kevin Weiss, MD, MPH (American Board of Medical Specialties)

Appendix B: Conference funding and support

Major funding support was provided by:

Federal Government:

Agency for Healthcare Research and Quality (AHRQ) U.S. Department of Health and Human Services Rockville, MD

Foundations:

Josiah Macy, Jr. Foundation New York, NY

Risk Management Foundation (CRICO/RMF) of the Harvard Medical Institutions Cambridge, MA

Medical Associations:

Association of American Medical Colleges (AAMC) MedEdPORTAL Washington, DC

Industry/Manufacturers:

Medical Education Technologies, Inc. (METI) Sarasota, FL

Laerdal Medical Corporation Wappingers Falls, NY

Academic/Affiliated Programs and Institutions:

Rhode Island Hospital Medical Simulation Center Department of Emergency Medicine Warren Alpert Medical School of Brown University Providence, RI

Mayo Clinic Department of Emergency Medicine Rochester, MN

University of Texas at Houston Medical School Surgical & Clinical Skills Center Houston, TX

EM-STAT Center SUNY Upstate Medical University Department of Emergency Medicine Syracuse, NY

Center for Simulation Technology and Academic Research (CSTAR) Evanston Northwestern Healthcare Division of Emergency Medicine Evanston, IL

Lehigh Valley Hospital and Health Network Department of Emergency Medicine Allentown, PA

Yale University School of Medicine Section of Emergency Medicine Department of Surgery New Haven, CT

Allegheny General Hospital Department of Emergency Medicine Emergency Medicine Residency Program Pittsburgh, PA

Michael S. Gordon Center for Research in Medical Education University of Miami Miller School of Medicine Miami, FL

Mount Auburn Hospital Department of Emergency Medicine Cambridge, MA

Additional funding support was provided by:

Peter M. Winter Institute for Simulation, Education and Research (WISER) University of Pittsburgh School of Medicine Pittsburgh, PA

University of South Florida Emergency Medicine Residency Simulation Program Tampa, FL

University of Rochester Medical Center Department of Emergency Medicine Rochester, NY

Wake Forest University Emergency Department Simulation Program Winston-Salem, NC

Mount Sinai School of Medicine Department of Emergency Medicine New York, NY

Kalamazoo Center for Medical Studies Simulation Center and Bioskills Lab Michigan State University Kalamazoo, MI

University of California, Davis, School of Medicine Department of Emergency Medicine Sacramento, CA

Drexel University College of Medicine Department of Emergency Medicine Philadelphia, PA

Kyoto Kagako Co., Ltd. Kyoto, Japan

Regions Hospital Emergency Medicine Department Residency Simulation Program St. Paul, MN

Advocate Christ Medical Center Hope Children’s Hospital Department of Emergency Medicine Simulation Center Oak Lawn, IL

Hartford Hospital Simulation Center Hartford, CT

STRATUS Center for Medical Simulation Brigham and Women’s Hospital Department of Emergency Medicine Boston, MA

Massachusetts General Hospital Department of Emergency Medicine Boston, MA

University of Michigan Medical Center Department of Emergency Medicine Ann Arbor, MI

Wright State University Boonshoft School of Medicine Department of Emergency Medicine Kettering, OH

Institute for Medical Simulation Center for Medical Simulation Cambridge, MA

CSESaR Center for Simulation Education & Safety Research University of Florida at Shands Medical Center Jacksonville, FL

MetroHealth Medical Center Department of Emergency Medicine Emergency Medicine Simulation Program Cleveland, OH

CIMIT Center for Integration of Medicine and Innovative Technology Boston, MA

Gilbert Program in Medical Simulation Harvard Medical School Boston, MA

The Conference was also endorsed by:

The Society for Simulation in Healthcare

American College of Emergency Physicians

American Academy of Emergency Medicine