Keywords:

  • simulation;
  • postgraduate medical education;
  • emergency medicine

Abstract

Objectives:  The use of medical simulation has grown dramatically over the past decade, yet national data on the prevalence and growth of use among individual specialty training programs are lacking. The objectives of this study were to describe the current role of simulation training in emergency medicine (EM) residency programs and to quantify growth in use of the technology over the past 5 years.

Methods:  In follow-up of a 2006 study (2003 data), the authors distributed an updated survey to program directors (PDs) of all 179 EM residency programs operating in early 2008 (140 Accreditation Council for Graduate Medical Education [ACGME]-approved allopathic programs and 39 American Osteopathic Association [AOA]-accredited osteopathic programs). The brief survey borrowed from the prior instrument, was edited and revised, and was then distributed at a national PDs meeting. Subsequent follow-up was conducted by e-mail and telephone. The survey concentrated on technology-enhanced simulation modalities beyond routine static trainers or standardized patient-actors (high-fidelity mannequin simulation, part-task/procedural simulation, and dynamic screen-based simulation).

Results:  A total of 134 EM residency programs completed the updated survey, yielding an overall response rate of 75%. A total of 122 (91%) use some form of simulation in their residency training. One hundred fourteen (85%) specifically use mannequin-simulators, compared to 33 (29%) in 2003 (p < 0.001). Mannequin-simulators are now owned by 58 (43%) of the programs, whereas only 9 (8%) had primary responsibility for such equipment in 2003 (p < 0.001). Fifty-eight (43%) of the programs reported that annual resident simulation use now averages more than 10 hours per year.

Conclusions:  Use of medical simulation has grown significantly in EM residency programs in the past 5 years and is now widespread among training programs across the country.

High-fidelity simulation using robot-mannequins, partial-task trainers, and screen-based computer programs has emerged across specialties as an important training modality in graduate medical education, particularly in emergency medicine (EM). Simulation enables residents to diagnose and manage both common and rare diseases, practice high-risk procedures, and improve skills such as teamwork and communication in a safe learning environment.

However, there remains little documented evidence on the prevalence and growth of simulation use in EM training programs over the past 5 years, arguably the period of highest growth in the field. A survey by McLaughlin et al.1 of EM programs from September 2002 to June 2003 found that only 29% of residencies used high-fidelity mannequin-simulators to train their residents and that only half of these programs used them more than once per year. Of the 114 original respondents, only 8% reported that EM faculty were primarily managing institutional simulation efforts.

Our objectives in this study were to describe the current role of simulation training in EM residency programs and to quantify growth in use of the technology over the past 5 years. We also sought to assess perceived barriers to the use of simulation for education.

Methods

Study Design and Population

A national survey of residency program directors (PDs) in EM was conducted. The survey was carried out as a follow-up and expansion of an earlier survey of human simulation training in EM residency programs conducted 5 years prior.1 Three of the authors from the original study participated in this follow-up survey. The study received exemption from informed consent requirements from the Institutional Review Board of Mount Sinai Hospital (New York, NY).

Survey Content and Administration

Surveys were distributed to all 140 Accreditation Council for Graduate Medical Education (ACGME)-accredited allopathic programs and all 39 American Osteopathic Association (AOA)-accredited osteopathic programs in EM from February 2008 to March 2008. Three programs were both AOA- and ACGME-accredited, but each was counted only once, as an ACGME-accredited program, giving a total of 179 programs.

A paper survey was initially distributed and collected during the Council of Emergency Medicine Residency Directors Academic Assembly meeting (February 2008). Each program was asked to complete one survey, filled out by either the PD or associate PD. An identical survey was then distributed electronically, using a Web-based survey tool (http://www.surveymonkey.com/), to all PDs or associate/assistant PDs who either did not respond to the initial paper survey or did not attend the meeting. If no response was received, additional e-mails or follow-up phone calls were made to encourage survey completion.

An updated survey based on the 2003 instrument initially used by McLaughlin and colleagues1 was edited and revised by the new study group until unanimous consensus was reached. The final updated instrument (survey questions available as supporting information in the online version of this paper) contained simple demographic, yes/no, and multiple-choice questions related to the use of simulation in EM residency training. These items allowed for direct comparison to the original survey, which included the same kinds of basic questions (“Does your department/residents have access to a mannequin-based high-fidelity simulator?” and “How often are your residents using the simulator for educational or assessment purposes?”).

A cover letter was also attached to the new survey, explaining the purpose and noting key definitions. Mannequin-based high-fidelity simulator was defined as a computerized full-body robot-mannequin such as the Human Patient Simulator (HPS) or Emergency Care Simulator (ECS) produced by METI, Inc. (Sarasota, FL); SimMan or SimBaby by Laerdal Medical Corp. (Wappingers Falls, NY); or HAL by Gaumard Scientific (Miami, FL). Partial-task simulator was defined as dedicated procedural or skill-oriented equipment that is advanced beyond routine static trainers, such as simulators for central lines/venous access, chest tube placement, bronchoscopy, ultrasound, birthing, and trauma (routine airway/cardiopulmonary resuscitation mannequins did not apply). Screen-based computer simulation was defined as case simulations using an interactive computer interface.

Data Analysis

Results were analyzed by simple frequency tabulations; chi-square analysis was used to compare current and prior survey data.
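As an illustration, the headline comparison (mannequin-simulator use in 114 of 134 responding programs in 2008 vs. 33 of 114 in 2003) can be reproduced with the shortcut formula for a Pearson chi-square on a 2 × 2 table. This is only a sketch of the analysis described above; whether the original analysis applied a continuity correction is not stated, and the uncorrected statistic is shown here.

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic (1 df) for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# 2008 survey: 114 of 134 responding programs used mannequin-simulators.
# 2003 survey: 33 of 114 responding programs did.
stat = chi2_2x2(114, 134 - 114, 33, 114 - 33)
print(f"chi-square = {stat:.1f}")
```

With 1 degree of freedom, the statistic far exceeds the 10.83 critical value for p = 0.001, consistent with the reported p < 0.001.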

Results

A total of 134 EM residency programs completed the updated survey, yielding an overall response rate of 75% (comparable to the 73% response rate seen in 2003). Among all programs (Table 1), 122 (91%) use some form of simulation in their residency training. Seventy-six (57%) indicated owning some form of simulation equipment. One hundred fourteen (85%) specifically use mannequin-simulators, compared to 33 (29%) who used mannequin-simulators in 2003 (p < 0.001). Mannequin-simulators are now owned by 58 (43%) of the programs, whereas only 9 (8%) had primary responsibility for such equipment in 2003 (p < 0.001).

Table 1. Utilization, Ownership, and Barriers to Use of Simulation Technology among 134 Emergency Medicine (EM) Residency Programs in 2007–2008

Residency Program Characteristic               Programs, n (%)
Uses any simulation equipment                  122 (91)
  Mannequin-simulators                         114 (85)
  Procedural simulators                         80 (60)
  Screen-based simulators                       31 (23)
Owns any simulation equipment                   76 (57)
  Mannequin-simulators                          58 (43)
  Procedural simulators                         56 (42)
  Screen-based simulators                       19 (14)
Estimated annual simulation use per resident
  None/unknown                                  12 (9)
  1–5 hours                                     24 (18)
  6–10 hours                                    40 (30)
  11–20 hours                                   36 (27)
  21+ hours                                     22 (16)
Barriers to simulation use
  Faculty time constraints                      88 (66)
  Lack of faculty training                      73 (54)
  Cost of equipment/operations                  63 (47)
  Lack of support staff                         42 (31)
  Lack of access to learners                     5 (4)
  None                                          12 (9)

Fifty-eight (43%) of the programs reported that annual resident simulation use now averages more than 10 hours per year. Among the programs using simulation (n = 122), the most common use of simulation was to teach and practice resuscitation skills (n = 118; 97%), followed by airway management (n = 109; 89%), procedure skills (n = 98; 80%), disease-specific management (n = 93; 76%), teamwork (n = 91; 75%), professionalism (n = 72; 59%), and error avoidance (n = 64; 52%).

A constraint on faculty time was the most commonly perceived departmental barrier to simulation use (mentioned by 88 programs; 66%), followed by lack of faculty development (n = 73; 54%), and the costs of equipment/space (n = 63; 47%). Four EM programs reported offering fellowship training opportunities in medical simulation.

A post hoc sensitivity analysis was conducted to account for nonresponders. Even under the assumption that none of the nonresponding programs uses simulation, mannequin simulation use is 64% among all accredited programs, still significantly greater than in 2003 (p < 0.001).
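The worst-case figure in the sensitivity analysis above follows directly from the reported counts; a minimal check:

```python
# Worst-case sensitivity check: assume none of the nonresponding programs
# uses mannequin simulation, so only the 114 responders count as users.
surveyed_programs = 179   # all accredited EM programs in early 2008
mannequin_users = 114     # responders reporting mannequin-simulator use

worst_case = mannequin_users / surveyed_programs
print(f"Worst-case use among all programs: {worst_case:.0%}")  # 64%
```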

Discussion

There has been a remarkable increase in the deployment and use of simulation equipment among EM residency training programs over the past 5 years. Use of high-fidelity mannequin simulation is now widespread (up from 29% to 85% of programs). Ninety-eight programs use simulation more than 5 hours per year, compared to only 19 that used simulation more than once per year in 2003. Moreover, the importance of simulation has led to increasing departmental ownership of simulation equipment and space.

In addition to its teaching value for emergency care, training programs are starting to look to simulation as one tool for assessing resident performance.2,3 The interactive and dynamic nature of the simulation platform not only permits evaluation of clinical management and technical skills, but also can provide insight into other ACGME competency domains. For example, many of the programs in our survey indicated using simulation for training in areas such as professionalism (59%) and teamwork (75%). Simulation demonstration projects for systems-based practice have also been published in the EM literature.4 Other authors have begun to delineate those areas they feel are most appropriate for simulation and to set a research agenda for future work.2,5–8

Although costs of equipment and operations were identified as the major barrier to simulation use in the 2003 survey, departments appear to have prioritized funds or otherwise taken advantage of institutional resources to expand use of the technology. Faculty time and training are now perceived as the primary obstacles to simulation use, although cost remains a notable concern. While faculty often perceive that simulator-based training requires a significant time commitment, the amount of time required depends on the objective of the session and the product desired. One strategy for decreasing the time spent in case development is to use shared case banks for simulator curricula, such as those sponsored by the Society for Academic Emergency Medicine (SAEM; http://www.emedu.org/simlibrary/) and its peer-review partner for such material, the Association of American Medical Colleges MedEdPortal (http://www.aamc.org/mededportal).

National organizations have also begun to provide faculty development opportunities to support the increase in simulation-based training in EM. SAEM’s Technology in Medical Education Committee and Simulation Interest Group, together with the Council of Emergency Medicine Residency Directors Committee on Simulation, the American College of Emergency Physicians Academic Affairs Committee, and the EM Special Interest Group of the Society for Simulation in Healthcare, can all help develop and promote educational offerings for interested faculty and departments. Already we are seeing the early development of simulation fellowship opportunities for EM residency graduates at selected institutions.

Limitations

Our response rate, while relatively high at 75%, still leaves our data subject to response bias. Programs uninterested in simulation may have been less likely to complete the survey, falsely elevating the percentage of EM programs using simulation. However, even if we assumed that none of the nonresponders used simulation, the data would still reflect significant growth in the field. Another limitation is that our survey tool was not a formally validated instrument; however, the primary outcome measures were basic yes/no questions (“Does the ED own a high-fidelity mannequin simulator?”) with clear definitions of terms embedded in the instrument. Moreover, we were able to draw on experience with the prior survey, which provided a foundation to update, pilot, and iteratively revise the instrument among our expert panel.

Conclusions

There has been significant growth in the availability and use of simulation in EM residency training over the past 5 years. This has occurred despite perceived barriers including faculty time and training. Faculty development initiatives and measures, such as easy access to a well-developed case bank and valid assessment tools, may be instrumental in fostering further growth.

References

Supporting Information

Data Supplement S1. Survey questions

Please note: Wiley Periodicals Inc. are not responsible for the content or functionality of any supporting information supplied by the authors. Any queries (other than missing material) should be directed to the corresponding author for the article.
