Defining Systems Expertise: Effective Simulation at the Organizational Level—Implications for Patient Safety, Disaster Surge Capacity, and Facilitating the Systems Interface

Authors

  • Amy H. Kaji MD, PhD
  • Aaron Bair MD, MS
  • Yasuharu Okuda MD
  • Leo Kobayashi MD
  • Rahul Khare MD
  • John Vozenilek MD

From the Department of Emergency Medicine, Harbor-UCLA Medical Center, David Geffen School of Medicine at UCLA (AHK), Torrance, CA; the Department of Emergency Medicine, Davis Medical Center (AB), Sacramento, CA; the Department of Emergency Medicine, Mount Sinai School of Medicine (YO), New York, NY; the Department of Emergency Medicine, Brown Medical School (LK), Providence, RI; the Department of Emergency Medicine, Northwestern University (RK), Chicago, IL; and the Department of Emergency Medicine, Evanston Northwestern Healthcare (JV), Evanston, IL.

  • Discussion participants, listed alphabetically (29): David Adinaro, Aaron Bair, Jim Brown, Laurie Byrne, Michael Cassara, Richard DiPeppe, Yue Dong, Brian Gillett, Leon Haley Jr., Shkelzen Hoxhaj, Shelly Jacobson, Kim Jihoon, Amy Kaji, Ravi Kapoor, Jena Ker, Rahul Khare, Michael Kirchoff, Leo Kobayashi, Scott Korvek, Yasuharu Okuda, Erin Reardon, Michael Richards, Barbara Richardson, Michael Saleh, Chris Sampson, Michelle Sergel, Chris Strother, Peg Weissinger, and Ernest Yeh.

  • Presented at The Consensus Conference for Academic Emergency Medicine, Washington, DC, May 28, 2008.

Address for correspondence and reprints: Amy H. Kaji, MD, PhD; e-mail: akaji@emedharbor.edu.

Abstract

The Institute of Medicine’s report “To Err is Human” identified simulation as a means to enhance safety in the medical field, just as flight simulation is used to improve safety in the aviation industry. Yet, while there is evidence that simulation may improve task performance, there is little evidence that simulation actually improves patient outcomes. Similarly, simulation is currently used to model teamwork-communication skills for disaster management and critical events, but little research or evidence exists to show that simulation improves disaster response or facilitates intersystem or interagency communication. Simulation ranges from the use of standardized patient encounters to robot-mannequins to computerized virtual environments. As such, the field of simulation covers a broad range of interactions, from patient–physician encounters to the interfaces between larger systems and agencies. As part of the 2008 Academic Emergency Medicine Consensus Conference on the Science of Simulation, our group sought to identify key research questions that would inform our understanding of simulation’s impact at the organizational level. We combined an online discussion group of emergency physicians, an extensive review of the literature, and a “public hearing” of the questions at the Consensus Conference to establish recommendations. The authors identified the following six research questions: 1) What objective methods and measures may be used to demonstrate that simulator training actually improves patient safety? 2) How can we effectively feed back information from error reporting systems into simulation training and thereby improve patient safety? 3) How can simulator training be used to identify disaster risk and improve disaster response? 4) How can simulation be used to assess and enhance hospital surge capacity? 5) What methods and outcome measures should be used to demonstrate that teamwork simulation training improves disaster response? and 6) How can the interface of systems be simulated? We believe that exploring these key research questions will improve our understanding of how simulation affects patient safety, disaster surge capacity, and intersystem and interagency communication.

Simulation ranges from standardized patient encounters and robot-mannequins to computerized virtual environments and mathematical modeling (the simulation literature [e.g., Simulation in Healthcare] tends to refer to SimMan/METI-type systems as “full-body mannequin simulators” or “computerized human patient simulators”). Thus, the field of simulation covers a broad range of interactions, from a single patient–physician encounter to the interface between larger systems and agencies. To clarify terminology, we will use the term “computational modeling” for simulation applied at the organizational interface, as distinct from mathematical modeling using statistical methods and from mannequin- or screen-based simulation. The Institute of Medicine’s report “To Err is Human” identified simulation as an opportunity for enhancing safety in the medical field, similar to the way flight simulation is used to improve aviation safety. Yet, while there is evidence that simulation may improve task performance, there is little evidence that simulation actually improves patient outcomes with respect to morbidity or mortality. Simulation is currently used to model teamwork-communication skills for disaster management, but little research or evidence exists to suggest that simulation improves disaster response or facilitates intersystem or interagency communication. As part of the 2008 Academic Emergency Medicine (AEM) Consensus Conference on the Science of Simulation, our group sought to identify key research questions and thereby set a research agenda that would inform our understanding of how simulation and computational modeling influence patient safety, disaster surge capacity, teamwork, and the interface between agencies and systems.

Conference Workshop Proceedings

This work stems from the meetings of the 2008 Academic Emergency Medicine Consensus Conference on the Science of Simulation. Our group was charged with describing how simulation and computational modeling can improve patient safety, enhance disaster management and surge capacity, clarify the role of teamwork in disaster response, and help facilitate the interface between larger systems (e.g., the emergency department [ED] and the emergency medical services [EMS] agency). An interest group of emergency physicians with expertise in simulation, disaster medicine, surge capacity, and computational modeling established an online discussion in the months before the conference to consider the parameters of the project. Based on recommended readings from the group, hand searches of specific articles, and an inclusive search strategy, a total of 88 articles were utilized as references (please see Data Supplement S1, available as supporting information in the online version of this paper, for a list of additional references).

The inclusive search strategy involved a MEDLINE query using structured search criteria. Specific key words for the search were as follows: “simulation AND patient safety,” “simulation AND disaster,” “simulation AND surge capacity,” “simulation AND emergency department crowding,” and “simulation AND team communications.” The initial search yielded 860 publications. After relevance screening, as well as a hand search of published bibliographies (by AHK), 88 references were selected for detailed review.

Research questions and themes were derived from direct suggestions in the literature or where clear gaps seemed to warrant investigation. An initial document outlining a core set of 16 research questions was submitted to and considered by the discussion group. Feedback from the group resulted in a substantial revision. The 16 original research questions were consolidated into six working questions, which served as the basis and focus of discussion by a group of 29 researchers and clinicians attending the consensus conference on May 28th, 2008, in Washington, DC. Feedback from that session was incorporated into this final document.

Research Question 1: What Objective Methods and Measures Can Be Used to Demonstrate That Simulator Training Actually Improves Patient Safety?

Background/Rationale:  Intuitively, there are many reasons to believe that simulation in medical education would improve patient safety. Simulation allows physicians to practice specific tasks without putting patients at risk. Surgical simulators have been developed for a variety of procedures: endovascular repair of abdominal aortic aneurysms,1 sinus surgery,2 gynecologic surgery,3 orthopedic surgery,4 prostatic surgery,5 amniocentesis,6 and oral surgery.7 By exploiting the volume–outcome relationship (“practice makes perfect”), simulators promote procedural success by increasing operator experience8,9 while allowing errors to occur and play out to their conclusion. Participants can thus learn from their mistakes and see the results of their decisions and actions. Problem-based surgical simulation (e.g., inadvertent ligation of the ureter during hysterectomy) may improve patient safety by training surgeons to anticipate and avoid complications. As for developing clinician diagnostic expertise for enhanced patient safety, Swanson and Clark10 and Dawson et al.11 have created a cardiovascular simulator to improve the recognition of diseased coronary vessels, and a heart sound simulator has been shown to increase medical students’ recognition of pathologic heart sounds when tested with the simulator.12,13 Simulators can also help train physicians to perform endoscopy;14 residents who trained for flexible sigmoidoscopy using a virtual reality simulator were faster and visualized a greater portion of the colon in one study.15 Simulated exercises using a virtual hysteroscope led to lower rates of hysteroscopic complications in another study.16 Additionally, simulation can expose even subtle difficulties that clinicians may encounter with the human–machine interface.

Simulation facilitates training for the uncommon scenario in which a rapid response is critical, such as with malignant hyperthermia.17,18 Other rare events include incidents involving biological or chemical weapons and radiologic exposures. For these rare but high-consequence events, there is little alternative to prepare healthcare providers and systems, except through simulation, drills (field training exercises), just-in-time online resources, etc. Simulation may also identify specific errors in decision-making steps, thus identifying areas to target for further training.19–21

The following are other published studies that demonstrate that simulator training improves practitioner performance:

  1. In a randomized controlled trial, Peugnet and colleagues22 used a virtual reality simulator to perform retinal photocoagulation. Surgeons who trained with the simulator performed the procedure as well as those who trained with patients.
  2. Schwid and colleagues23 studied the impact of a computer screen-based anesthesia simulator in a randomized, controlled trial of 31 anesthesia residents. Residents who had trained on the simulator responded better to critical events on a mannequin-based simulator than those who received training without the simulator.
  3. Using a randomized, controlled design, Chopra and colleagues24 studied management of simulated critical situations by 28 anesthesiologists. The performance of subjects who trained on the simulator was superior to that of subjects who did not receive the training.
  4. Derossis et al.25 evaluated surgeons in a randomized study and found that those trained with a simulator had greater proficiency in suturing and mesh placement when tested on the simulator than did the control group. When tested in vivo in pigs, surgeons (both attendings and residents) who had been randomized to the simulator arm were more proficient.26
  5. Scott and colleagues27 studied the impact of a video trainer on laparoscopic cholecystectomy skills. Residents were randomized to training on a video trainer versus control; those who trained on the video trainer uniformly performed better.

Despite the accumulating evidence that simulator training improves practitioner performance, there is little evidence that simulation training actually results in improved patient outcomes.28,29 To study the effects of simulation on patient safety using traditional, accepted research methods, large cohorts of patients would be needed. Adverse events are fortunately uncommon, and there are a large number of patient-based and system-based confounders. Given these difficulties, task performance has been used as a surrogate measure of patient outcome, although it is an imperfect substitute. It is also unclear which attributes of performance matter most and thereby most affect patient outcomes. Additionally, the best methods and timing (e.g., duration of follow-up) to measure and assess performance and outcome are ill-defined.30 Most studies done to date are limited by methodology, as most measures of performance use the same simulator for both training and testing. Thus, such studies may favor those who have trained on the simulator. Furthermore, proficiency on a simulator does not ensure proficiency in clinical settings, if only due to participants being more vigilant than usual during simulator sessions.31 This limitation is illustrated in the study by Sayre and colleagues32 in which basic emergency medical technicians (EMT-Bs) learned intubation on mannequins. After successfully intubating mannequins 10 times, they were permitted to intubate live patients in the field, where their success rate was only 53%. Accordingly, it must be noted that there are substantial potential risks to simulation-based training. When the simulator does not accurately replicate the task environment, clinicians may acquire inappropriate behaviors (negative training) or develop a false sense of security in their skills.33,34

Recommendations

  1. Although an imperfect substitute, performance of a procedural task is the predominant means of assessing the impact of simulator training on patient outcome. Other potential surrogate measures for patient outcome include process outcomes and clinical performance benchmarks. For example, how quickly is a patient with an acute ST segment elevation myocardial infarction (STEMI) taken to the cardiac catheterization laboratory for percutaneous coronary intervention (PCI)? How quickly after arrival to the emergency department did the patient with chest pain undergo an electrocardiogram? A simulator mannequin with a STEMI could be taken from the out-of-hospital setting to the ED, the catheterization laboratory, and then finally to the coronary care unit. Problems in achieving process outcomes, such as achieving PCI within 90 minutes, could be identified when each of these transition points is analyzed.
  2. When possible, however, patient morbidity and mortality outcomes should be utilized as study endpoints.
  3. For each procedure, timing and frequency of retraining and testing to optimize skill and knowledge retention should be specified, preferably based on research data. Additionally, detailed feedback (debriefing) after every training simulation is critical to maximize student learning.
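The process-benchmark idea in the recommendations above lends itself to automated scoring of a simulated run. The sketch below is purely illustrative: the stage names, timestamps, and the 90-minute door-to-balloon threshold are assumptions drawn from the STEMI example, not a validated scoring instrument.

```python
from datetime import datetime

# Hypothetical sketch: scoring one simulated STEMI run against a
# door-to-balloon process benchmark (PCI within 90 minutes of ED arrival).
# Stage names and timestamps are illustrative assumptions.

BENCHMARK_MINUTES = 90

def door_to_balloon_minutes(timestamps):
    """Minutes elapsed from ED arrival to the start of PCI.

    timestamps: dict mapping a stage name to the datetime it occurred.
    """
    delta = timestamps["pci_start"] - timestamps["ed_arrival"]
    return delta.total_seconds() / 60

# One simulated run, with a time recorded at each transition point.
run = {
    "ed_arrival": datetime(2008, 5, 28, 10, 0),
    "ecg_done": datetime(2008, 5, 28, 10, 7),
    "cath_lab_arrival": datetime(2008, 5, 28, 10, 55),
    "pci_start": datetime(2008, 5, 28, 11, 20),
}
elapsed = door_to_balloon_minutes(run)          # 80.0 minutes
met_benchmark = elapsed <= BENCHMARK_MINUTES    # True for this run
```

Recording a timestamp at each handoff (ED arrival, ECG, catheterization laboratory, PCI) lets a debriefing pinpoint which transition consumed the most time.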

Research Question 2: How Can We Effectively Feed Back Information from Error Reporting Systems into Simulation Training and Thereby Improve Patient Safety?

Background/Rationale:  Ziv35 described the role of simulation in the Israeli medical system, which systematically employs error-based learning using errors made in a simulated environment as a basis for teaching. This has been crucial to changing long-held perceptions of medical errors and has started to change the “culture of safety” within the medical community. Reaching beyond the use of quality assurance metrics and quality improvement interventions with feedback features, Vincent36 has referred to the structured use of near-miss and adverse event reports for prospective system safety enhancement as a “window on the system.”

Recommendations

  1. At the individual institutional level, use available data (e.g., malpractice claims, patient safety-net data, and hospital quality improvement [QI] information) as a basis for developing simulation-based teaching. Cases leading to claims, complaints, and QI issues could serve as the basis for simulation training.
  2. Use electronic medical records and order entry to track errors, such as incorrect dosing and drug cross-reactivity, for further simulator training.
  3. On a regional level, pool malpractice and QI data to develop a large shared database of simulation teaching cases.
  4. Probe the clinical environment using a high-fidelity mannequin simulator in situ to identify patient safety issues. For example, when testing a new defibrillator, create a cardiac arrest case in the actual clinical arena to preempt errors such as incompatibility of paddles with the new defibrillator.

Research Question 3: How Can Simulator Training Be Used to Identify Disaster Risk and Improve Disaster Response?

Background/Rationale:  The need for proper risk assessment, planning, and preparedness and the implementation of early warning systems has been highlighted by recent natural disasters. The Indian Ocean tsunami of December 2004 affected 13 countries and was responsible for the deaths of more than 250,000 people. Regrettably, disasters often occur in regions known to be vulnerable but where no proper risk assessment studies have been conducted and no adequate plans for preparedness or mitigation exist.

Computational modeling, another form of simulation, is useful in predicting the geographic extent and duration of various natural disasters, such as earthquakes, floods, and hurricanes; as well as in predicting the number of patients that may be afflicted by a pandemic and in predicting the number of vaccines and personnel that may be required to care for the afflicted population.
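As a minimal sketch of the kind of computational model described above, a discrete-time SIR (susceptible–infectious–recovered) projection can estimate peak pandemic caseload, one input to estimating vaccine and staffing needs. The population size, transmission rate, and recovery rate below are illustrative assumptions, not values from the text.

```python
# Minimal SIR (susceptible-infectious-recovered) projection, illustrating the
# kind of computational model used for pandemic planning. The population size,
# transmission rate (beta), and recovery rate (gamma) are assumed values.

def sir_model(population, initial_infected, beta, gamma, days):
    """Project daily (susceptible, infectious, recovered) counts.

    beta: new infections per infectious person per day at full susceptibility.
    gamma: fraction of infectious people who recover each day.
    """
    s = float(population - initial_infected)
    i = float(initial_infected)
    r = 0.0
    history = [(s, i, r)]
    for _ in range(days):
        new_infections = beta * s * i / population
        new_recoveries = gamma * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        history.append((s, i, r))
    return history

# Peak infectious count is a rough proxy for peak simultaneous care demand.
history = sir_model(population=1_000_000, initial_infected=10,
                    beta=0.4, gamma=0.2, days=180)
peak_infectious = max(i for _, i, _ in history)
```

Under these assumed rates the model indicates when the infectious count peaks; planners could then compare that peak against regional bed, vaccine, and staffing capacity.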

Recommendations

  1. Collaborate with human factors and industrial engineers, who are experts in risk analysis, modeling, and systems analysis. The Human Factors and Ergonomics Society (HFES) hosts a directory of consultants on its website at http://www.hfes.org/web/Default.aspx; the Institute of Industrial Engineers (IIE) maintains a similar directory on its own website.
  2. Hazard vulnerability analyses should be conducted at the local level to identify the greatest threats. Assessing the potential risks that threaten each region of the world requires an adequate understanding of the physics of each type of disaster, historical data on past events, and an accurate interpretation of these data with respect to future impact. Because each type of disaster arises from different sources, risk assessment methodology will vary accordingly. Adjunctive use of computer modeling may help determine the risk of recurrence, the severity of an incident, and similar parameters. Most importantly, it should be noted that many catastrophic disasters had been modeled prior to their occurrence. For example, during the summer of 2004, the Federal Emergency Management Agency conducted a disaster simulation exercise in which a fictional hurricane named Pam hit the New Orleans area. Hurricane Pam was eerily prescient of Katrina. The purpose of the simulation was to help the New Orleans area develop a “plan of action” for a true disaster, like Katrina. Unfortunately, few of the lessons learned from Pam were implemented before Katrina. To make full use of simulated exercises to avert the next disaster, the barriers to implementing the “plan of action” must be understood and overcome.

Research Question 4: How Can Simulation Be Used to Assess and Enhance Hospital Surge Capacity?

Background/Rationale:  Kobayashi et al.37 suggested that carefully orchestrated clinical simulations emphasizing extensive interactions between learner(s) and simulated patients, assisted by coordinated facilitator interventions, may contribute to physicians’ ability to manage multiple patients. Whereas mannequin-based simulation appears to be ideal for improving an individual’s performance, large-scale simulations may improve the performance of an entire system or agency. Potential methods requiring further study include computational modeling of systems (e.g., to assess the surge capacity of a region or hospital or to assess the potential impact of a pandemic), large system drills, unit-level drills, and team-level training. Additionally, such modeling can be used to explore the potential impact of various surge-related response strategies. Kanter and Moran38 used modeling to estimate the impact of various strategies on intensive care unit capacity. Given a set of needs, resources, and assumptions regarding the incident, the expected number of victims presenting to the ED and their likely outcomes may be estimated. Hospitals’ capacity to accommodate a surge of patients depends on available supplies and equipment, the number of patients already occupying hospital beds, and available staff. Various commercially available health care simulation tools39 are specifically designed to examine patient flow, staff utilization and efficiency, hospital bed demand patterns, throughput, and wait times.
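The bed-demand question above can be sketched as a simple Monte Carlo bed-occupancy model. This is a minimal illustration, not one of the commercial tools cited: the bed count, arrival rates, and length of stay are assumed parameters.

```python
import math
import random

# Hedged Monte Carlo sketch of surge capacity: given assumed daily arrival
# counts and a fixed length of stay, on how many days does admission demand
# exceed the available beds? All parameter values are illustrative.

def sample_poisson(rng, lam):
    """Draw a Poisson-distributed count (Knuth's method; fine for small lam)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def overflow_days(beds, arrivals_per_day, stay_days, horizon_days, seed=0):
    """Number of simulated days on which at least one patient found no bed."""
    rng = random.Random(seed)
    occupied = []  # remaining length of stay (days) for each occupied bed
    bad_days = 0
    for _ in range(horizon_days):
        occupied = [d - 1 for d in occupied if d > 1]  # discharge finished stays
        turned_away = 0
        for _ in range(sample_poisson(rng, arrivals_per_day)):
            if len(occupied) < beds:
                occupied.append(stay_days)
            else:
                turned_away += 1
        if turned_away:
            bad_days += 1
    return bad_days

# Baseline demand versus a surge scenario on the same 20-bed unit.
baseline = overflow_days(beds=20, arrivals_per_day=3.0, stay_days=4,
                         horizon_days=365, seed=1)
surge = overflow_days(beds=20, arrivals_per_day=8.0, stay_days=4,
                      horizon_days=365, seed=1)
```

Running the same model under different arrival rates or bed counts shows how often a unit would be overwhelmed under each scenario, which is the comparison a surge-planning exercise needs.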

Recommendations

  1. Individual hospitals may assess their surge capacity and flow constraints by using spreadsheet-based analytic tools, commercially available products, or computational modeling, such as that proposed by Kanter and Moran,38 to examine patient flow analysis, staff utilization efficiencies, hospital bed demand patterns, throughput, and wait times, among others.

Research Question 5: What Methods and Outcome Measures Should Be Used to Demonstrate That Teamwork Simulation Training Improves Disaster Response?

Background/Rationale:  During a disaster, teamwork is critical, as managing an influx of patients with limited resources requires coordination among multiple agencies, institutions, and personnel. Simulation allows the interpersonal interactions of a given scenario to be explored. Thus, simulation is used in crew resource management (CRM) training, where the focus is on improving behavioral skills such as leadership, teamwork, and interteam communication during stressful and potentially overwhelming critical incidents. Emerging data support the premise that such team training can be adapted to clinical situations and may lead to improvements in performance and safety.40,41

Recommendations

  1. MedTeams© has a validated means of assessing teamwork behaviors. Team dimension rating (TDR) is the term applied to observing team behavior and assigning ratings to each of the five team dimensions using the behaviorally anchored rating scales (BARS). TeamSTEPPS, created by the Department of Defense (DoD) and available at http://www.ahrq.gov/, is derived from MedTeams. While MedTeams provides the participant with consultants and rollout support, TeamSTEPPS is provided as an open-source do-it-yourself kit. Although establishing the effectiveness of simulation in CRM training can be difficult,42 initial work has been done that demonstrates reliability and consistency of performance rating.43
  2. While there are validated measures of teamwork behaviors, there are currently no defined outcome measures as to what constitutes a “good” disaster response. The obvious goal is to limit damage to life and property. Performance may be improved, but whether it will lead to improved outcomes is not known.44 Substantial research efforts are needed in this area.

Research Question 6: How Can the Interface of Systems Be Simulated?

Background/Rationale:  King et al.45 conducted a simulation requiring field surgical teams to establish an operating room and receive and treat multiple anesthetized swine with battlefield-type injuries. This study revealed that a significant number of communication missteps occurred between triage and the operating room, highlighting the need for improved systems interface.

Recommendations

  1. Although the study by King et al.45 was performed with live animals, it could be replicated with multiple manikins to recreate the EMS–ED exchange of information. Similarly, interdepartmental interactions may be simulated. For example, one could identify problems with communication between cardiology and the ED during a simulated case of a patient warranting PCI for an acute STEMI. Progressive simulations, i.e., exercises that follow a simulated patient through sequential care environments, can be conducted in situ to re-create intersystem interfaces.
  2. Stakeholders from different agencies may participate together in a board game or tabletop exercise. For example, a board game involving a scenario in which there is an outbreak of anthrax could highlight the necessary communication between the emergency physician and the public health department, between the public health department and the emergency operations center, and between the incident commander and the state governor.

Summary

As part of the 2008 Academic Emergency Medicine Consensus Conference on the Science of Simulation, our group identified key research questions to improve our understanding of how simulation impacts patient safety, disaster surge capacity, and intersystem and interagency communication.