Health care education organizations have begun to examine accreditation processes as possible facilitators of growth and excellence in medical simulation. In 2005, the Society for Academic Emergency Medicine (SAEM) recognized the importance of sharing quality standards in simulation-based education and developed a Simulation Task Force that offered a consultation service to new simulation programs. As the number of emergency medicine (EM) simulation programs grew to over 100 nationwide, SAEM considered the expansion of consultation efforts to include a formal accreditation program, paralleling similar efforts in other specialties.1 As part of that organizational review, this paper was commissioned to survey existing accreditation programs, analyze existing EM simulation initiatives, and explore the feasibility and structure of an EM-specific accreditation process. The goal of this review is to evaluate the development of simulation-based consultation and accreditation approaches specific to EM. We suspect the review will also be helpful to other specialty groups that are interested in similar work and will be useful in catalyzing a common platform to help unify common efforts across the health care spectrum. The four programs reviewed here (American College of Obstetrics and Gynecology [ACOG], American College of Surgeons [ACS], American Society of Anesthesiologists [ASA], and the Society for Simulation in Healthcare [SSH]) have all had the opportunity to review and comment on the manuscript.
Simulation-based education has grown significantly over the past 10 years. As a result, more professional organizations are developing or implementing accreditation processes to help define minimum standards and best practices in simulation-based training. However, the benefits and potential pitfalls of sponsoring and implementing such programs have yet to be fully evaluated across specialties. The board of directors of the Society for Academic Emergency Medicine (SAEM) requested an evaluation of the potential to create an emergency medicine (EM)-based Simulation Consultation and Accreditation Service. In response to this request, the Simulation Accreditation and Consultation Work Group, a subgroup of the Committee on Technology in Medical Education (now Simulation Academy), was created. The work group was charged with: 1) reviewing current benchmarks and standards set by existing simulation accreditation programs; 2) analyzing current EM simulation program structures, including leadership, administrative, and financial components; and 3) proposing a potential model for EM-based simulation accreditation. This article outlines currently existing and proposed accreditation models and identifies components that support best practices. It then goes on to describe three general programmatic models to better understand how simulation training can be operationalized in EM. Finally, the work group uses this collective information to propose how an accreditation process, in concert with the SAEM Simulation Consultation Service, can enhance and advance EM simulation training.
ACADEMIC EMERGENCY MEDICINE 2010; 17:1093–1103 © 2010 by the Society for Academic Emergency Medicine
Accreditation Programs: Background
As of January 2009, the ACS, ASA, and SSH have simulation accreditation programs that are either in place or in development,2–7 and the ACOG has created a simulation consortium with the “goal of providing voluntary access to standardized and validated surgical curricula” (personal communication, American College of Obstetrics and Gynecology, November 11, 2008). While it is likely that other health professional organizations are also exploring an accreditation process, this document examines these four programs because they represent the earliest organized efforts to date. The primary characteristics of these accreditation and standard-setting programs are detailed in Tables 1 and 2 and are summarized below. While each program has multiple characteristics that bear examination, this discussion will focus on three areas: 1) scope of the accreditation program, 2) accreditation format, and 3) criteria for accreditation. Our commentary here is based on our work group’s independent review of publicly available material and descriptions (as of December 2009). While we have made every attempt to accurately portray the scope and intent of each program, the most accurate and up-to-date information can only be provided by the relevant organizations themselves.
Table 1. Characteristics of Existing Simulation Accreditation and Standard-Setting Programs

American College of Surgeons (ACS)
Scope and Model:
• Broad
• Level 1, Comprehensive
• Level 2, Basic
Personnel Criteria:
• Institute director appointed for 3 years at 25% protected time
• Surgical director is FACS and has 10% protected time
• Administrator with 50% time for center
• Coordinator with 50% time
Learner Criteria:
• Must include surgeons plus 3 specialties/learner groups
• Must demonstrate the effectiveness of curriculum
• Must provide evidence of:
—Long-term follow-up of learners
—Maintenance of skills
Curricular Requirements:
• Incorporates procedural and cognitive skills
• Curriculum development involves:
—Development of objectives
—Selection of instructional methods
—Creation of instructional materials
—Assessment of effectiveness
• Educational programs are accredited by the LCME, ACGME, ACCME, or equivalent
• Faculty are appropriately trained
Hardware/Infrastructure:
• 1,200 sq. ft. contiguous, with face to the public
• No less than 4,000 sq. ft. additional space for storage, lounge, etc.
• Can accommodate a minimum of 20 trainees at a time
• Teleconferencing available
• Internet capable
• Adequate space for administration
• Adequate space for skills trainers
• Annual budget can support the activities of the Institute
• Provides a mission statement
• Provides an organizational chart
• Establishes a Steering Committee or Advisory Board

American Society of Anesthesiologists (ASA)
Scope and Model:
• Specialty specific
• Single level
Personnel Criteria:
• Must have an established mechanism for instructor training, evaluation, and credentialing
• Program director should hold a doctoral degree and academic appointment at an accredited institution
• Course director should:
—Be credentialed as an instructor
—Hold appointment in the Department of Anesthesiology
Learner Criteria:
• CME (required)
• UME, GME (optional)
Curricular Requirements:
• Program must offer courses for anesthesiology CME
• Quality assurance program must be in place
• Methodologically sound program for curriculum development and assessment must be evident
• Sample curriculum and scenario required
Hardware/Infrastructure:
• Provides a mission statement
• Organization should maximize the likelihood that course quality will be maintained
• Documents governance and financial model
• Program leadership and financial stability required
• Facilities should be sufficient for the coursework offered, including parking, meals, etc.
• Written policies and procedures should exist
• Demonstrates necessary educational technology to conduct courses

American College of Obstetrics and Gynecology (ACOG)
Scope and Model:
• Specialty specific
Personnel Criteria:
• Not currently defined
Learner Criteria:
• GME
• Residents in approved OB/GYN programs
• All residencies can request access to consortium institutions for training of their residents
Curricular Requirements:
• Simulation-based surgical skills education with patient safety focus
• Goal of developing a common curriculum that can be taught by all consortium institutions
• Goal of providing validated simulation-based education
Hardware/Infrastructure:
• Consortium institutions are defined as state-of-the-art surgical simulation centers
• States goal of developing standardized teaching methods that can be utilized by all consortium institutions
Table 2. SSH Accreditation Standards (Pilot)

Core Standards (required of all programs)
Personnel Criteria:
• Program is administered by one or more individuals who are academically and/or experientially qualified
• Program director has overall responsibility and authority for the program
• Program director has adequate time in the role to achieve the goals of the program
• Staffing levels are appropriate to meet the mission
• A finance officer is involved in the budgeting process
Curricular Requirements:
• Specific processes are used to determine which technologies/applications are selected for use in education, assessment, and/or research
• The technology used is tied to the educational, assessment, and/or research objectives
• Both content and simulation expertise are available
• A written plan for systematic quality improvement includes assessment of learner outcomes
• Demonstrates improvements made based on evaluations
Hardware/Infrastructure:
• Mission statement includes purpose, audiences served, types of activities, and expected results
• Governing body reflects the administration, instructors, and stakeholders of the center
• Provides an organizational chart and job descriptions
• Organized budgeting process is in place
• Budget is adequate to achieve stated mission
• Strategic plan demonstrates an effective planning process
• Policies and procedures are in place to ensure quality assurance, confidentiality, adherence to regulatory boards, and effective resource utilization

Assessment
Personnel Criteria:
• Instructors and staff are qualified by virtue of education and experience
• Instructors and staff are routinely evaluated to ensure competence
• Adequate technical support for data analysis is present
• Human factors, psychometric, and statistical support available when indicated
Curricular Requirements:
• Processes are in place to assure that assessment methods and tools are appropriate, reliable, and valid
Hardware/Infrastructure:
• Facilities and technologies are appropriate for the individuals being assessed and the level of assessment
• IRB and data security needs are met and documented

Research
Personnel Criteria:
• Instructors demonstrate a capability to perform research
• There is a designated director of research with roles delineated in the organizational structure and adequate support time
• There are instructors with specific research training and internal/external documentation of collaboration
Curricular Requirements:
• There is evidence of publication and/or presentation of research findings in peer-reviewed forums
• There is documentation of mentoring simulation researchers
Hardware/Infrastructure:
• Program has an established record of research
• The mission statement includes a specific commitment to research
• Evidence of successful efforts to obtain research support exists
• Program uses a scholarly approach to training assessment
• Documentation of IRB adherence and data security protocol

Education
Personnel Criteria:
• Program oversight is by an expert in simulation education
• Program facilitates professional development for instructors
• Instructors engage in certified ongoing training to improve skills
• Instructors are familiar with capabilities and limitations of simulation modalities
Curricular Requirements:
• Offers comprehensive simulation-based learning
• Educational materials are evidence-based, reliable, and valid
• Simulation modalities are appropriate for learning objectives
• Curriculum design process involves currently understood simulation education theory
• Program has the ability to offer CME
Hardware/Infrastructure:
• Records are kept on all instructors and instructor professional development
• Feedback incorporated into programming
• Program continually updates and improves its courses
• Record keeping supports evaluation, validation, and research of curriculum
• Records of learner, instructor, and coordinator activities are maintained
Scope of Accreditation Programs
The ACS and SSH each offer a simulation accreditation program with a broad institutional focus that encompasses all forms of simulation.2,4,5 As a cross-disciplinary organization, SSH has proposed the broadest view of simulation and simulation-based education, intended to encompass all fields. Listed requirements for instructors, equipment, and processes are flexibly defined to accommodate the wide variability in simulation programs and centers worldwide. The ACS process is also broadly inclusive, additionally offering focused guidance on programmatic components specific to surgical training. For example, to obtain the highest level of ACS certification, a Director of Surgical Simulation must be guaranteed 10% protected time for educational and administrative duties.2 In setting this standard, ACS has made an important statement about the need for dedicated faculty to build and sustain simulation efforts. This requisite has the potential to affect faculty recruitment and leadership decisions, academic promotions, and overall simulation strategies at institutions seeking accreditation.
The ASA program (pilot phase) is more specialty specific in its content, concentrating on curriculum development and instruction in anesthesia.6 Infrastructure and organizational requirements are primarily focused on anesthesia-specific training, whether as a free-standing simulation program or as a part of a larger multidisciplinary center. While it is too early to speculate on the final ACOG process, early information suggests that its focus will be on institutional access, common curricula, standardized teaching, and validation of simulation-based education specific to OB-GYN. ACOG’s pilot programs began in earnest in 2009.
Accreditation Format

Currently there are three approaches to structuring accreditation: 1) a single-level, criterion-based system; 2) a multilevel system; and 3) a modular system. The ASA program deploys the single-level approach.7 In this accreditation model, there is one set of criteria for standard accreditation; all programs must meet the same benchmarks in criterion areas such as faculty expertise and levels of instruction. The ACS program defines accreditation standards at two different levels across all content areas, which include learners, curriculum, support, and resources.2 SSH uses a modular approach, recognizing that some programs might choose to focus on one area but not others.4 SSH defines “Core Standards” that all programs must meet, in addition to standards in at least one of three areas: assessment, research, and education. Programs meeting standards in any one area may then seek additional accreditation in the area of systems integration and patient safety; accreditation across multiple domains implies a more comprehensive program.
The multilevel or modular approaches are inherently more complex, both in process and in designation. Both recognize that resources and curricular requirements will differ depending upon educational goals and the size of the supporting institution or department. Multiple levels of certification (e.g., ACS) are useful in discriminating between smaller, more focused simulation centers, and larger, institution-wide centers; a modular approach (e.g., SSH) allows description and categorization tailored to individualized strengths and attributes of each program.
Criteria for Accreditation
All three programs with described accreditation standards (ACS, ASA, SSH) focus on four main areas: 1) curriculum, 2) instructor/personnel qualifications, 3) equipment and technology, and 4) organization and supporting infrastructure. However, the level of emphasis given to each area varies by program. The ACS accreditation standards list specific infrastructure and equipment requirements, including square footage minimums, designated office space, and networking and teleconferencing capabilities. The ACS guidelines have more loosely defined criteria for curriculum development and assessment; however, the site visit required for accreditation allows surveyors to look for faculty and programmatic development that supports the mission and spirit of the ACS accreditation process. Neither the ASA nor the pilot SSH plan delineates specific space or equipment requirements. Rather, these processes require simulation programs to demonstrate adequate space and hardware to carry out program-specific educational missions. This makes the process somewhat more flexible, but it can also soften the external mandate for capital items.
The ASA program contains the most detailed requirements for curriculum content and evaluation, particularly in the continuing medical education (CME) category. To achieve accreditation status, programs must provide CME-level programming and course evaluation. Additionally, accreditation applicants must submit a simulation-based scenario that becomes part of a larger anesthesia simulation curriculum. These requirements support the ASA’s goal of developing “a meaningful (simulation) program that could be reliably conducted on a regional and repetitive basis.”8 Such a goal is logical for a specialty-based accreditation process and is similar to ACOG’s early efforts, which target graduate medical education (GME) curricula. The initial pilot phase of the SSH plan lists accreditation criteria specific to the area of interest (assessment, education, research, or systems integration) in addition to a set of core criteria common to all accreditation areas (Table 2). While there are many similarities between the SSH requirements and those of the ACS and ASA programs, the accreditation that SSH offers specifically in simulation research is unique among the programs reviewed. Overall, the research requirements are targeted toward improving data acquisition among simulation programs and supporting scholarly research activities.4 Notably, the ACS recently appointed a national research committee to promote collaborative, multi-institutional research studies and to further develop its simulation research agenda.
Potential Criteria for EM Simulation Accreditation
Initiating an evidence-based approach to determining the merits of an EM-based simulation accreditation process is challenging because of the overall lack of evidence and the relative novelty of such an effort. A review of existing health care–related accreditation efforts made clear that both positive and negative outcomes could result from any accreditation process. Our work group therefore set an overall goal for any product that would result from its work, stated as follows: Our goal is to advance emergency medicine education by supporting excellence in the development, provision, and study of simulation-based training. To accomplish this goal, we dissected the components of the current accreditation efforts summarized above, reviewed the findings of the 2008 AEM consensus conference “The Science of Simulation in Health Care,” and conducted an extensive literature review to identify the components of simulation training that are most likely to affect education, assessment, and research in EM. These efforts are documented in this report, which subsequently offers a structure that would support a variety of simulation efforts, encourage collaboration at all levels, and provide a mechanism to identify and categorize EM-affiliated simulation-based programs.
The quality of instruction and debriefing has long been recognized as critical to the success of any simulation-based educational effort. Instructor training courses are currently conducted throughout the country, with emergency physicians (EPs) leading many of these efforts.9 Despite such efforts, a lack of faculty time and training is cited as one of the greatest perceived barriers to simulation education.1,10,11 Any EM simulation accreditation or consultation process should capitalize on current simulation instructor expertise and seek to expand it whenever possible. By setting training recommendations for simulation instructors, an accrediting body would be targeting one of the most crucial components of simulation-based training. However, instructor expertise must be carefully developed and requires significant resource support for faculty development (course fees, protected time). An accreditation or consultation service must ensure that high-quality simulation instructor courses are available to EM faculty and that these courses meet both accreditation standards and the constantly changing needs that new technologies present. Faculty could be trained locally, enroll in distance training, or attend national courses offered at leading institutions or as a component of annual specialty meetings. Training need not be EM-specific, but rather should address issues and techniques specific to simulation-based instruction.
Coincident with the need for instructor training is the need for dedicated faculty time for simulator-based teaching.1 Appropriately customizing curricula and conducting dedicated simulation sessions are extremely time-consuming, even when simulation cases are pulled from a “shared” case bank. In the traditional educational paradigm, a single faculty member can teach hundreds of individuals with a single 1-hour lecture, but only 5 to 10 learners in a high-fidelity simulation session.9,11 One anticipated effect of ACS accreditation is an increase in overall institutional support and faculty recognition for simulation activities (Table 1).5 As noted, dedicated support for surgical faculty who lead simulation efforts is required. Similar requirements would be helpful to EM faculty as part of an EM-based accreditation process.
EM Simulation Curriculum
The development of quality EM-based simulation curricula is crucial to the effectiveness of any simulation program. Standardized case development and validation for high-stakes assessment (residency promotion, provider credentialing) requires significant time and expertise and can present barriers to institutional program advancement. The ASA recognizes these barriers and incorporates into its accreditation process the development of a national anesthesia case bank geared toward CME learners. Current efforts in EM to share and peer-review simulation cases through the Association of American Medical Colleges’ MedEdPortal (http://www.aamc.org/mededportal) and the SAEM Simulation Case Library (http://emedu.org/simlibrary) represent similar efforts that could be expanded as a resource for an EM-based simulation accreditation process.
While simulation-based training does not have a strong presence in CME courses, there is clearly interest in using simulation as a way for EPs to maintain their skills.12 Additionally, the American Board of Emergency Medicine is beginning to explore high-fidelity simulation as a potential medium for physician assessment.13 Vozenilek and Gordon12 proposed a simulation-based CME network designed to provide CME programming at the local, regional, and national levels. An accreditation process could serve to advance any efforts made toward development of a simulation CME program and, as stated above, increase the development and validation of high-quality cases. Procedural simulation CME would also be attractive as a method for obtaining hospital privileges.
Equipment and Infrastructure
Hardware (e.g., mannequins, square footage, audiovisual equipment) often plays a relatively small role in defining simulation excellence and is highly dependent on individual programmatic needs. Expensive, realistic simulators can provide high levels of physical fidelity. In the dynamic EM environment, however, psychological fidelity may be more important than advanced physical fidelity.14,15 Given resource constraints among many EDs, an EM simulation accreditation process should focus on the minimum equipment needed to run an effective scenario (mannequin, clinical setting, debriefing space) based on each program’s educational objectives. Rigid infrastructure and hardware criteria can significantly affect financial priorities, focusing budgeting decisions on hardware needs rather than on personnel and curricular requirements.
Because of the rapid growth of simulation technology and methodology, any accreditation or consultation process must be flexible enough to expand as needed, yet still meet the basic needs of smaller, independent, department-sponsored EM simulation programs. While a modular format such as SSH’s works well for a broad, all-encompassing accreditation process, a specialty-specific accreditation program in EM might fare better with a multilevel categorization system that can add components as needed. For instance, a single level of accreditation with minimum hardware and curricular requirements could be created initially (e.g., a single simulator used by qualified faculty). As EM-related CME and credentialing programs develop, multiple levels of accreditation might be necessary to further delineate simulation centers where a nationally endorsed curriculum and assessment platform is available. The goal of a tiered system is not to discourage participants from smaller centers, but rather to support and recognize the need for training at local, regional, and national levels. One such framework for a multilevel EM simulation accreditation program is presented in Table 3.
Table 3. Proposed Multilevel Framework for EM Simulation Accreditation

Level 1
Scope:
• Capable of delivering high-stakes assessment programs
• Includes objectives of Levels 2 and 3
Instructor Requirements:
• Maintains high-level instructor qualification to deliver high-stakes simulation exams at a national level
• Meets Level 2 requirements
Curricular Requirements:
• National curricular objectives and credentialing processes
• Meets Level 2 requirements
Oversight/Infrastructure:
• Program complies with the regulations put forth by relevant specialty, credentialing, or certification bodies (local, regional, national)
• Meets Level 2 requirements
Assessment:
• High-level credentialing standards, at least on par with existing national accreditation processes, must be met
• Meets Level 2 requirements

Level 2
Scope:
• Offers high-quality UME, GME, and CME training
• Conducts train-the-trainer courses and other regional and national conferences
Instructor Requirements:
• Program director maintains instructor credentialing for simulation faculty
• Instructors participate in train-the-trainer courses
• Meets Level 3 requirements
Curricular Requirements:
• Has a consistent process in place for curricular development
• Multidisciplinary courses are offered
• CME courses are offered on a regular basis
• A curriculum committee or equivalent provides direction and priorities
• Meets Level 3 requirements
Oversight/Infrastructure:
• An oversight committee or governing board exists to allow stakeholders from CME, UME, and GME departments the opportunity to impact the simulation program
• A clear mechanism exists for feedback regarding the simulation program at the departmental, institutional, and hospital (if appropriate) levels
• Meets Level 3 requirements
Assessment:
• Assessments performed at the CME level must meet practice standards, have rigorous oversight, and use evidence-based methodology when developing assessment tools
• Meets Level 3 requirements

Level 3
Scope:
• Offers high-quality training for emergency medicine learners
• Participates in local simulation-based training events as appropriate
Instructor Requirements:
• Program director is board eligible or board certified in emergency medicine
• Program director and simulation faculty have attended an instructor course and demonstrate a plan for continuing education and improvement on a yearly basis
• Program demonstrates a clear 360° evaluation plan for instructors
• Program provides a videotaped debriefing for review
Curricular Requirements:
• A clear, evidence-based curricular approach exists
• A 360° evaluation of the curricular approach is part of the simulation program
• Use of simulation technology is clearly linked to learning objectives
• A rationale for use of educational materials is provided
• Program provides one sample case with objectives, learner group, and required resources
Oversight/Infrastructure:
• Program director demonstrates a level of protected time commensurate with current and projected simulation activities
• Program director has a clear faculty development plan
• Process improvement and feedback mechanisms are clearly described
• Program’s operations are transparent
• Financial and personnel support match the program’s goals and objectives
• Equipment, space, and technology are appropriate for the program’s objectives
Assessment:
• All assessment tools are evidence-based and have a minimum of content validity
• Any high-stakes assessment done at the local level must be valid and have appropriate rater and instructor oversight
• Use of assessment tools is appropriate
Benchmarks for EM Involvement in Simulation Centers
A recent survey demonstrates a marked increase in the use of simulation by EM residency programs over the past several years.1 Additionally, more and more EPs are taking leadership roles in hospital- and university-based programs. This section outlines three distinct models of EM simulation programs, including the roles of EM faculty and the mechanisms of financial support. While there are likely unique programs that do not fit any one category, the majority of existing EM simulation programs are organized around one of these models.
Independent EM Simulation Programs
An independent EM-supported simulation program is defined as one where equipment ownership, educational instruction, and program administration all stem from a single department or division of EM (Figure 1). In this model, the simulation program would derive all of its leadership and direction from one or more designated EM faculty members. Program priorities and expenditures would relate directly to EM needs, with financial support coming directly from the sponsoring EM department or division. The advantages of such an arrangement are: 1) the ability to focus curricular development on EM-based education, 2) the flexibility to schedule and organize curricula in a way that works within current educational programming, and 3) the ability to focus limited resources on EM requirements. Even with a single-specialty focus, there are significant challenges associated with a free-standing EM simulation initiative. First, the administrative component often falls on faculty members, as most departments do not have the funds to support dedicated administrative time. While the administrative burden may initially be small, it can grow as more equipment and learners accrue. This can result in inefficiencies, where departments subsidize faculty time to perform administrative duties rather than curriculum development and teaching. Second, the financial burden of purchasing and supporting expensive simulation equipment falls directly on the supporting department. In many cases, the maintenance costs can become prohibitive and replacement funds nonexistent, resulting in lost educational time due to mannequin breakdown and malfunction. Finally, such an independent model is not inherently designed to support interdepartmental collaboration—although the collaborative nature of EM practice lends itself nicely to such work. This is worth noting in light of the recent focus on interdisciplinary training and multicenter educational research.
Whether one focuses on the pros or the cons of this model, it is important to note that many EM simulation initiatives started as independent programs, and their success is due to forward-thinking departmental chairs and residency leadership. As a result of such efforts, many EM faculty members have been instrumental in guiding universities and health care institutions in the development of their own large multidisciplinary simulation centers. EM faculty members now figure prominently in the leadership of several major university simulation programs.
Institutional Centers with EM Programs
In the past several years it has become more and more common for universities and large health care corporations to build simulation centers geared toward serving multiple health care disciplines and learners. This has been further encouraged by accreditation processes like that of the ACS, which require multilevel, multidisciplinary training. As a result, EM simulation programs at institutions with large simulation centers are often structured quite differently (Figure 2). In this model, the simulation center has its own leadership, administration, and support staff. The EM department or division is usually responsible for providing one or more trained faculty members (adequate training defined by the larger simulation center administration) to assist core simulation center staff with the construction and execution of simulation scenarios for EM training. The EM department does not own the mannequin, and financial responsibility for equipment, maintenance, and support personnel comes directly from the institution or primary center funding source. Often the individual departments are charged a yearly fee or per usage fee to help offset costs. This financial arrangement depends largely on the mission of the center (e.g., undergraduate medical education, GME, CME) and the flow of monies at that particular institution.
In an institutionally based model, EM simulation programs can benefit from the resources of the university by having access to equipment and instruction that could otherwise be financially prohibitive. Additionally, such a multidisciplinary structure naturally encourages collaborative efforts in teaching and research. Further research support is often available in the form of medical education research faculty and statistical support, both resources that are often unavailable at the individual departmental level. Unfortunately, the demand for these services can overwhelm even a large, well-structured center, resulting in competition for equipment time and personnel resources. Simulation center priorities can also fail to meet the needs of EM departments, especially if the governing body of the simulation center is not multidisciplinary in focus. It is important to monitor the progress of EM initiatives within larger simulation centers to ensure that EM faculty development, education, and research are sufficiently supported within the overall effort.
Satellite Simulation Model
The third model of EM simulation is a hybrid of the first two models, where a large central simulation center exists in concert with a satellite EM-based simulation program (Figure 3). Such a structure could exist in institutions where independent EM simulation programs preceded the construction of a larger, institution-based center. A satellite model may also arise when EM needs are not being met by the larger simulation center or when location is a barrier to training. The recent focus on in situ simulation raises the question of whether EM initiatives are best situated onsite, where faculty and trainees can access simulation exercises without leaving the clinical area.16,17 However, even with a satellite EM program, a great deal of financial responsibility rests with the EM department to purchase and maintain its own equipment. This model does afford the potential advantage of being able to collaborate with and utilize the resources of a larger, institutional simulation center, thus enhancing research efforts, large-scale CME courses, and multidisciplinary training. During such collaboration, administrative responsibilities could be borne primarily by the larger center, thus removing a significant burden from the individual EM department.
Regardless of the model used, it is clear that there has been remarkable growth of simulation use among EM departments.1 Additionally, the scope of EM simulation has expanded significantly to include new methodologies for training and assessment.9 If EM simulation programs are to continue to move forward, it will be important to work toward quantifying the financial and faculty costs of a successful EM simulation effort. Additionally, it would be useful to better understand and plan for the use of simulation work as part of an academic faculty portfolio.
Advancing an EM Simulation Consultation and Accreditation Process: One Model
SAEM currently sponsors consultation services to assist programs with establishing, funding, and operating simulation programs or centers. These services are administered through the SAEM Consultation Service and are designed to help faculty who are initiating simulation efforts at their institutions by providing guidance and recommendations from experienced simulation experts. At this point, the service has not been widely utilized, in part because SAEM simulation expertise is widely shared through individual visiting professorships and CME venues. Despite these mechanisms, new members in the SAEM Simulation Academy and the SSH Emergency Medicine Interest Group continue to report a need for guidance on program funding, curriculum development, and instructor training.
To meet the needs of the EM simulation community in a uniform and comprehensive way, it is important to continue to offer expertise in the form of a consultation service. Additionally, it is necessary to provide both simulation faculty and education stakeholders (deans, program directors, learners, health care institution leadership) with guidelines that outline the critical components of simulation-based training and delineate a path for meeting these standards. By setting standards for simulation, SAEM can provide clear goals for simulation programs. By providing a consultation service, SAEM can offer expertise to help programs meet their individualized goals. Both components seek to increase faculty and learner involvement in simulation-based education by providing evidence-based benchmarks, as well as mechanisms to reach those benchmarks. Since each program, institution, and learner base is unique, the individualized approach provided through the consultation service is extremely important and could significantly assist in the development of new programs or expansion of current simulation efforts.
The sections above make a clear argument for instructor training and support. Several SAEM members currently participate in and/or direct nationally recognized simulation instructor training programs. This level of expertise should be leveraged to raise the overall level of simulation training across EM programs and to ensure that all programs have the opportunity to meet the standards set forth in an accreditation program. One direct way to accomplish this is to incorporate simulation-based faculty development into an SAEM-sponsored symposium each year, perhaps during the annual meeting. Without such an easily accessible national venue, many programs will struggle to strengthen their simulation efforts. This jeopardizes the overall goal of advancing EM simulation-based education.
Accreditation Program/Setting Standards
An effective simulation program incorporates sound, evidence-based instruction, curriculum design, and curriculum implementation.18 While absolute standards of simulation excellence do not currently exist, there are clear guidelines and sound methodologies that can be incorporated into robust accreditation guidelines. Drawing on the examples outlined in this article, Table 3 proposes a model for accreditation standards for EM simulation. The proposed levels are theoretical, based on a future in which simulation-based credentialing (Level 1) becomes a standard model, as in aviation. These levels are intended to incorporate flexibility into the system, where new levels or sublevels can be added as requirements and technologies develop. Such an approach also allows new criteria to be added to existing levels as they become relevant. For instance, one could envision the need to define the research infrastructure and expertise necessary for a center to participate in multicenter simulation studies. The accreditation model could then be expanded to include research as an additional area of recognition.
Each area listed in Table 3 (instruction, curriculum, etc.) is discussed in the sections above. For the base level of accreditation (Level 3), criteria are broadly defined, as the purpose of an accreditation process is to ensure that what is being taught is methodologically sound, not to be prescriptive about what or whom to teach. This again illustrates the importance of the consultation service as a mechanism for helping individual programs clearly identify their learner populations and set reasonable goals within a set of best-practice guidelines.
No accreditation process, in isolation, can ensure a quality product. In fact, poorly conceived mandated standards can derail a productive, innovative program by forcing it to meet unnecessary or irrelevant requirements. The EM model outlined in Table 3 is intended as one example of a roadmap for EM programs to establish and maintain good simulation practice. As organizations further investigate the use of simulation in obtaining and maintaining certification, it will be important to identify centers capable of providing more advanced training and evaluation to meet a national training standard. When combined with the consultation service and a strong instructor training program, such an accreditation process has the potential to significantly advance simulation-based education and research in EM.
This article represents a review of the literature and explores potential avenues for simulation consultation and accreditation in EM. It is a preliminary overview and requires further exploration into the financial, personnel, and organizational components of a large-scale accreditation program. Additionally, it will be important to develop and support simulation instructor training and the consultation service in parallel. The establishment of the SAEM Simulation Academy is the first step to providing structure to these activities. This working group recommends the following next steps:
1. Continue to use existing SAEM infrastructure (consultation service, academies) to support simulation initiatives with colleagues across disciplines.
2. Build upon current simulation expertise to design and execute SAEM-sponsored simulation instructor courses.
3. Create a formal process whereby the SAEM board of directors and SAEM simulation stakeholders further review and consider the feasibility of an EM simulation accreditation process.