Medical simulation for professional development—science and practice
R Fox, Consultant Obstetrician, Maternity Unit, Taunton & Somerset NHS Trust, Taunton TA1 5DA, UK. Email email@example.com
Please cite this paper as: Fox R, Walker J, Draycott T. Medical simulation for professional development—science and practice. BJOG 2011;118 (Suppl. 3): 1–4.
From the earliest days of medical practice, when surgeons used cadavers to explore the possibilities of surgical intervention, simulation has been employed to advance the practice of health care. In the last 10 years, technological advances have allowed for a wider availability and greater realism of simulation, and this has encouraged a great expansion in its use. Simulation aims to create a virtuous cycle of professional development to improve patient outcomes. Although it seems eminently logical to believe that simulation will result in better outcomes, there is a need to test these new training interventions rigorously to be sure of their worth and to understand any limitations. The purpose of this BJOG supplement is to examine in depth several paradigms of medical simulation within maternity care and gynaecology, in different settings, looking at what can be achieved and how. In this opening review, we look at the potential use of medical simulation in broad terms and describe the types of evidence that can be employed to support its use.
Medical simulation mimics clinical care, allowing individual health professionals and teams to inculcate skills and cultures in preparation for safe and effective clinical care, all the while gaining confidence and becoming more efficient. Simulation is an educational method, not a facility or a technology. It can be as simple as an anatomical model of the perineum to learn suturing techniques, or as complex as a fully equipped training suite for high-fidelity multiprofessional team rehearsals of major trauma care (Table 1). No one simulation method covers all learning needs and they should be used together and in conjunction with other teaching modalities. Whatever the method, the core aims are to progress training, to improve clinical outcomes and to enhance the patient experience.1,2 Other benefits might include improved recruitment to our speciality,3 greater satisfaction with training, better teamworking4 and professional attitudes to safety.5 Moreover, as well as being a tool to enhance professional development, medical simulation might prove to be a means of formal assessment of skills for certification and revalidation.6
Table 1. Range of simulation techniques for training in health care
| Learning need | Simulation techniques |
|---|---|
| Interview techniques | Role play, role play with avatars |
| Physical examination | Patient actors, anatomical models |
| Clinical decision making | Computer gaming |
| Surgery | Cadaver sessions, virtual reality |
| Teamwork for emergencies | High-fidelity rehearsals |
| Patient interaction for crash emergencies | Patient actors within rehearsals |
| Major incident co-ordination | Computer gaming |
As for any intervention, it is crucial that the effectiveness of medical simulation as an educational tool, or as a means of assessment, is tested rigorously. There are substantial outlays in terms of organisation, equipment and investment of participants’ time. These costs must be properly justified in terms of improvement of clinical outcomes, clinical efficiency or trainee experience. In setting the scene for this BJOG supplement, which has been prepared for a national conference on medical simulation, we examine the science and practice behind simulation.
Potential for education and training
Theoretical need for trainees
Traditionally, training has occurred at the patient’s side in clinical sessions, but the competing pressures of increasing quality standards and the need for service throughput have made this less acceptable, particularly for the most junior of trainees. Combined with restrictions on hours worked, these factors have limited the opportunity for learning, especially the experience of practical procedures, such as laparoscopy and ultrasound.7 Simulation gives the healthcare trainee a means of developing and honing skills before caring for patients under supervision. This could include the trainee visiting the skills laboratory immediately before an operating list to attune dexterity, akin to the ‘knock-up’ before a tennis match. A virtual-reality study of repeated ultrasound measurements (femur length) has shown that the time performance of junior trainees improves to near-expert level in just five repetitions (C. Burden, pers. comm.). Even for more senior trainees, the simulation of skill-based clinical activity can be used to create a virtuous cycle of development: with simulation promoting efficiency, the trainer is able to allocate more real cases to the trainee, and issues encountered in clinical service can then be practised again in simulation, with proficiency increasing ever more quickly. Moreover, all this can be carried out within a nonpressurised environment, free from risk to patients and from clinical time constraints. Training through simulation has been described as fun, and some critics have asserted that this might detract from the learning experience. Others would counter that argument and propose that enjoyment in training has the potential to encourage engagement and sustained involvement. Medical simulation could perhaps be better thought of as ‘serious fun’.
Use for trainee appraisal
Data collated on individual performance in simulation can be made available for feedback to individual trainees to guide their own improvement. Examples could include systematic errors of measurement during simulated ultrasound, ergonomics of movement in simulated laparoscopic surgery, decision making for pre-eclampsia, nonperformance of safety checks during obstetric anaesthesia, etc.
Use for trainee assessment
If aptitude in simulation could be shown to equate with current clinical ability or to be predictive of future performance, these techniques might prove to be useful for trainee assessment, recruitment to training programmes, green-light status for direct patient care and licensing. To fulfil this role, very strict scientific testing would be required to understand clearly the boundaries between adequate and inadequate skill;6 overly narrow assumptions about what defines a good healthcare worker must be avoided.
Use after certification
Professional development does not stop with licensing, and simulation might well play an important role in continuing professional development. Surgeons might wish to practise skills for very rare operations, revitalise techniques after extended leave or develop expertise in procedures new to them. It is possible to imagine a role for simulation in revalidation, but, as with trainee assessment, problems might exist in describing precisely the criteria for the definition of ‘fitness to practise’.6
Use for developing teamwork and leadership
Emergencies often require the rapid mobilisation of an ad hoc multiprofessional team in order to avoid or limit harm.4 This applies particularly to maternity care, but also to gynaecology. Many of the emergencies are rare and difficult to predict; the first time a maternity specialist encounters eclampsia might be as the most senior doctor available.8 Simulated exercises have the potential to allow individuals to practise the management of rare emergencies within a team setting, and for local teams to analyse and adapt their own performance. One simulation study has suggested that variation in team efficiency is not determined by differences in conventional measures of knowledge, skill or attitude, suggesting perhaps that teamwork is a fourth dimension of professionalism.9 The development of teamworking might require nothing more than local teams practising together; a study of eclampsia rehearsals found no additional benefit of specific training in teamwork.4
Potential for scientific exploration
As well as offering novel approaches to training, simulation provides an opportunity to study the clinical behaviour of both individuals and teams. Audiovisual recordings of simulated clinical scenarios can be analysed to identify common errors in practice, which can be fed back into training programmes.10 It is also possible to explore the factors that do (and do not) influence aspects of teamworking, including team efficiency.11 Moreover, simulation proffers a means of testing training interventions. For the future, one could envisage random allocation trials that test clinical behaviour in simulation: different leadership styles, means of communication, number of team members, etc. Exploration of context is important; we must understand ‘how and why training programs work—and not simply whether or not they work’.12 For this type of research to be valid scientifically, there is a need to demonstrate that simulation is a true facsimile of real life and not simply a verisimilitude of health care.13 Thus far, absolute proof of this is lacking and therefore any conclusions drawn from these types of study require careful interpretation. Nevertheless, the realistic hope remains that, from such studies, we will be able to shape future training programmes.
Showing evidence of patient benefit
There is an important need to test whether medical simulation training programmes are effective (and cost-effective). Random allocation trials can be undertaken, but they do not lend themselves as easily to educational interventions as they do to pharmaceuticals. Quasi-experimental designs are probably necessary to test the benefits and problems of medical simulation. Moreover, because training often involves the motivational abilities of the teacher, it is important to test the implementation across a broad geographical area, and not simply in the unit of the innovator, who might have exceptional personal qualities that inspire trainee improvement. To have a large impact on health care, simulation must be effective in many units, not just one. The implementation of multiprofessional team rehearsals using simulated shoulder dystocia has been associated with both improvement and deterioration in clinical outcomes (compared with historical controls).14,15
Training interventions, such as medical simulation, can be tested at three levels—evidence of progressive improvement in simulation (Level 1, L1), transfer to true-life patient care practices (compliance with national standards) (L2) and improved patient and public health (L3).16 Although L1 and L2 outcomes might provide valuable information, particularly in the development phase of a new programme, we need to be clear in our minds that an intervention that does not improve clinical outcomes is not a good intervention in patient terms.17 The task of determining the impact on patient outcomes is probably less difficult for frequent isolated procedures undertaken by individuals with few contextual variables, such as perineal repair, than it is for rare unpredictable team-managed events, such as eclampsia.
Criteria for the success of simulation might also include some educational outcomes—efficiency of training (increased number of students on a programme, reduced time to certification, fewer students held back) and trainee satisfaction. The value of trainees achieving certification more quickly and with less stress should not be easily discounted. Success in these terms might be considered to be sufficient to justify the cost even if patient outcomes are left unaltered. Finally, as well as using science to test for quality improvement, there is an important practical need to better understand local factors that enhance and hinder the implementation of such projects.
That medical simulation is fast becoming an established part of obstetrics and gynaecology is irrefutable. That its implementation has been associated with variable outcomes is evident. It is important to improve our understanding of simulation through formal scientific study, to invest properly in those programmes that pass scrutiny and to discard those techniques that do not. Moreover, in part, we need to adapt our science to be better able to test how these methods translate into patient outcomes.
Disclosure of interests
TJD is a consultant to Limbs & Things Ltd, Bristol, UK, manufacturers of the PROMPT Birthing Simulator®. TJD is a member of the steering committee of PROMPT, a UK-based charity running training courses. He has no financial interest from this association. TJD has received payment for lectures from Ferring Pharmaceuticals Ltd, Drayton Hall, UK. JJW has received payment as an advisor to Alere Technologies, Waltham, USA. None of the authors own stock, or hold stock options, in any obstetric emergency training company. RF and JJW have nothing to disclose.
Contribution to authorship
RF and TJD developed the concept through discussion. RF authored the scope. TJD undertook the literature survey. RF and TJD authored and revised the text. JJW revised the text.
Details of ethics approval
Not applicable for this review.
Funding
TJD’s salary is funded by the Health Foundation, London, UK. JJW’s salary is funded by the HEFC.
Acknowledgements
Reviewed by Mark James, Consultant Obstetrician, Gloucester Royal Infirmary, UK and Kim Hinshaw, Consultant Obstetrician, Sunderland Royal Infirmary, UK.