Medical Education in Review
How much evidence does it take? A cumulative meta-analysis of outcomes of simulation-based education
Article first published online: 9 JUL 2014
© 2014 John Wiley & Sons Ltd
Volume 48, Issue 8, pages 750–760, August 2014
How to Cite
Medical Education 2014; 48: 750–760 doi: 10.1111/medu.12473
- Issue published online: 9 JUL 2014
- Manuscript Accepted: 17 FEB 2014
- Manuscript Revised: 15 FEB 2014
- Manuscript Received: 23 DEC 2013
- Intramural funds
- Division of General Internal Medicine, Mayo Clinic
Studies that investigate research questions that have already been resolved represent a waste of resources. However, the failure to collect sufficient evidence to resolve a given question results in ambiguity.
The present study was conducted to reanalyse the results of a meta-analysis of simulation-based education (SBE) to determine: (i) whether researchers continue to replicate research studies after the answer to a research question has become known, and (ii) whether researchers perform enough replications to definitively answer important questions.
A systematic search of multiple databases to May 2011 identified original research evaluating SBE for health professionals in comparison with no intervention or any active intervention, using skill outcomes. Data were extracted by reviewers working in duplicate. Data synthesis involved a cumulative meta-analysis, which illuminates patterns of evidence by sequentially adding studies according to a variable of interest (e.g. publication year) and recalculating the pooled effect size with each addition. Cumulative meta-analysis by publication year was applied to 592 comparative studies using several thresholds of ‘sufficiency’: statistical significance; a stable effect size classification and magnitude (Hedges’ g ± 0.1); and precise estimates (confidence intervals narrower than ± 0.2).
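The cumulative procedure described above can be sketched in code. The example below is a minimal, hypothetical illustration only, not the authors' actual analysis: it assumes each study contributes a Hedges’ g and its standard error, sorts studies by publication year, and pools them one at a time with simple fixed-effect inverse-variance weighting (the paper's analysis may well have used a random-effects model). The function name, the input tuple format, and the per-step ‘sufficiency’ checks for significance and precision are all illustrative assumptions.

```python
import math

def cumulative_meta_analysis(studies):
    """Hypothetical sketch of a cumulative meta-analysis.

    studies: list of (year, g, se) tuples, where g is a Hedges' g
    effect size and se is its standard error.
    Returns one row per step: (year of newest study, pooled g,
    95% CI, significant?, precise?).
    """
    results = []
    ordered = sorted(studies)  # add studies in order of publication year
    for k in range(1, len(ordered) + 1):
        subset = ordered[:k]
        # Fixed-effect inverse-variance weights: w_i = 1 / se_i^2
        weights = [1.0 / se ** 2 for _, _, se in subset]
        pooled = sum(w * g for (_, g, _), w in zip(subset, weights)) / sum(weights)
        se_pooled = math.sqrt(1.0 / sum(weights))
        ci = (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)
        # Two of the 'sufficiency' thresholds from the text:
        # statistical significance (95% CI excludes zero) and
        # precision (CI half-width below 0.2).
        significant = ci[0] > 0 or ci[1] < 0
        precise = 1.96 * se_pooled < 0.2
        results.append((subset[-1][0], pooled, ci, significant, precise))
    return results
```

Plotting the pooled estimate and its confidence interval at each step of such a table is what reveals the pattern the study exploits: the year at which the estimate stabilises and its interval narrows marks the point after which further replications add little.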
Among studies that compared the outcomes of SBE with those of no intervention, evidence supporting a favourable effect of SBE on skills existed as early as 1973 (one publication) and further evidence confirmed a quantitatively large effect of SBE by 1997 (28 studies). Since then, a further 404 studies were published. Among studies comparing SBE with non-simulation instruction, the effect initially favoured non-simulation training, but the addition of a third study in 1997 brought the pooled effect to slightly favour simulation, and by 2004 (14 studies) this effect was statistically significant (p < 0.05) and the magnitude had stabilised (small effect). A further 37 studies were published after 2004. By contrast, evidence from studies evaluating repetition continued to show borderline statistical significance and wide confidence intervals in 2011.
Some replication is necessary to obtain stable estimates of effect and to explore different contexts, but the number of studies of SBE often exceeds the minimum number of replications required.