Research Opportunities in Simulation-based Medical Education Using Deliberate Practice

Authors

  • William C. McGaghie PhD

    From the Augusta Webster, MD, Office of Medical Education and Faculty Development, Northwestern University Feinberg School of Medicine (WCM), Chicago, IL.

  • Presented at the 2008 Academic Emergency Medicine Consensus Conference, “The Science of Simulation in Healthcare: Defining and Developing Clinical Expertise,” Washington, DC, May 28, 2008.

Address for correspondence and reprints: William C. McGaghie, PhD; e-mail: wcmc@northwestern.edu.

Abstract

There are many opportunities for the academic emergency medicine (EM) community to engage in simulation-based educational research using deliberate practice (DP). This article begins by defining and giving examples of two key concepts: deliberate practice and mastery learning. The article proceeds to report six lessons learned from a research legacy in simulation-based medical education (SBME). It concludes by listing and amplifying 10 DP research opportunities in academic EM. A coda states that the research agenda is rich and ambitious and should focus on the goal of educating superb, expert clinicians.

This article amplifies an earlier report in Academic Emergency Medicine that sets forth an ambitious simulation research agenda in the specialty.1 The earlier report addressed the simulation research agenda as five broad categories: 1) simulation for education and training in emergency medicine (EM)—clinical experience and reflection, behavioral and team training, procedural simulation, computer screen-based simulation, and immersive environments; 2) simulation for evaluation and testing in EM; 3) special topics in EM—care process and organizational design, studying and improving performance, disaster management, and undergraduate medical education; 4) challenges in simulation-based research; and 5) future directions. The research opportunities identified in this article have a much narrower focus, i.e., the use of deliberate practice (DP) as an independent variable in the context of simulation-based education in EM.

This article reviews the following: DP, a mastery learning model, six lessons learned, and 10 DP research opportunities. It does not, however, address other important topics in simulation-based EM education, including high-stakes personnel evaluation, systems-based practice, and patient safety, which are covered in the companion articles in this thematic issue of Academic Emergency Medicine.

DP

Deliberate practice is an educational variable first described and evaluated by learning psychologist K. Anders Ericsson.2–5 DP endorses the idea that educational interventions must be strong, consistent, and sustained to promote lasting skill and knowledge attainment.6 Learners work hard when engaged in DP; it is not child’s play. DP is an evidence-based variable that works and is grounded in information processing and behavioral theories of skill acquisition and maintenance.2–5 It involves at least nine features that can contribute to simulation-based education in EM:

  1. Highly motivated learners with good concentration (e.g., EM residents);
  2. Engagement with a well-defined learning objective or task; at an
  3. Appropriate level of difficulty; with
  4. Focused, repetitive practice; that leads to
  5. Rigorous, precise educational measurements; that yield
  6. Informative feedback from educational sources (e.g., simulators, teachers); and where
  7. Trainees also monitor their learning experiences, correct strategies, errors, and levels of understanding, and engage in more DP; continuing with
  8. Evaluation to reach a mastery standard; and then
  9. Advancement to another task or unit.

The goal of DP in a simulation learning context is constant skill, knowledge, or professional improvement, not just maintenance of the status quo. Ericsson cites data from many research studies that underscore the “4/10 rule” about the development of expertise in all fields of endeavor. In brief, it takes 4 hours of DP every day for 10 years to become a “world-class” performer as an Olympic athlete, cutting-edge scientist, chess master, patient care provider, or writer.2–5 Continuous DP is also needed to maintain and improve one’s professional edge.4,5 To illustrate, despite his exceptional athletic ability, Michael Jordan shot 500 free throws every day throughout his professional basketball career.7
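In absolute terms, the 4/10 rule amounts to roughly 4 hours/day × 365 days/year × 10 years ≈ 14,600 hours of DP before world-class performance emerges (a back-of-envelope figure implied by the rule as stated here, not Ericsson's own estimate).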

The educational measurement feature of DP is especially critical. Rigorous educational measurement yields high-quality, reliable data (e.g., test scores, checklist recordings, rating scale numbers).8 Reliable data have a high signal-to-noise ratio (more signal, less noise) like a “loud and clear” radio reception versus one that is “scratchy” and difficult to discern. Reliable, trustworthy data are needed to give learners error-free feedback and as a foundation for rigorous quantitative research. Reliable data are essential to permit valid decisions or inferences about learners, either individuals or teams.9,10
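To make the idea of a reliability coefficient concrete, the sketch below computes Cronbach's alpha, one common index of internal-consistency reliability, from checklist scores. This is a minimal, hypothetical Python illustration; the data and function name are invented, not drawn from the studies cited here.

    import numpy as np

    def cronbach_alpha(scores):
        """Internal-consistency reliability for an (examinees x items) score
        matrix; values near 1.0 indicate a high signal-to-noise ratio."""
        scores = np.asarray(scores, dtype=float)
        k = scores.shape[1]                             # number of checklist items
        item_variances = scores.var(axis=0, ddof=1)     # noise within items
        total_variance = scores.sum(axis=1).var(ddof=1) # spread of total scores
        return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

    # Hypothetical checklist data: 5 examinees scored on 4 dichotomous items.
    checklist = [[1, 1, 1, 0],
                 [1, 1, 0, 0],
                 [0, 0, 0, 0],
                 [1, 1, 1, 1],
                 [0, 1, 0, 0]]
    print(f"Cronbach's alpha = {cronbach_alpha(checklist):.2f}")  # 0.80 here

A coefficient near 0.80 or above is commonly treated as adequate for formative feedback; higher standards apply when decisions about learners carry higher stakes.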

Several examples demonstrate the power of DP in simulation-based medical education (SBME). The first is a best evidence medical education (BEME) systematic literature review titled, “Features and uses of high-fidelity medical simulations that lead to effective learning.”11 The BEME review covers 670 journal articles, spanning most medical and paramedic specialties across 34 years from 1969 to 2003. Imposition of inclusion and exclusion criteria left 109 journal articles for review and qualitative synthesis. The primary outcome of this review was an inventory of 10 features and uses of high-fidelity medical simulations that lead to effective learning. The two most prominent features, ranked by citation frequency, are 1) provision of feedback during learning experiences and 2) repetitive practice. Other simulation features and uses that lead to effective learning include curriculum integration, adaptability of simulation to multiple learning strategies, and individualized learning.

A second example is based on a subset of 31 journal articles reporting 32 research studies within the final BEME pool of 109 articles. The 32 studies contained enough empirical data to permit a quantitative meta-analysis.12 The research studies were probed to address the question, “Is there an association between hours of simulation-based practice and standardized learning outcomes?” Data analysis demonstrated a highly significant achievement increase across five categories of simulation-based practice. More practice produced increasingly higher outcome gains, with a dose–response relationship between hours of simulator practice and standardized learning outcomes.
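To make the dose–response idea concrete, the sketch below fits a linear trend of standardized effect size against hours of simulator practice. The numbers are invented for illustration and are not the meta-analysis's actual data.

    import numpy as np

    # Hypothetical (hours of simulation-based practice, standardized effect size) pairs
    hours  = np.array([1.0, 3.0, 6.0, 9.0, 12.0])
    effect = np.array([0.30, 0.52, 0.79, 0.98, 1.10])

    # Least-squares slope: the estimated gain in effect size per practice hour;
    # a reliably positive slope is the dose-response signature described above.
    slope, intercept = np.polyfit(hours, effect, deg=1)
    print(f"~{slope:.3f} SD gained per additional hour of practice")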

Recent individual studies reinforce the findings about DP derived from the historical BEME reviews. In a cohort study, Issenberg and colleagues13 demonstrated large within-group gains and between-group differences in cardiac auscultation skill acquisition among fourth-year medical students and internal medicine residents. The intervention variable was DP on “Harvey,” a high-fidelity cardiology simulator, compared with clinical experience alone. In a randomized trial with a wait-list control condition, Wayne and colleagues14 demonstrated a 38% improvement in advanced cardiac life support (ACLS) skill acquisition among internal medicine residents after an 8-hour simulation-based ACLS curriculum grounded in DP. The control group replicated the improvement after crossover.

Mastery learning model

The mastery learning model originates in educational scholarship first expressed 45 years ago15 and elaborated in several subsequent writings.16–18 Mastery learning was introduced to medical education 30 years ago19 and is revisited in a recent publication.20 The seven features of the mastery learning model complement the DP construct; a schematic sketch of the practice-test-advance cycle follows the list below. Mastery learning involves:

  1. Baseline (i.e., diagnostic) testing;
  2. Clear learning objectives, sequenced as units ordered by increasing difficulty;
  3. Engagement in learning activities (e.g., deliberate skills practice, data interpretation, reading) focused on reaching the objectives;
  4. A minimum passing mastery standard (e.g., test score) for each educational unit;
  5. Formative testing to gauge unit completion against the preset mastery standard;
  6. Advancement to the next educational unit given measured achievement at or above the mastery standard; or
  7. Continued practice or study on an educational unit until the mastery standard is reached.
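A minimal, runnable sketch of the cycle formed by features 1 and 4–7: practice continues until a formative test meets the mastery standard, so outcomes are uniform and only time-to-mastery varies. All names and quantities here are hypothetical, invented for illustration.

    import random

    def mastery_unit(skill, passing_score, gain_per_session=5.0):
        """Simulate one mastery learning unit: practice continues until a
        formative test meets the preset mastery standard (features 4-7)."""
        sessions = 0
        score = skill + random.gauss(0, 3)       # feature 1: baseline (diagnostic) test
        while score < passing_score:             # feature 7: continue until standard met
            skill += gain_per_session            # feature 3: deliberate skills practice
            score = skill + random.gauss(0, 3)   # feature 5: formative testing
            sessions += 1
        return sessions                          # feature 6: ready to advance

    # Learners start at different baselines but all reach the same standard;
    # only the time needed (number of practice sessions) varies.
    for baseline in (40, 60, 80):
        print(f"baseline {baseline}: mastery after {mastery_unit(baseline, 90)} session(s)")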

The goal of mastery learning is to ensure that all learners accomplish all educational objectives with little or no outcome variation, although the amount of time needed to reach mastery standards for a unit’s educational objectives varies among learners. The mastery learning model can include other options and details when implemented in SBME. For example, mastery learning can address learning objectives beyond skill acquisition, including knowledge gains, affective qualities like self-confidence, and features of medical professionalism. Mastery educational objectives need not be confined to procedural skills. Mastery learning insists on a standardized curriculum for all learners, adaptable to individual time requirements, with uniform outcomes assured by rigorous measurement. The outcomes are gauged against high mastery achievement standards derived empirically by experts.

Three examples of standard setting toward the goal of mastery learning using DP in SBME have been published.21–23 Each study involved one or more panels of clinical experts to set mastery achievement standards using systematic methods. General guidelines and directions for establishing achievement standards in medical education are available from several other sources.24–26

The mastery learning model may be combined with DP using simulation technology to achieve the goal of clinical skill acquisition in medical education. In research by Wayne et al. demonstrating acquisition of ACLS skills,27 thoracentesis skills,28 and central venous catheter placement skills29 by internal medicine residents, all of the residents achieved mastery of all skill acquisition objectives. Approximately 15%–20% of the internal medicine resident trainees needed time beyond the minimum allocation (e.g., the 8-hour ACLS curriculum) to reach mastery standards, but the extra time needed was usually less than 1 hour.

Six lessons learned

The education research on SBME using DP and the mastery learning model done at Northwestern and other centers underscores at least six important lessons about medical education program development and evaluation research on program outcomes.

  1. DP can be a key feature of educational programs aimed at boosting skill and knowledge acquisition among medical learners at all levels. Given thoughtful curriculum design and management, DP can be used to provide strong, integrated, and sustained training opportunities for clinical skill acquisition, mastery, and maintenance.2–6 Brief, weak, or one-shot practice sessions do not qualify as DP. DP cannot be done on the cheap. Insufficient DP is revealed in SBME research studies as group-by-occasion statistical interactions: in studies comparing DP with unstructured clinical education, performance improves faster in the DP group.30,31
  2. Robust, sensitive measures yielding reliable data that permit valid decisions or inferences about medical learners are a key building block of SBME. Unreliable educational or research data are simply useless, either as feedback to learners or for research purposes. The reliable “signal” in educational data must exceed the “noise” of error by a wide margin. High data quality, judged chiefly by reliability coefficients, is essential for all assessment procedures in health professions education. Data reliability must be demonstrated routinely, not assumed, for all educational programs and for evaluation research studies on program results.
  3. Rater training and constant calibration are necessary for research studies that rely on observational assessment data.32–35 One cannot assume that faculty status and experience, without training and DP, are sufficient to ensure high-quality data.
  4. There is no correlation between medical knowledge, measured by USMLE scores, and clinical skill acquisition. This is a consistent finding in SBME research performed at Northwestern.14,27,28 The absence of an association between measures of medical knowledge and skill acquisition reinforces a research legacy showing that scholastic aptitude and professional performance are different.36
  5. Use of the mastery learning model depends, in part, on setting high passing standards using rigorous methods informed by professional judgment.21–26 Educational achievement standards derived empirically are useful, fair, and defensible.
  6. Data from self-assessments by nonexperts are often biased and show notoriously poor correspondence to objectively measured performance.37–39 Reliance on self-assessment data can produce “deceptive reflection” due to confirmation bias (the human tendency to focus on evidence that supports one’s a priori beliefs).40

Ten DP research opportunities

The previous sections on DP, mastery learning, and lessons learned lay the groundwork for the rest of this article, which sets out 10 DP research opportunities in simulation-based education in EM. The 10 research opportunities are idiosyncratic because they are based on personal preference and experience. Other investigators would propose different priorities. However, the list that follows may serve as a point of departure for scholars planning a research program in the field.

1. Whither Randomized Trials? Mastery Learning! There is little need for more randomized controlled trials to evaluate the effectiveness of DP using medical simulation for clinical skill acquisition compared to other educational approaches. In short, DP with medical simulation in a mastery learning context works.27–29 The medical education and research opportunities lie in engineering mastery learning environments that boost and maintain acquisition of clinical knowledge and skill and in evaluating the results rigorously.41

2. Stretch the Endpoint! Most SBME research studies evaluate outcomes at modest endpoints. Using the familiar Kirkpatrick hierarchy,42 outcomes are frequently assessed at Level 1 (Reaction, i.e., customer satisfaction), Level 2 (Learning, i.e., knowledge or attitude gains), or Level 3 (Behavior, i.e., skill improvement measured in the SBME laboratory). The field has matured to a point where outcome measurement must be stretched to Kirkpatrick Level 4 (Results, i.e., SBME learning improves patient care practices and outcomes). Several small studies have effectively linked SBME to improved quality of care.43–49 However, more rigorous research is needed to demonstrate conclusively that results from the simulation education laboratory transfer to actual patient care and to identify and study those conditions where transfer does not occur.

3. Study DP Quality. Research is needed to isolate and study DP variations, such as its conditions, intensity, duration, feedback features, and other moderator variables. DP comes in different flavors, and there are many research opportunities to sort out optimal variations on the DP theme, depending on the clinical skills and knowledge that different learners in the health professions (e.g., emergency physicians, nurses, technicians) need to acquire and maintain.

4. Research Design Features. In situations where comparative or correlational research designs are needed, investigators should use strong designs with rigorous measures and include large numbers of cases (i.e., individuals, teams) that produce generalizable results. This work must attend to established principles of medical education research design to be informative and yield robust results.50,51 One of the key, yet unexpected, findings of the BEME review11 is that most SBME research has employed weak and heterogeneous designs, flawed measures, small numbers of cases, and a host of other scientific sins that retard advancement of the field.
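One standard way to plan for the “large numbers of cases” urged above is an a priori power analysis. The sketch below is a hypothetical illustration, assuming the Python statsmodels package is available; it estimates the group size needed to detect a medium effect in a two-group comparison, using conventional defaults rather than values from the cited reviews.

    from statsmodels.stats.power import TTestIndPower

    # Sample size per group to detect a medium effect (Cohen's d = 0.5)
    # with 80% power at a two-sided alpha of 0.05.
    n_per_group = TTestIndPower().solve_power(effect_size=0.5, alpha=0.05,
                                              power=0.8, alternative='two-sided')
    print(f"~{n_per_group:.0f} cases per group")  # roughly 64 per group

Detecting smaller effects, or effects on clustered units such as teams, requires substantially larger samples, which is one reason underpowered single-site studies have retarded the field.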

5. Measurement Development. There is a great need to develop and refine new measures for use in SBME and research, measures that focus more on the objective performance of learners than on the subjective judgment of raters. Research in SBME featuring DP can only advance with the availability of rigorous measures that yield reliable data. This is the same argument Gagne made in an American Psychologist article, “Training devices and simulators: some research issues,”52 published over 50 years ago: “Perhaps first in importance is the ubiquitous problem of how to measure complex human performances . . . the need for a fundamental solution which seeks to account for the variances attributable to machine and man, and the relationships between behavioral processes and products, is pointed up clearly by the training device.”52

Development of solid measures of acquired knowledge,53 clinical skills,54 and attitudes55 requires rigorous, systematic development and pilot-testing procedures to ensure high-quality instruments.56 New clinical measures should assess response processes (e.g., radial arterial puncture for blood gas measurement, ACLS team responses to hospital “codes”) and response products (e.g., suture quality). Currently popular observational rating instruments should receive less emphasis as measures of clinical skills because their data are subject to biases that can dilute reliability and utility.34 New measures that capture haptic response data may hold promise because the data come directly from the trainee rather than being filtered through an evaluator’s sensorium.57,58

6. SBME Affective Consequences. SBME has strong affective consequences for medical learners. However, SBME affective consequences are rarely measured or studied scientifically. To illustrate, anecdotal reports suggest that sustained DP in a SBME context not only facilitates skill acquisition, but also boosts trainee morale and clinical self-efficacy.14,59 Investigators need to include such affective variables in their research designs to measure and study morale, self-efficacy, and team spirit rigorously. A sense of affective accomplishment is a key source of professional satisfaction. Perceived accomplishment may motivate individuals or teams to aim for skill and knowledge mastery, striving for excellence rather than mediocrity. Affective variables (e.g., mutual trust) are also key coordinating mechanisms in the formation and expression of teamwork competencies.60 More research on SBME affective outcomes is warranted.

7. Skill Maintenance. Meta-analysis demonstrates that professional skills decay rapidly without practice or use.61 However, several recent studies in SBME with DP demonstrate that acquired skills in ACLS62 and obstetric management of shoulder dystocia63 are remarkably resistant to decay beyond a calendar year. These studies warrant replication with fresh data sets and other clinical skills to determine whether their findings generalize.

8. Faculty Development. Research on faculty development is essential for introducing SBME featuring DP into clinical curricula and managing it efficiently and effectively. This is a key message of the 2007 Colloquium on Educational Technology sponsored by the Association of American Medical Colleges (AAMC).64 Issenberg65 reinforces the centrality of faculty development by arguing that SBME best practice is a multiplicative product of 1) simulation technology devices, 2) teachers [not necessarily physicians] prepared to use the technology to maximum educational advantage, and 3) curriculum integration based on an institutional commitment to SBME. Studies of faculty development training models are needed to prepare SBME teachers for new roles and competencies.

9. UTOST Model. Scholars of behavioral research methods teach that recognized or not, our science is grounded in the five-component UTOST Model: unit (individual, team) + treatment (SBME with DP) + observation (measurements) + setting (laboratory, clinic) + time (occasions).51 Each of these components can vary in many ways. Quantitative science advances when the variations are implemented systematically, in controlled settings, permitting rigorous studies that can be replicated elsewhere. Rigorous research methods lie at the heart of advancing the DP research agenda in SBME. Hammond66 articulates a scientific moral imperative: “Every method . . . implies a methodology, expressed or not; every methodology implies a theory, expressed or not. If one chooses not to examine the methodological base of his or her work, then one chooses not to examine the theoretical context of that work, and thus becomes an unwitting technician at the mercy of implicit theories.” Research methods must be studied, selected, and used judiciously.
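To make the five UTOST components concrete, the hypothetical sketch below records them explicitly in a study specification, so that a replication can vary one component at a time while holding the others fixed. The class and field values are invented for illustration, not taken from reference 51.

    from dataclasses import dataclass, replace

    @dataclass
    class UTOSTStudy:
        """One study specification with the five UTOST components explicit."""
        unit: str         # who is studied, e.g., an individual or a team
        treatment: str    # the intervention, e.g., SBME with DP
        observation: str  # the measurements taken
        setting: str      # where the study occurs
        time: str         # the measurement occasions

    original = UTOSTStudy(unit="individual resident",
                          treatment="SBME with DP, 8-hour curriculum",
                          observation="ACLS checklist score",
                          setting="simulation laboratory",
                          time="pretest and posttest")

    # A replication that systematically varies exactly one component (setting).
    replication = replace(original, setting="in situ hospital unit")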

10. Team Science. Contemporary science is rarely done by solitary scholars working alone. Instead, team science is normative today and is the source of high productivity and widespread impact. A distillate of team science research suggests there are eight attributes of productive academic teams:67,68 shared goals and a common mission; clear leadership that may rotate or change; high standards; sustained, hard work; physical proximity; the ability to minimize within-team status differences; the ability to maximize the status of the team as a whole; and shared activities that breed trust. Forming and operating research teams based on these principles and evaluating academic team outcomes rigorously will advance the SBME research agenda.

Coda

There is no shortage of research opportunities in SBME in EM featuring DP. The simulation education research agenda is rich and ambitious.69 The key is to keep focus on the goal of educating superb, expert clinicians70 and to continuously study ways to improve the enterprise.

Acknowledgments

The author is indebted to K. Anders Ericsson, James Gordon, and S. Barry Issenberg for helpful comments on an earlier draft of the manuscript.
