• formative assessment;
  • debriefing;
  • simulation;
  • medical education;
  • formative evaluation


Abstract

The authors present a four-step model of debriefing as formative assessment that blends evidence and theory from education research and the social and cognitive sciences with experience drawn from conducting over 3,000 debriefings and from teaching debriefing to approximately 1,000 clinicians worldwide. The steps are to: 1) note salient performance gaps related to predetermined objectives, 2) provide feedback describing the gap, 3) investigate the basis for the gap by exploring the frames and emotions contributing to the current performance level, and 4) help close the performance gap through discussion or targeted instruction about principles and skills relevant to performance. The authors propose that the model, designed for postsimulation debriefings, can also be applied to bedside teaching in the emergency department (ED) and other clinical settings.

Emergency medicine (EM) educators seek efficient strategies to help residents and medical students address and even master the complex clinical, social, and logistical challenges of practicing medicine in the emergency department (ED). Formative assessment, the process of providing individually tailored doses of feedback to students on their performance, is a concrete, effective way to provide this help. In simulation-based education, postscenario debriefing is an ideal forum for formative assessment. To describe and illustrate debriefing as an effective approach for formative assessment, this article draws on the existing evidence in the education, social, and cognitive science literature, as well as the authors’ experience in over 3,000 debriefings and in developing a curriculum used to teach debriefing to approximately 1,000 clinicians worldwide.1

Debriefing and Formative Assessment


Debriefing as formative assessment is a highly interactive process in which skills and understanding are not simply dispassionately assessed by the instructor, but in which new insights are cocreated in a dialogue between instructor and students. Debriefing and formative assessment, although their origins are very different, serve many of the same purposes. Debriefing, or after-action review, originated in the military practice of reviewing a mission for lessons that might improve the next one.2–4 Formative assessment emerged in the classroom as a way to evaluate curricula or students and provide midcourse feedback and correction.5,6


Reflecting on one’s own clinical or professional practice is a crucial step in the experiential learning process. It helps learners develop and integrate insights from direct experience into later action.7,8 Debriefing gives clinicians the opportunity to reflect after participating in a simulated case. Some of the important goals and processes of debriefing or after-action review are to help participants understand, analyze, and synthesize what they thought, felt, and did during the simulation to improve future performance in similar situations. Achieving these goals usually involves a series of steps such as naming and processing emotional reactions, analyzing the social and clinical aspects of the situation, generalizing to everyday experience, and, importantly, shaping future action by lessons learned.2,4,9–13 Effective debriefers are neither harshly judgmental nor falsely “nonjudgmental”; they neither berate students nor sugar-coat or camouflage criticisms. Rather, they provide clear, honest critiques in a way that is respectful and curious about the student’s perspectives.14,15

Formative Assessment

Sometimes known as assessment for learning, formative assessment is contrasted with summative assessment, often characterized as assessment of learning. Three reviews on assessment,16–18 as well as a range of other empirical work,19–21 provide the following insights: summative assessment is relatively infrequent, usually involves grades or formal ratings, occurs at the end of the training period, and is associated with high stakes (such as advancing or not advancing to the next stage of training, or being certified or not certified). If the stakes are high, the assessment is usually standardized to ensure valid and reliable results. Summative assessment provides implicit feedback on where the student stands and may prompt changes in the student’s knowledge or behavior, especially through the process of studying for the exam. Formative assessment, in contrast, is ideally conducted separate from grades or formal ratings, occurs throughout the training period, is relatively frequent, involves lower stakes such as feedback on subtasks of a profession or skill set, and is tailored to the individual learner. Formative assessment can be “convergent,” testing whether the learner has achieved a specific, predetermined objective, or “divergent,” exploring what the learner knows.21 A key feature of formative assessment in both classroom and experiential contexts is that it provides feedback to the student with the goal of improving current performance.16

What is “formative” about formative assessment? The term “formative evaluation” entered the educational literature in the late 1960s with the work of Bloom,5 who was interested in providing students with “feedback and correctives at each stage of the teaching-learning process,” and Scriven, who applied it to ongoing improvement of education curricula.6 Formative assessment “forms” trainees in two ways: 1) it shapes skills and knowledge through feedback, and 2) it helps develop professional identity through the social interaction of learning conversations. While both instructors and students often understand the main curriculum of improving clinical skills or teamwork, the “hidden curriculum” of assessment includes implicit feedback about how well the trainee is performing a new professional role, such as being a doctor.22 Summative assessments, such as certification or graduation from medical school, grant students a formal identity as a physician. Formative assessment in learning conversations such as debriefing allows students to enact provisional or aspirational identities as a doctor. The respect or disdain instructors convey to the trainees can help form or undermine the trainee’s identity as a valued clinical colleague.22–24

Debriefing as Formative Assessment: the Role of “Cognitive Detective.”  To understand the process of conducting a formative assessment through debriefing, key findings from cognitive science, social psychology, and anthropology about how people perceive reality are useful. People make sense of the external clinical and social environment through cognitive “frames,” which are internal images of external reality.25–28 Terms for these images are myriad: “frames of reference,” “schemata,” “mental models,” etc. People engage in “sense-making” about external reality in which they actively filter, create, and apply meaning to their environment using frames.29,30

Frames shape the actions people take. Clinical frames and social or interpersonal frames play crucial roles in medical decision-making. For example, an emergency physician (EP) facing a patient with decreasing oxygen saturation will take one set of actions if she frames the signs as due to an airway obstruction and another if her working diagnosis is a tension pneumothorax. A nurse who holds the frame that reporting an error will lead to punishment is likely to report errors at a very different rate than one who believes the report will be used to improve work processes.31

The heart of debriefing as formative assessment is to investigate the frames that underlie a performance gap. (This approach is like convergent formative assessment in that it assesses whether the trainee can achieve predetermined objectives. It is like divergent formative assessment in that it explores deeply what the trainee thinks and how he or she came to produce certain behaviors in a scenario.21) Just as Sherlock Holmes used visible clues to uncover crimes, the debriefer works backward from an observed performance gap to discover what frames (assumptions, goals, knowledge base) drove the actions contributing to that gap (Figure 1).


Figure 1.  Map of the performance gap. Frames are invisible but can be discovered through questioning; they drive trainees’ actions. Actions (including speech) are observable. The performance gap is the difference between the desired actions and the trainee’s actions during the scenario (the gap can be an increment or decrement).


Consider an EM resident serving as the team leader while caring for a simulated patient with a gunshot wound. She holds the frame, “Everyone on my team knows that a patient with a gunshot wound to the chest is at risk for a hemo- or pneumothorax.” She fails to state a management plan for the patient, even as her team members appear to pursue every task except for needle decompression or chest tube placement. The model in Figure 1 suggests that people’s actions, including those of the trauma team leader, are an inevitable result of how they frame the situation.

Mistakes are usually the result of “intendedly rational” actions.27,32 In the trauma team scenario, the EM resident calls for a chest tube but does not notice when no one repeats back her request. As the patient decompensates and eventually loses vital signs, other team members remain focused on relatively lower priority tasks: the tech cuts off the patient’s clothes, the surgeon calls for blood, and the nurse obtains intravenous access. The resident does not explicitly declare the crisis, its suspected cause, and an action plan (“I think we have a tension pneumothorax and we need to decompress the chest now.”). The resident believed that the team members knew that a hemo- or pneumothorax was the main risk, assumed that people were as concerned and vigilant about it as she was, and therefore thought that she didn’t need to explicitly state the need for a needle decompression or ask the surgeon to prepare to place a chest tube.

To conduct a formative assessment of the trainee’s assumptions and knowledge, the instructor uses inquiry to bring frames to the surface and then analyzes their impact on the trainee’s actions. Once frames have been clarified, the instructor helps the trainee craft new frames, such as “Just because the clinical situation is obvious to me doesn’t mean others see it the same way. I have to reveal my thinking, especially when I’m the team leader.” The instructor also works with the team to develop new, more effective actions for the future.

Debriefing as Formative Assessment: the steps


1. Creating a Context for Learning: How to Introduce Debriefing as Formative Assessment

Studies from the domains of education, psychotherapy, and team learning indicate that three factors help create a debriefing context friendly to the conversational probing needed for formative assessment: clarifying the debriefing process, creating psychological safety, and articulating assumptions that actively support the aspirational identities of participants as competent doctors. First, trainees need to know the rules of participation and what is expected of them. Introducing and explaining the concepts of debriefing increases both the amount and depth of participation.4 Setting up a simple learning contract—an agreement that the trainees will think deeply about their performance and that the instructor will provide feedback on it—improves adults’ collaboration and satisfaction with the learning process.33 Second, creating a psychologically safe “container” or learning environment enhances people’s willingness to examine and discuss their frames and actions critically and openly.34,35 “Psychological safety” means that students perceive the environment as predictable and secure enough to describe and scrutinize their thoughts, motivations, and goals. This feeling of psychological safety is not the same as feeling comfortable. Rather, psychological safety allows participants to take interpersonal risks36 and permits instructors to explore difficult topics.
Third, instructors create an environment conducive to learning when they validate the trainee’s tacit desire to be respected as an aspiring member of the profession.22–24 The instructor may do this by granting trainees the benefit of the doubt and assuming that they have the basic aptitudes, training, and ambition to succeed.19 Tenets of family psychotherapy suggest that an explicit statement by the faculty to this effect is useful but must be matched by a respectful and curious approach to students’ problems and successes during the simulation.37 At the Center for Medical Simulation, we achieve this by posting the following “basic assumption”: “We believe everyone participating in activities at the Center for Medical Simulation is intelligent, well-trained, cares about doing their best, and wants to improve.”

2. Have Objectives in Mind

Formative assessment in debriefing relies on predetermined learning objectives and/or learner-generated objectives in three ways: 1) objectives provide the desired performance level against which actual trainee performance is compared; 2) objectives allow for clear feedback that describes the gap (below or above) between actual and desired performance levels (such feedback, if specific and actionable, is one of the strongest predictors of improved performance21,38,39); and 3) objectives guide the instructor to develop short (2- to 5-minute) didactic lectures describing the current evidence base or best practices.

Effective objectives are specific, are likely to be observed during the scenario, and are relatively easy to assess. Ineffective objectives are vague, difficult to observe, and relatively hard to assess (see Table 1). Occasionally, student concerns will come in the form of complaints or attributions about other people in the scenario. These should not be dismissed, but rather recast to foster productive inquiry; complaints reveal what is important to students.40

Table 1.  Effective and Ineffective Objectives to Support Formative Assessment

  • Ineffective (too vague): Good team leadership
    Effective (specific): Someone articulates clear clinical priorities to focus the team’s effort
  • Ineffective (too vague): Effective clinical management of the case
    Effective (specific): Team performs a needle decompression quickly and follows with chest tube placement if indicated
  • Trainee concern: “The surgeon was a jerk!”
    Effective (reframed by the instructor): Know ways to acknowledge and defuse another provider’s frustration (in this case the surgeon) and refocus his or her attention

3. The Debriefing

Debriefings are rewarding and interesting and lead to higher levels of retention when trainees actively think about, analyze, and discuss what happened.41 Debriefing usually includes the following steps: a reactions phase in which trainees “blow off steam” and the instructor gets a first glimpse of what is most concerning to trainees, an analysis phase in which the instructor and trainees discuss and analyze trainees’ performance, and a summary phase in which trainees distill lessons learned for future performance. The analysis phase is the primary occasion for formative assessment, but the reactions phase is crucial to setting the stage.9–13

Reactions Phase.  The main goal of the reactions phase is to allow trainees to express their initial emotional reactions to the simulation; a secondary goal is for the instructor to provide facts underlying the simulation so that trainees are not confused about what happened. As part of a formative assessment, the reactions phase offers precious insights into what was most exciting or troubling for trainees and allows the instructor to focus on both her own objectives and those generated by the learners. Following the tenets of adult learning theory, the instructor should weave one or two of these learner-centered topics into later conversation or address them directly.33,42 Expressions of regret or exhilaration are common and are linked to trainees’ aspirational identities as clinicians. Instructors can allay trainees’ worries by demonstrating respect for them, by sharing information about how they compared to other teams, and by showing sincere interest in trainees’ thoughts and analyses.

The Analysis Phase.  The fulcrum of the formative assessment in debriefing is the analysis phase, which includes the four steps outlined in Figure 2. The analysis phase also usually includes a discussion to generalize lessons learned to a trainee’s real-world context.13,43 In an analysis phase geared to formative assessment, instructors: 1) note salient performance gaps related to the predetermined objectives (these can be performance decrements or increments), 2) provide feedback on the gap by describing what they observed in a respectful but direct manner, 3) investigate the basis for this gap by exploring the frames (and sometimes emotions) that contributed to the current performance level, and 4) help close (or, when trainees perform above expectations, help the group learn from) the performance gap by discussing principles and skills relevant to performance in the case.


Figure 2.  Steps of formative assessment in medical education.


Consider again the example of the trauma team trying to manage the patient with a gunshot wound. Table 2 and the following text illustrate each step of the formative assessment process. In Step 1, depicted in Table 2, the instructor observes a gap between desired and actual performance. The instructor’s objective (and desired level of performance) is that trainees master the communication skills needed to organize and focus a team in the often stressful setting of trauma resuscitations. The instructor would like to see the team leader, Dr. Andre Allen (a pseudonym), declare a crisis explicitly, vocalize possible causes, and state an action plan. The actual performance she observes includes none of these, although some important clinical tasks are carried out. Next, the instructor shares her observation with the team to execute Step 2 of the process, providing feedback on the performance gap (failure to declare a crisis, possible causes, and an action plan). For feedback to be helpful, it must be accurate and timely.21,38,39 Note that in Step 2, depicted in Table 2, the instructor expresses her concern directly, respectfully (i.e., without sarcasm, meanness, or sugar-coating), and from her first-person perspective. She then immediately moves to Step 3a, investigating the basis for the gap by eliciting the trainee’s perspective to uncover his frames.

Table 2.  Illustration of Four Steps of Formative Assessment in Debriefing
(ATLS = advanced trauma life support; GSW = gunshot wound.)

Step 1. Observe gap between desired and actual performance.
  Debriefer’s thinking: Desired performance: mastery of communication strategies to organize and focus a team under stress; declare a crisis explicitly; vocalize possible causes; state an action plan; timely decompression of the chest. Actual performance: no coordinated effort by the team and no timely decompression of the chest.

Step 2. Provide feedback on the performance gap.
  Debriefer’s thinking: I want to focus trainees’ attention on key objectives and establish clinical consequences. Clinical consequences show why the lack of explicitly organizing the group matters.
  Debriefer (to team leader, Dr. Andre Allen): “As the patient abruptly decompensated and then lost vital signs, Andre, I didn’t hear an explicit statement of what could be causing it (either a tension pneumothorax or massive blood loss) and what the team’s immediate priority should be. I am concerned that not stating a plan may have contributed to the delay in decompressing the chest. It looked to me like the team did not focus on that single high-priority action.”

Step 3a. Investigate basis for the performance gap.
  Debriefer’s thinking: I want to start the detective process of formative assessment; what frames drove Andre’s actions?
  Debriefer: “So I was thinking an ‘out loud action plan’ would have been helpful. I wonder what you think about this?”
  Trainee (Dr. Allen): “Well, honestly, I thought everyone who has taken ATLS knows that the most likely injury to a patient with a GSW to the chest would be a pneumothorax or hemothorax from large-vessel injury, especially when I stated there were decreased breath sounds on the left side of the chest and asked for a chest tube. The rapid decompensation made a tension pneumothorax an obvious cause we could address right away.”

Step 3b. Investigate deeper into the basis for the performance gap.
  Debriefer’s thinking: It sounds like an “invisible driver” of Andre’s performance was that he believed the cause of the decompensation was obvious to other team members. To Andre, it seemed everyone would know what to do. By sharing evidence that other team members were not on the same page, I may start the process of closing the performance gap by provoking or disconfirming Andre’s thinking on this point.
  Debriefer: “I hear you saying that a tension pneumo- or a hemothorax seemed like the obvious thing to focus on, yet as I watched the case, I saw team members working on many other tasks and not decompressing the chest. Let’s take a look at the video.” (Turning to other team members after showing a short clip of video:) “What was on your mind?”
  Tech: “I figured Dr. Allen would take care of the pneumo, so I focused on cutting the clothes off.”
  Surgeon: “Well, I was worried about a hemothorax and blood loss, so I wanted blood in the room in case we needed it. I thought Andre would needle the chest and that I should focus on getting blood ready for transfusion before preparing the chest tube.”
  Nurse: “I got busy hanging fluids and then was focused on the deteriorating vital signs. I forgot about the chest tube at that moment.”

Step 4a. Help close the gap through discussion and didactics.
  Debriefer’s thinking: Through discussion: I want to highlight the contrast between Andre’s frame (“It’s obvious”) and each team member’s focus. This is a way to start generating new frames and actions to improve performance next time.
  Debriefer (to other team members): “I hear Andre and others saying the pneumo or hemo was on their minds, yet no one needled the chest or got the chest tube. What would it have taken to get you focused on these tasks?”
  Nurse: “It would have helped focus us on priorities to hear Andre say: ‘I think the patient has a tension pneumothorax and the first priority is to needle the chest…’ We all know about GSWs and tension pneumothorax, but in the heat of the moment it is really hard to process all the information and get things done!”

Step 4b. Help close the gap through discussion and didactics.
  Debriefer’s thinking: Through discussion, establish: 1) what the team knows about stress and cognitive processing and 2) how to establish a shared mental model.
  Debriefer: “OK, you have some great ideas about why this situation was hard and suggestions about how to deal with it. A couple of good ideas I heard were: 1) it’s hard to think in the heat of the moment and 2) state priorities out loud in situations like that so we’re all thinking along the same lines. Is that a fair summary?”
  Trainees: “Yes, those things would have been good, but there’s another thing. Usually we get everything done smoothly and we don’t have to talk a lot. It was this situation with all the urgency and noise that made us not perform as well as usual.”

Step 4c. Help close the gap through discussion and didactics.
  Debriefer’s thinking: Through didactics: I’m going to share my previously rehearsed short “lecturette” on how to organize and direct a team under stress.
  Debriefer: “I’d like to say a little bit about what the literature tells us about stress and team performance. How does the event manager ‘defibrillate’ a team and make sure the team grasps your priorities? Well, when people are under stress, you need to give clear, simple directions. That is because when autonomic arousal increases to a certain level, cognitive efficiency decreases. People do not process at a very high level when stressed. You cannot assume they will put 2 + 2 together the same way you do. Another advantage of an explicitly stated plan is that it provides the basis for a shared mental model of the problem. A shared mental model helps people all focus on the same priorities.”

When the instructor learns that the trainee thought it would be completely obvious to his colleagues that a pneumo- or hemothorax was the problem, she further investigates the performance gap (depicted in Step 3b of Table 2). The instructor hypothesizes that others were not focused on decompressing the chest (although they may well have been aware this was a top priority). She shows a video clip (where no one is doing anything related to decompressing the chest) and uses this to elicit other team members’ frames and to challenge the team leader’s frame that everyone knew what to do. Once the team and its leader realize that most members were focused on issues other than chest decompression, the team’s attention is turned toward closing the performance gap (Steps 4a and 4b of Table 2).

Since knowledge retention is enhanced by engagement, participating in a discussion, rather than listening to didactics, is a more potent way for learners to develop prescriptions for what to do in the future. The instructor should attempt to elicit the group’s relevant knowledge and work with the group to synthesize ideas for improving performance (Steps 4a and 4b of Table 2). In Step 4c, the instructor augments ideas generated by the group with evidence from the existing literature about stress and team behaviors. If knowledge deficits contributed to the performance gap, the instructor could provide brief didactics targeted to immediate learning needs. It is essential for instructors to design and pilot scenarios that link learning objectives with the expected actions in the simulation; they can also review relevant studies to be prepared with short didactic pieces like the one in Step 4c. Once the analysis phase is complete, the debriefing moves on to the summary phase.

The Summary Phase.  In accordance with Kolb’s theory of experiential learning,7 the purpose of the summary phase is to distill lessons learned from the debriefing into memorable rules-of-thumb or concepts that trainees can take with them to improve their practice. This step helps codify the insights developed in the formative assessment through the analysis phase. A simple and effective approach is to have trainees review the simulation and debriefing by asking, “What went well in that simulation? What things would you like to repeat in the future?” After allowing discussion, the next question might be, “Given another simulation like this, or a similar real situation, what would you do differently?” Health care trainees often are bashful about citing things they did well, so the instructor should be prepared to name and illustrate how the team was effective.


Conclusions

We have presented a model of formative assessment with an emphasis on feedback to be used in debriefing EM simulations. Effective debriefings shed light on the underlying drivers of performance, and with these drivers illuminated, instructors can target discussion and teaching to what is meaningful to learners. While the four-step approach to formative assessment is presented in the context of postsimulation debriefings, the approach can also be used in short ad hoc learning conversations immediately after a patient encounter in the ED to encourage reflection and promote deeper learning.


References
  • 1
    Gordon JA, Cooper J, Simon R, Raemer D, Rudolph J, Gray M. The Institute for Medical Simulation: a new resource for medical educators worldwide [abstract]. Anesth Analg. 2005; 101(Suppl 6S):S22.
  • 2
    Darling M, Parry C, Moore J. Learning in the thick of it. Harv Bus Rev. 2005; 83:84–92.
  • 3
    Morrison JE, Meliza LL. Foundations of the After Action Review Process. Special Report 42. United States Army Research Institute for the Behavioral and Social Sciences, 1999.
  • 4
    Dismukes RK, Smith GM. Facilitation and Debriefing in Aviation Training and Operations. Aldershot, UK: Ashgate, 2001.
  • 5
    Bloom BS. Some theoretical issues relating to educational evaluation. In: Tyler RW (ed.). Educational Evaluation: New Roles, New Means. Chicago, IL: University of Chicago Press, 1969:26–50.
  • 6
    Scriven M. The methodology of evaluation. In: Tyler RW, Gagne RM, Scriven M (eds.). Perspectives of Curriculum Evaluation. Chicago, IL: Rand McNally, 1967:39–83.
  • 7
    Kolb DA. Experiential Learning: Experience as the Source of Learning and Development. Englewood Cliffs, NJ: Prentice-Hall, 1984.
  • 8
    Schön D. Educating the Reflective Practitioner: Toward a New Design for Teaching and Learning in the Professions. San Francisco, CA: Jossey-Bass, 1987.
  • 9
    Lederman LC. Debriefing: toward a systematic assessment of theory and practice. Simul Gam. 1992; 23:145–60.
  • 10
    Thiagarajan S. Using games for debriefing. Simul Gam. 1992; 23:161–73.
  • 11
    Baker AC, Jensen PJ, Kolb DA. In conversation: transforming experience into learning. Simul Gam. 1997; 28:6–12.
  • 12
    Raphael B, Wilson JP. Psychological Debriefing. Cambridge, UK: Cambridge University Press, 2000.
  • 13
    Fanning RM, Gaba DM. The role of debriefing in simulation-based learning. Simul Healthc. 2007; 2:115–25.
  • 14
    Rudolph JW, Simon R, Dufresne RL, Raemer DB. There’s no such thing as a “non-judgmental” debriefing: a theory and method for debriefing with good judgment. Simul Healthc. 2006; 1:49–55.
  • 15
    Rudolph JW, Simon R, Rivard P, Dufresne RL, Raemer DB. Debriefing with good judgment: combining rigorous feedback with genuine inquiry. Anesthesiol Clin. 2007; 25:361–76.
  • 16
    Black P, Wiliam D. Assessment and classroom learning. Assess Educ. 1998; 5:7–74.
  • 17
    Crooks TJ. The impact of classroom evaluation practices on students. Rev Educ Res. 1988; 58:438–81.
  • 18
    Natriello G. The impact of evaluation processes on students. Educ Psychol. 1987; 22:155–75.
  • 19
    Torrance H, Pryor J. Investigating Formative Assessment: Teaching, Learning and Assessment in the Classroom. Maidenhead, UK: Open University Press, 1998.
  • 20
    Harlen W, James M. Assessment and learning: differences and relationships between formative and summative assessment. Assess Educ: Principles, Policy and Practice. 1997; 4:365–77.
  • 21
    Hattie J, Jaeger R. Assessment and classroom learning: a deductive approach. Assess Educ: Principles, Policy and Practice. 1998; 5:111–25.
  • 22
    Pryor J, Crossouard B. A socio-cultural theorisation of formative assessment. Oxf Rev Educ. 2008; 34:1–20.
  • 23
    Ibarra H. Provisional selves: experimenting with image and identity in professional adaptation. Admin Sci Q. 1999; 44:764–91.
  • 24
    Foldy EG. Being all that you can be: identities and interactions in organizations. In: Proceedings of the Academy of Management. Seattle, WA, August 3–6, 2003.
  • 25
    Bartunek JM. Changing interpretive schemes and organizational restructuring: the example of a religious order. Admin Sci Q. 1984; 29:355–72.
  • 26
    Gentner D, Stevens AL. Mental Models. Hillsdale, NJ: Lawrence Erlbaum Associates, 1983.
  • 27
    Argyris C, Putnam R, Smith DM. Action Science: Concepts, Methods and Skills for Research and Intervention. San Francisco, CA: Jossey-Bass, 1985.
  • 28
    Watzlawick P, Weakland JH, Fisch R. Change: Principles of Problem Formation and Problem Resolution. New York, NY: Norton, 1974.
  • 29
    Weick KE. Sensemaking in Organizations. Thousand Oaks, CA: Sage Publications, 1995.
  • 30
    Weick KE, Sutcliffe K, Obstfeld D. Organizing and the process of sensemaking. Org Sci. 2005; 16:409–21.
  • 31
    Edmondson AE. Learning from mistakes is easier said than done: group and organizational influences on the detection and correction of human error. J Appl Behav Sci. 1996; 32:5–28.
  • 32
    Snook SA. Friendly Fire: The Accidental Shootdown of U.S. Black Hawks Over Northern Iraq. Princeton, NJ: Princeton University Press, 2000.
  • 33
    Knowles MS, Holton EF III, Swanson RA. The Adult Learner: The Definitive Classic in Adult Education and Human Resource Development. 6th ed. Burlington, MA: Elsevier, 2005.
  • 34
    Billow RM. Relational Psychotherapy: From Basic Concepts to Passion. London, UK: Jessica Kingsley Publishers, 2003.
  • 35
    Edmondson A. Psychological safety and learning behavior in work teams. Admin Sci Q. 1999; 44:350–83.
  • 36
    Schnarch DM. Constructing the Sexual Crucible: An Integration of Sexual and Marital Therapy. New York, NY: WW Norton & Co., 1991.
  • 37
    Satir V. Conjoint Family Therapy. 3rd ed. Palo Alto, CA: Science and Behavior Books, 1983.
  • 38
    Issenberg SB, McGaghie WC, Petrusa ER, Gordon DL, Scalese RJ. Features and uses of high-fidelity medical simulation that lead to effective learning: a BEME systematic review. Med Teach. 2005; 27:10–28.
  • 39
    Van de Ridder JM, Stokking KM, McGaghie WC, Ten Cate OT. What is feedback in clinical education? Med Educ. 2008; 42:189–97.
  • 40
    Kegan R, Lahey LL. How the Way We Talk Can Change the Way We Work. San Francisco, CA: Jossey-Bass, 2001.
  • 41
    Dale E. Audiovisual Methods in Teaching. 3rd ed. Orlando, FL: Holt, Rinehart and Winston, 1969.
  • 42
    Merriam SB. Andragogy and self-directed learning: pillars of adult learning theory. New Direc Adult Contin Educ. 2001; 89:3–14.
  • 43
    Rudolph JW, Simon R, Raemer DB, Cooper JB. Debriefing teamwork skills in high-fidelity simulation. In: DaRosa DA (ed.). A Guidebook for Program Directors: The ACS/APDS Surgical Skills Curriculum: Phase III. Chicago, IL: American College of Surgeons, 2008.