Presented at the Society for Academic Emergency Medicine annual meeting, New Orleans, LA, May 2009.
Assessing Clinical Reasoning Skills in Scenarios of Uncertainty: Convergent Validity for a Script Concordance Test in an Emergency Medicine Clerkship and Residency
Article first published online: 15 JUN 2011
© 2011 by the Society for Academic Emergency Medicine
Academic Emergency Medicine
Volume 18, Issue 6, pages 627–634, June 2011
Humbert, A. J., Besinger, B. and Miech, E. J. (2011), Assessing Clinical Reasoning Skills in Scenarios of Uncertainty: Convergent Validity for a Script Concordance Test in an Emergency Medicine Clerkship and Residency. Academic Emergency Medicine, 18: 627–634. doi: 10.1111/j.1553-2712.2011.01084.x
The authors have no relevant financial information or potential conflicts of interest to disclose.
The full SCT-EM is available upon written request from the first author and may be used without charge for educational (i.e., noncommercial) purposes.
Supervising Editor: Gary M. Gaddis, MD.
- Issue published online: 15 JUN 2011
- Received September 7, 2010; revisions received November 28 and December 21, 2010; accepted December 25, 2010.
Objectives: The Script Concordance Test (SCT) is a new method of assessing clinical reasoning in the face of uncertainty. An SCT item consists of a short clinical vignette followed by an additional piece of information and asks how this new information affects the learner’s decision regarding a possible diagnosis, investigational study, or therapy. Scoring is based on the item responses of a panel of experts in the field. This study attempts to provide additional validity evidence in the realm of emergency medicine (EM).
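The expert-panel scoring the Objectives describe is consistent with the standard SCT aggregate scoring method, in which each response option earns partial credit proportional to how many panelists chose it, normalized by the modal response. A minimal sketch under that assumption (function names and the example panel are illustrative, not taken from the SCT-EM itself):

```python
from collections import Counter

def sct_item_credits(panel_responses):
    """Compute partial credit for each response option on one SCT item.

    Aggregate scoring: an option's credit is the number of panelists
    who chose it divided by the count of the modal (most frequent)
    response, so the modal answer earns 1.0 and minority answers
    earn fractional credit rather than zero.
    """
    counts = Counter(panel_responses)
    modal = max(counts.values())
    return {option: n / modal for option, n in counts.items()}

def score_examinee(item_credits, answer):
    """Examinee's score on one item: the credit of the chosen option
    (0.0 if no panelist selected that option)."""
    return item_credits.get(answer, 0.0)

# Hypothetical 10-member panel rating one item on a -2..+2 Likert scale
panel = [+1, +1, +1, +1, +1, 0, 0, +2, +2, -1]
credits = sct_item_credits(panel)
# credits -> {+1: 1.0, 0: 0.4, +2: 0.4, -1: 0.2}
```

Summing these per-item credits (often rescaled so each item's maximum is 1) yields the total test score, which rewards agreement with expert judgment even when experts themselves disagree.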
Methods: This observational study examined the performance of medical students, EM residents, and expert emergency physicians (EPs) on an SCT in the area of general EM (SCT-EM) at one of the largest medical schools in the United States. The 59-item SCT-EM was developed for a fourth-year required clerkship in EM. SCT-EM results were compared across different levels of clinical experience and against performance on other measures to evaluate convergent validity.
Results: The SCT-EM was given to 314 fourth-year medical students (MS4), 40 EM residents, and 13 EPs during the study period. The mean differences among the three groups of test takers were statistically significant (p < 0.0001). The scores for the MS4s ranged from 42% to 77% and followed a normal distribution. Among the residents, performance on the SCT-EM and the EM in-training examination were significantly correlated (r = 0.69, p < 0.001); among the MS4s who later matched into EM residency programs, performance on the SCT-EM and the United States Medical Licensing Examination (USMLE) Step 2-Clinical Knowledge (Step 2-CK) exam was also significantly correlated (r = 0.56, p < 0.001).
Conclusions: The SCT-EM shows promise as an assessment that can be used to measure clinical reasoning skills in the face of uncertainty. Future research will compare performance on the SCT to other measures of clinical reasoning abilities.