Assessment of patient management skills and clinical skills of practising doctors using computer-based case simulations and standardised patients
Article first published online: 24 AUG 2004
Volume 38, Issue 9, pages 958–968, September 2004
How to Cite
Hawkins, R., MacKrell Gaglione, M., LaDuca, T., Leung, C., Sample, L., Gliva-McConvey, G., Liston, W., De Champlain, A. and Ciccone, A. (2004), Assessment of patient management skills and clinical skills of practising doctors using computer-based case simulations and standardised patients. Medical Education, 38: 958–968. doi: 10.1111/j.1365-2929.2004.01907.x
- Issue published online: 24 AUG 2004
- Received 7 August 2002; editorial comments to authors 23 September 2002, 25 February 2003, 14 August 2003; accepted for publication 19 September 2003
Keywords:
- physician–patient relations
- clinical competence/standards
- educational measurement
Context Standardised assessments of practising doctors are receiving growing support, but theoretical and logistical issues pose serious obstacles.
Objectives To obtain reference performance levels from experienced doctors on computer-based case simulation (CCS) and standardised patient-based (SP) methods, and to evaluate the utility of these methods in diagnostic assessment.
Setting and Participants The study was carried out at a military tertiary care facility and involved 54 residents and credentialed staff from the emergency medicine, general surgery and internal medicine departments.
Main outcome measures Doctors completed 8 CCS and 8 SP cases targeted at doctors entering the profession. Standardised patient performances were compared to archived Year 4 medical student data.
Results Staff doctors and residents performed well on both CCS and SP cases, but a wide range of scores was exhibited on all cases. There were no significant differences between the scores of participants from differing specialties or of varying experience. Among participants who completed both CCS and SP testing (n = 44), a moderate positive correlation between CCS and SP checklist scores was observed, along with a negative correlation between doctor experience and SP checklist scores. Whereas the time students spent with SPs varied little by clinical task, doctors appeared to spend more time on communication/counselling cases than on cases involving acute or chronic medical problems.
Conclusion Computer-based case simulations and standardised patient-based assessments may be useful as part of a multimodal programme to evaluate practising doctors. Additional study of SP standard-setting and scoring methods is needed. Establishing empirical likelihoods for the range of performances seen on assessments of this kind should be a priority.