Examiner fatigue in communication skills objective structured clinical examinations
Version of Record online: 20 DEC 2001
Volume 35, Issue 5, pages 444–449, May 2001
Humphris, G. M. and Kaney, S. (2001), Examiner fatigue in communication skills objective structured clinical examinations. Medical Education, 35: 444–449. doi: 10.1046/j.1365-2923.2001.00893.x
Keywords: clinical competence; cohort studies; education, medical, undergraduate (standards); educational measurement; faculty (standards)
Background: The assessment of undergraduates' communication skills by means of objective structured clinical examinations (OSCEs) is a demanding task for examiners. Tiredness over the course of an examining session may introduce systematic error. Unsystematic error that changes over the duration of the OSCE session may also be present.
Objectives: To determine the strength of some sources of systematic and unsystematic error in the assessment of communication skills over the duration of an examination schedule.
Methods: Undergraduate first-year medical students completing their initial summative assessment of communication skills (a four-station OSCE) comprised the study population. Students from three cohorts were included (1996–98 intake). In all 3 years the OSCE was carried out identically. All stations lasted 5 minutes with a simulated patient. Students were assessed using an examiner (content expert) and a simulated-patient evaluation tool: the Liverpool Communication Skills Assessment Scale (LCSAS) and the Global Simulated-patient Rating Scale (GSPRS), respectively. Each student was assigned a time slot ranging from 1 to 24, where 1 denotes that the student entered the examination first and 24 denotes the final slot. The number of students who failed the examination was noted for each of the 24 time slots. A control set of marks from a communication skills written examination was also used to explore a possible link with time slot. Analysis was conducted using graphical display, covariate analysis and logistic regression.
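The logistic-regression step described above can be sketched as follows. The data here are simulated (slot numbers and failure indicators are invented for illustration, not the study's data), and the fitting routine is a minimal gradient-descent implementation, not the authors' actual analysis:

```python
import math
import random

random.seed(1)

# Illustrative data: 24 time slots, several students per slot,
# with a constant failure probability (i.e. no fatigue effect built in).
slots, fails = [], []
for slot in range(1, 25):
    for _ in range(9):
        slots.append(slot)
        fails.append(1 if random.random() < 0.15 else 0)

# Fit P(fail) = 1 / (1 + exp(-(b0 + b1 * slot))) by gradient ascent
# on the log-likelihood (no external statistics library assumed).
b0, b1, lr = 0.0, 0.0, 0.01
for _ in range(10000):
    g0 = g1 = 0.0
    for x, y in zip(slots, fails):
        p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
        g0 += y - p
        g1 += (y - p) * x
    b0 += lr * g0 / len(slots)
    b1 += lr * g1 / len(slots)

# Under the study's null finding, the slope on time slot stays near zero.
print(b1)
```

In the study's terms, a slope (b1) indistinguishable from zero corresponds to finding no relationship between a student's time slot and the odds of failing.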
Results: No significant relationship was found between the point in the schedule at which a student entered the OSCE and his or her performance. The reliability of the content-expert and simulated-patient assessments was stable throughout the session.
Conclusions: No evidence was found that the duration of examining in a communication skills OSCE influenced examiners or the marks they awarded. Routine checks of this nature are recommended to confirm the absence of bias.