CanMEDS evaluation in Canadian postgraduate training programmes: tools used and programme director satisfaction
Article first published online: 12 AUG 2008
© Blackwell Publishing Ltd 2008
Volume 42, Issue 9, pages 879–886, September 2008
How to Cite
Chou, S., Cole, G., McLaughlin, K. and Lockyer, J. (2008), CanMEDS evaluation in Canadian postgraduate training programmes: tools used and programme director satisfaction. Medical Education, 42: 879–886. doi: 10.1111/j.1365-2923.2008.03111.x
- Issue published online: 12 AUG 2008
- Received 29 October 2007; editorial comments to authors 9 January 2008; accepted for publication 11 February 2008
Keywords: education, medical, graduate; program evaluation; clinical competence/standards; personal satisfaction
Context The Royal College of Physicians and Surgeons of Canada (RCPSC) CanMEDS framework is being incorporated into specialty education worldwide. However, the literature on how to evaluate trainees in the CanMEDS competencies remains sparse.
Objectives The goals of this study were to examine the assessment tools used and programme directors’ perceptions of how well they evaluate performance of the CanMEDS roles in Canadian postgraduate training programmes.
Methods We conducted a web-based survey of programme directors of RCPSC-accredited training programmes. The survey consisted of two questions. Question 1 was designed to establish which assessment tools were used to assess each of the CanMEDS roles. Question 2 was intended to assess programme directors’ perceived satisfaction with CanMEDS evaluation in their programmes.
Results A total of 149 of the 280 eligible programme directors participated in the survey. Programme directors used a variety of assessment tools to evaluate trainees in the CanMEDS competencies. Programmes used the most tools to evaluate the Medical Expert (mean = 4.03, standard deviation [SD] = 1.59) and Communicator (mean = 2.36, SD = 1.02) roles, and the fewest tools for the Collaborator (mean = 1.75, SD = 1.10) and Manager (mean = 1.75, SD = 1.18) roles. More than 92% of the programmes used in-training evaluation reports to evaluate all the CanMEDS roles. Programme directors were satisfied with their evaluation of the Medical Expert role, but less so with assessment of the other CanMEDS competencies.
Conclusions This study demonstrates that Canadian postgraduate training programmes use a variety of assessment tools to evaluate the CanMEDS competencies. Programme directors are neutral or concerned about how the CanMEDS roles other than that of Medical Expert are evaluated in their programmes. Further efforts are required to establish best practice in CanMEDS evaluation.