Professor of Obstetrics and Gynaecology and Clinical Epidemiology; Director of the WHO Collaborating Centre for Research Synthesis in Reproductive Health; Director of Research and Development and Honorary Consultant Obstetrician-Gynaecologist
This is the second in a series of four articles aimed at providing a theoretical accompaniment and practical guide to the Advanced Training Skills Module (ATSM) in Medical Education. The series addresses how best to deliver the module practically and to document progress in the required competences in a portfolio. Each part outlines educational principles and gives examples of how the trainee can demonstrate understanding and application of these principles in the workplace. A section on teaching opportunities focuses on identifying clinical situations in everyday practice where learning can occur. Finally, there are tips for both trainers and trainees. This series is essential reading for all trainees and trainers engaged in the ATSM and certificate, diploma and masters courses in medical education.
This article describes practical ways, supported by educational theory, of designing, understanding and using assessment, feedback and evaluation. Assessment is about measuring the changes in trainees that result from learning. This measurement can be aimed at decision making regarding progression (pass or fail) or at identifying areas for development. Evaluation examines whether the educational programme is meeting the learners' needs; looks at the teachers; ensures that assessments are fit for purpose; and influences policy and management in institutions with an educational role. Advanced Training Skills Module trainers should endeavour to equip their trainees with the knowledge and skills necessary to perform assessment, feedback, appraisal and evaluation and to fulfil the educational role expected of them on appointment to a consultant post.
Assessments need to be valid
Validity is the most important criterion of a quality assessment;1 that is, the extent to which the assessment measures what it is intended to measure. Many requirements have to be met to achieve full validity (Box 1) and, as more of them are met, confidence in the assessment increases proportionately. Reliability of assessment implies consistency of measurement performance.
(Box 1)
[Some working definitions]
Assessment directs learning
Those elements in a curriculum that are assessed are perceived by students as important. To pass an assessment, students will strategically prioritise learning these parts over those that are not assessed2 and this will affect their future performance.1 The type of assessment method used will also direct the depth of learning. Multiple-choice questions encourage superficial learning, whereas extended matching or structured short-answer questions encourage a deeper level of learning. The deeper level of learning that guides practice is often assessed using objective structured clinical examinations and ‘on-the-job’ work-based assessments. Competences to be measured can be depicted as a learning pyramid, where the base relates to knowledge, moving through to evaluated performance at the peak (Figure 1).3
A curriculum should involve both formative and summative assessments
There are two types of assessment: summative and formative. The former occurs when a pass/fail decision is made, depending on results, whereas the latter directs future learning. Box 2 gives examples of summative and formative assessments in the obstetrics and gynaecology curriculum. Formative assessments should be performed at the beginning of any teaching and training programme, to set out aims and objectives, and then at regular intervals to review and direct learning. The importance of both types of assessment must be appreciated by the trainee.
(Box 2)
[Examples of assessment and appraisal used in postgraduate training]
Assessment is different from appraisal
The terms ‘assessment’ and ‘appraisal’ are often erroneously used interchangeably. Formative assessment ‘informs’ learners of their progress before a later planned summative assessment using defined learning outcomes. Appraisal considers personal development as well as educational development and is not measured against any set criteria, nor does it contribute to a formal summative assessment. Appraisal is jointly developed by the trainee and trainer and should be confidential and non-threatening. Examples of personal development issues that may be discussed at appraisal are career interests and coping with workload, study and family life (Box 2).
Feedback is important for learning
Feedback transforms assessment into a tool for teaching and learning. Effective feedback is achieved when it leads students to try to narrow the gap between their current level and their goals.4,5 For this to happen the goals need to be explicit and learners need to be able and empowered to reach them. Feedback can encourage self-reflection, raise self-awareness and help students plan for future learning. The Confederation of Postgraduate Medical Education Councils (Australia) recommended supplying effective feedback to students and junior doctors as a strategy for preventing distress and enhancing clinical performance.6–8
For maximal learning potential, feedback should be structured
Ensuring good feedback requires adequate time, clear goals and outcomes, direct observation of learners and the skills of offering both positive and negative feedback. The two most widely accepted models of delivering feedback are Pendleton's rules9 and Silverman's agenda-led, outcome-based analysis.10 Both models provide a safe environment, thus reducing defensiveness and increasing constructiveness.
Pendleton's rules clarify factual issues, highlighting positives first by asking the learner to comment and then reinforcing these through facilitator or group discussion. Suggestions as to what could have been done differently are then offered, first by the learner and then by the group or individual contributing the feedback, emphasising self-reflection on positive factors as well as areas for improvement. A common criticism of this method is that it creates an artificial and rigid environment by highlighting the positives first, thus reducing the opportunity for discussion of issues arising from the learner's own agenda.
In the agenda-led, outcome-based analysis method, the principle is to identify what the learner wants help with first, then to look at the outcomes which the learner is trying to achieve. The learner is then encouraged to make a self-assessment with suggestions of how to improve, before the group providing feedback makes suggestions. The discussion is directed towards strategies to achieve these goals, thus empowering the learner. Either way, it is important to stick to some basic rules if feedback is to be constructive (Box 3).
Evaluation is part of a cycle for improvement of training
Evaluation is a judgment about a training programme, training or trainer. Like clinical audit, evaluation is not an isolated activity. For it to be effective it needs to be part of a cycle of improvement. Entry can be at any part of the cycle, but the cycle must be completed. In medicine, the ultimate outcome of education must be improved patient care. Evaluation at this level would provide the strongest evidence and is used by Best Evidence in Medical Education (see Websites) as part of its systematic review of educational strategies. There are several outcome levels before this; they can be depicted as Kirkpatrick's hierarchy (Box 4).11 There are many methods of evaluation, including questionnaires such as the postgraduate hospital educational environment measure (PHEEM),12 interviews, focus groups and site visits. We will elaborate on the design of evaluation methods in the fourth article (on medical education research) in this series. The results of evaluation need to be fed back into the system to complete the cycle of improvement. Recommendations for change can then be identified and implemented, with plans for further evaluation.
(Box 4)
[Kirkpatrick's levels of evaluation, with examples for medical education and clinical practice]
Teaching and learning opportunities: assessing the assessor
As assessment drives learning, it is important for a trainer to assess the ATSM trainee who is learning to assess others in the clinical context. The ATSM trainer's usual clinical time can be structured to include ‘assessable moments’ when the trainee can be observed carrying out a work-based assessment or giving feedback to students or other trainees. Assessable moments should not take more than 15 minutes and with foresight can easily be incorporated into any clinical session. Feedback to the trainee needs to be timely and supportive if it is to enhance future performance. A good way of structuring observation and feedback of the trainee undertaking an assessment is through a peer observation form. This type of form is used in many teaching institutions. We have devised an example of such a form, called the objective structured assessment of teaching (OSATe), to be used for formative assessment of teaching competences within medical education. This can be applied to all types of teaching or assessment undertaken by trainees. There is a section for the trainee to fill in before and after the session that is to be observed, including some self-evaluation and reflection. Figure 2 shows an example of a form that is used for feedback after a trainee has performed a work-based assessment. Other work-based tools, such as the case-based discussion, can be used to discuss teaching when the teaching event becomes the ‘case’, such as the example given in the learning cycle in Article 1 of this series.13
As well as evaluating their own teaching, ATSM trainees should be involved in evaluating other teaching programmes. This activity is akin to doing an audit for other ATSMs. They could, for example, evaluate their regional teaching programme, a local teaching programme such as a foundation year 2 programme or a course they have recently attended. A project for evaluation should be agreed between the trainer and trainee in the induction interview at the beginning of the ATSM in Medical Education. Dissemination of the results should be incorporated into the audit programme for the department, as well as being fed back to the course directors for the course that was evaluated.
If possible, the trainee should get involved in writing assessment material such as exam questions, objective structured clinical examination (OSCE) stations or suitable assessments for the teaching programme they design. In this way, they could show their understanding of the principles, taking into account the advantages and disadvantages of various assessment methods.
Through mentoring a student or junior trainee, the ATSM trainee can be observed practising appraisal skills. To build this relationship, the trainee could be assigned as a mentor to a junior trainee at the beginning of the ATSM; for example, a new trainee or someone who is due to take the Membership examination. The mentorship could be overseen by the ATSM trainer, who could observe some of the sessions to ensure competence. Mentoring is encouraged by the Royal College of Obstetricians and Gynaecologists and advice can be accessed on the College website (see Websites).
Documenting progress in a portfolio: assessing teaching
A good way of documenting progress of competences in the medical education ATSM is by making a teaching portfolio. This can be different things to different people; what is important is that it says something about the trainee. For the portfolio to be used as a learning tool for the trainee, however, it should incorporate reflective practice. It can also function as an organised way of keeping evidence of teaching that the trainee has done, including material produced or articles read to support teaching.
Figure 3 shows a sample outline for an ATSM in Medical Education portfolio. It includes a section for the ATSM logbook and registration documents, a teaching curriculum vitae and then 12 sections based on the knowledge criteria set out by the ATSM curriculum. It is designed to be used in conjunction with the curriculum as set out in the log book. Criteria include the ability to understand principles and different methods of assessment and their advantages and disadvantages. Associated skills are to ‘mark and compile appropriate assessments of knowledge…’ Trainees could add assessment material they have written or been involved in marking as evidence of attaining the skills measured by these criteria. The associated attitude requirements involve being ‘able to assess learners…’: trainees could include work-based assessments they have performed for other trainees and the OSATe forms when their trainer has observed them doing so.
It is important to ensure that assessment methods are valid and that assessors and trainees understand the type of assessment they are involved in, as well as the reason for carrying it out (Box 5). The educational impact can be exploited by planning assessment methods to direct learning. Through appraisal, a mentor can help in the personal development of a trainee and contribute to the creation of a good educational environment by ensuring that the foundations are in place for the trainee to feel safe, to have a sense of belonging and self-esteem and to fulfil his or her maximum potential. Evaluation of any training programme is necessary for improvement and, like clinical audit, should be considered essential at both organisational and individual levels. Evaluation is different from feedback, and in any training episode the fundamental importance of feedback should be demonstrated through dedicated time and structure. By using a peer observation form that includes self-evaluation and reflection, together with a teaching portfolio, the trainee can demonstrate the application of the principles of assessment, evaluation and feedback and the trainer can assess the trainee's competences.
(Box 5)
[Tips for trainers and trainees]