Context and setting

Learning physical examination is an essential facet of undergraduate medical education, yet how doctors examine a patient is often influenced more by tradition than by evidence, depending in large part on the medical education and clinical experience of the teacher. We advocate that undergraduate medical students should learn physical examination in a way that reflects the evidence base. Encouraging and teaching students to scrutinise the best available evidence is an important element of medical education.

Why the idea was necessary

In addition to the best evidence, good doctors also use individual clinical expertise. We already know that many hours of practice are necessary to become an expert. The student who is well prepared for physical examination assessment will have seen, experienced and learned more medicine through practice on a high volume of cases. Furthermore, it has been shown that deliberate practice of a task is a far better method of acquiring expertise than simple unstructured practice.

What was done

With this in mind we conceived the Physical Examination Audit Tool (PEAT): a checklist of the stages of an ‘ideal’ physical examination, written by each student using the best available evidence and then used to audit his or her own performance. This has the manifold benefits of empowering students to learn, promoting a deeper (evidence-based) understanding of the physical examination and facilitating structured practice.

Evaluation of results and impact

We conducted a qualitative study of final-year medical students at Bristol University who adopted this method in their preparation for final examinations. Data were collected using interviews and questionnaires, and analysed using grounded theory. Full ethical approval was granted and each participant gave fully informed consent.

The PEAT was introduced to a population of 16 students during a tutorial in week 1 of an 8-week attachment and an evidence-based clinical skills textbook was recommended as a starting point for reviewing evidence. Each student was allocated a junior doctor mentor to help with problems encountered while writing his or her own PEAT. Each student was encouraged to practise physical examination on as many patients as possible, each time using the PEAT to make note of procedures forgotten or performed incorrectly.

Data analysis identified two core shared themes. First, 75% of PEAT users agreed that the PEAT encouraged them to bring evidence-based enquiry into their examination technique. Second, all candidates who used the PEAT made comments relating to a theme of ‘improved technique’: 50% of candidates commented that it helped to identify areas frequently missed in the examination, and one candidate commented that it helped him become ‘more thorough, efficient and structured’. Notably, no two respondents highlighted the same areas in which their practice changed, which suggests that this approach is well tailored to individual students’ needs.

In conclusion, this method empowers the student to scrutinise accepted checklists, promotes self-directed, structured practice and highlights areas of weakness through the self-audit of performance.