Feasibility of an evidence-based medicine educational prescription


  • David A Feldstein

  • Scott Mead

  • Linda B Manwell
David A Feldstein, University of Wisconsin School of Medicine and Public Health, 310 N. Midvale Boulevard, Suite 205, Madison, Wisconsin 53705, USA. Tel: 00 1 608 265 8116; Fax: 00 1 608 263 4471; E-mail: Df2@medicine.wisc.edu

Context and setting Medical residents must learn techniques to manage and appraise the ever-expanding medical literature in order to incorporate relevant evidence into patient care. Evidence-based medicine (EBM) is an accepted method of teaching and practising the incorporation of medical literature into clinical care. The Accreditation Council for Graduate Medical Education (ACGME) has included EBM as a core component of the practice-based learning and improvement competency.

Why the idea was necessary Quality tools with which to evaluate EBM competency are lacking. Many existing tools assess only individual components of the EBM process, and few assess performance during actual patient care. We evaluated the feasibility and reliability of an EBM educational prescription (EP) designed to enhance skills and evaluate competence.

What was done Based on a literature review and input from experts, we developed an EP to guide residents step-by-step through the EBM process. We also developed an EP rating form to be used by teaching staff to evaluate resident performance. Eight internal medicine teaching staff were given a 2-hour, hands-on EBM training session and 2 hours of simulated teaching and rating activities using the EP. Over the next 4 months, EPs were assigned to residents on 2–4-week in-patient rotations. Using the EP, residents answered a clinical question and presented their results to the ward team. Teaching staff used the EP rating form to evaluate resident competence in clinical question formation, searching for evidence, evaluating the evidence and applying the evidence to the patient. Residents were also rated on overall competence and ability to teach the team. Ratings in each area, based on a scoring rubric, were 'not yet competent', 'competent' or 'superior'. End-of-rotation questionnaires assessed teaching staff and resident EP use and perceptions. Qualitative evaluation was obtained via semi-structured interviews with faculty members and resident focus groups. Two authors independently graded 20 EPs to evaluate scoring reliability (Cohen's kappa).

Evaluation of results and impact Teaching staff used the EP with 20 residents on general internal medicine ward, subspecialty consult or subspecialty ward rotations. Residents reported completing an average of 1.4 EPs per rotation (standard deviation [SD] 1.2). Teaching staff felt adequately prepared to use the EP (mean 4.1, SD 1.1, on a 5-point Likert scale) and planned to continue using it for teaching (mean 4.0, SD 0.5). However, they reported inadequate time to have learners perform EPs (mean 2.1, SD 1.3). Inter-rater reliability on the 20 EPs showed substantial agreement for searching (κ = 0.70) and application of evidence (κ = 0.72), moderate agreement for overall competence (κ = 0.57) and evaluation of evidence (κ = 0.44), and fair agreement for question formation (κ = 0.22). Residents reported that using the EP improved their EBM skills and allowed them to apply previous EBM teaching. Both teaching staff and residents identified lack of time as a barrier to performing EPs in the in-patient setting and recommended online versions of the forms. The EP is feasible to use during in-patient rotations with modest faculty training. Inter-rater reliability was generally good, but requires further assessment. The EP can be a useful method for determining residents' EBM competence.