Keywords:

  • medical education;
  • bedside cardiovascular teaching

Abstract

OBJECTIVE: To determine if structured teaching of bedside cardiac examination skills improves medical residents' examination technique and their identification of key clinical findings.

DESIGN: Firm-based single-blinded controlled trial.

SETTING: Inpatient service at a university-affiliated public teaching hospital.

PARTICIPANTS: Eighty Internal Medicine residents.

METHODS: The study assessed 2 intervention groups that received 3 2-hour bedside teaching sessions during their 4-week rotation using either: (1) a traditional teaching method, “demonstration and practice” (DP) (n=26) or (2) an innovative method, “collaborative discovery” (CD) (n=24). The control group received their usual ward teaching sessions (n=25). The main outcome measures were scores on examination technique and correct identification of key clinical findings on an objective structured clinical examination (OSCE).

RESULTS: All 3 groups had similar scores for both examination technique and identification of key findings on the preintervention OSCE. After teaching, both intervention groups significantly improved their technical examination skills compared with the control group. The increase was 10% (95% confidence interval [CI] 4% to 17%) for CD versus control and 12% (95% CI 6% to 19%) for DP versus control (both P<.005), equivalent to an additional 3 to 4 examination skills correctly performed. Improvement in the identification of key findings was limited to a 5% (95% CI 2% to 9%) increase with the CD teaching method (CD versus control, P=.046), equivalent to the identification of an additional 2 key clinical findings.

CONCLUSIONS: Both programs of bedside teaching increased residents' technical examination skills, but improvements in the identification of key clinical findings were modest and demonstrated only with the new method of teaching.

The decline of physical examination skills among medical residents has been well documented both in the United States and abroad.1 This decline has led to poorly performed examinations by residents2–8 and frequent diagnostic errors by trainees9–11 and attending physicians.12,13 In recognition of these problems, the U.S. National Board of Medical Examiners has recently reintroduced demonstration of competence in the physical examination as a prerequisite for graduation. Acquiring basic skills appears to be an essential first step, but evidence that these skills should then be nurtured and developed comes from an intriguing study showing that, even in this era of advanced diagnostic technology, accomplished bedside skills can have a dramatic effect on the care of patients.14

We conducted a study to assess the effectiveness of structured teaching in improving bedside cardiac examination skills. We postulated that prior interventions had produced disappointing results because trainees had poor examination techniques and/or inadequate skills in recognizing key clinical findings. Deficiencies in these core skills would lead to erroneous interpretations of information and, in turn, to incorrect cardiac diagnoses. Our study was novel in that it evaluated 2 distinct educational interventions, each focused on teaching the correct technical skills for performing the cardiac exam and the correct identification of key cardiac findings. One approach relied on traditional bedside teaching methods15; the other, in which the roles of instructors and learners were dramatically different, used principles of collaborative learning.16 Unlike other investigators, we chose to develop skills in the cardiovascular system as a whole (not just auscultation), and we used real patients and bedside teaching sessions to reflect the reality of actual patient encounters.

METHODS

Setting

The study was implemented in the department of medicine at a university-affiliated public teaching hospital. Each resident is assigned to 1 of 3 firms at the beginning of their training. Every rotation there are 12 medicine teams to staff (4 per firm). Each inpatient team includes 1 attending physician and 4 housestaff (2 residents and 2 interns). Thus, 48 residents staff the wards each rotation. Teams rotate in 4-week blocks.

Subjects

All residents assigned to the Internal Medicine inpatient service during 2 consecutive rotations were eligible for participation. Residents “on service” for both rotations (n=16) were enrolled only once, so a total of 80 residents participated in the study. The Institutional Review Board of the hospital reviewed and approved the study protocol.

Design

We designed the study as a controlled trial with 3 groups. Each resident was assigned to a study group based on firm membership. Two study groups received different educational interventions. The control group received their usual attending teaching sessions (Fig. 1).

Figure 1.  Study design.

Interventions

Both intervention groups received 3 2-hour bedside teaching sessions focused on cardiac examination skills. The learning objectives were identical for both groups, but the method of teaching was purposely unique to each. The 4 teachers in each intervention group planned teaching sessions with a variety of inpatients to develop examination skills and identification of key findings. The 3 sessions covered: (1) examination of the venous pulse, arterial pulse, and precordial movements; (2) heart sounds; and (3) heart murmurs. Subsequent sessions recapitulated some material from earlier sessions in a cumulative manner. Each team of 4 residents had all of its sessions with the same teacher, and the sessions replaced the normal teaching time with their ward attending. No written materials were provided, to prevent contamination of the control group.

For each learning objective we established standardized examination techniques and explicit perceptual criteria17 to help identify key findings. See Table 1 for examples of the examination techniques and explicit perceptual criteria for 2 learning objectives for the second heart sound.

Table 1. Examples of Learning Objectives, Standardized Examination Techniques, and Explicit Perceptual Criteria for Key Findings Related to the Second Heart Sound

Learning objective: Characterize the S2 split
  Standardized exam technique: Listen with the stethoscope at the second left intercostal space (LICS) and monitor respiratory movement
  Explicit perceptual criteria:
    • Split heard at end inspiration only (normal)
    • No split heard (unsplit)
    • Split ↑ on inspiration (wide split)
    • Split ↑ on expiration (reverse split)
    • Split heard in both phases (fixed split)

Learning objective: Identify a loud P2
  Standardized exam technique: Palpate the left second interspace; listen with the stethoscope at the second LICS and apex
  Explicit perceptual criteria:
    • P2 palpable at the second LICS
    • P2 louder than A2 at the second LICS
    • P2 audible at the apex

Teaching Methods

Demonstration and Practice (DP). The teaching method of DP incorporated a traditional bedside approach. One or more of the residents performed a specific part of the examination and reported findings to the group. The teacher then demonstrated standardized examination techniques and explained explicit perceptual criteria for identifying the key findings. Using this method the teacher confirmed or corrected the findings reported by the resident. All residents then practiced these techniques until they met the objectives for the session.

Collaborative Discovery (CD). The teaching method of CD applied principles and practices of collaborative learning16 to the cardiac examination. In contrast to the DP group, all residents were asked to perform a specific part of the examination, with each learner making observations before reporting any findings. The teacher then solicited reports from each resident in turn, registering all of them neutrally and highlighting areas of agreement and disagreement. Next, the teacher sought descriptions of the techniques and criteria each learner had used in observing and describing the findings. The teacher then suggested standardized examination techniques and explicit perceptual criteria as ways of reconciling disparate observations and confirming others. All residents then reexamined the patient using these suggested methods. The process continued, with the teacher facilitating movement toward consensus, until the group met the objectives for the session. See Table 2 for a comparison of the DP and CD teaching methods.

Table 2. Comparison of Demonstration and Practice (DP) and Collaborative Discovery (CD) Teaching Methods
Introduction (both methods): Goals, objectives, and ground rules for the teaching method explained to learners

Concepts (organization)
  DP: Provide concepts in advance (e.g., a diagram of the cardiac cycle)
  CD: Concepts deferred until after experience; all findings treated as “new findings”

Instructor role
  DP: Authority; instructor findings are “correct”
  CD: Facilitator; permits group to reach consensus

Instructor's method of assessing learners' initial exam technique/findings
  DP: Observes exam and solicits reports from 1 (or more) learner; confirms or corrects technique/findings
  CD: Observes exam and solicits reports from all learners, registering them neutrally; highlights areas of agreement/disagreement

Instructor's method of introducing standardized exam and explicit perceptual criteria
  DP: Demonstrates “correct” technique and states explicit perceptual criteria
  CD: Suggests technique and explicit perceptual criteria to reduce variation and refine findings

Learner role
  DP: Individual; learner works directly with instructor; presence of others immaterial
  CD: Collaborative; presence of other learners is essential

Reward for correct technique and findings
  DP: Immediate, specific positive feedback
  CD: No individual reward; group resolution and “discovery”

Closing of teaching sessions
  DP: Summarize topics “covered”; learners encouraged to apply knowledge independent of instructor
  CD: Summarize topics “discovered”; learners encouraged to apply knowledge independent of group

Diagnosis (both methods): Given at the end of the session

Control Group. During the study period the control group inpatient teams participated in their regularly scheduled attending teaching rounds. There was no special emphasis on bedside teaching or instruction in the cardiac examination, but we did not measure the content or location of these teaching sessions.

Study Teachers and Teacher Training

Eight General Internists (4 for each intervention group) with 1 to 15 years of attending experience served as the study teachers. All were recognized as good teachers by housestaff and peers, and assignment to intervention group was balanced for seniority, co-investigator status, and firm membership. A comprehensive teachers' guide developed by 2 of the authors described the assigned teaching method (either DP or CD) and the curriculum. Each set of teachers then received 8 hours of standardized bedside training in their assigned teaching method, with direct observation and peer feedback from the other teachers in their group. All teachers underwent assessment of competency in their assigned teaching method by an independent external observer. Teachers did not participate in, or observe, training in the alternative teaching method and were blinded to the objective structured clinical examination (OSCE) assessment methods.

Assessments

The cardiovascular examination skills of the residents were assessed before and after the teaching intervention using 2 OSCE stations. Eleven outpatients served as volunteers for these stations. They had a wide range of cardiac disorders, including isolated or mixed valvular heart disease, hypertrophic cardiomyopathy, dilated cardiomyopathy, hypertensive heart disease, pulmonary hypertension, and a high-output state. Different patients were used for the pre- and posttests and none were known by the residents.

Outcome Measures

OSCE Scores. Each resident completed 2 OSCE stations (2 patients) for their pre- and postintervention evaluations. At each station the residents were assessed on 2 components of examination skills. First, their examination technique was measured by observing how they performed prespecified examination procedures (e.g., did they palpate the point of maximum impulse [PMI]?) during the 7 minutes allotted to evaluate each patient. Second, residents were given 6 minutes to write their key findings for each OSCE station, and these were compared with “gold standard” key findings. The gold standard for key findings was established by experienced Internists and cardiologists examining each patient independently on the day of the OSCE. Three independent raters (E.M., J.R., L.S.) scored the OSCE stations after they had received standardized training and achieved excellent agreement (exam technique κ=0.87, key findings κ>0.8). All raters were blinded to resident names and study group assignment.
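The interrater agreement figures above are Cohen's κ statistics. As a minimal illustration of how such agreement is computed (the ratings below are hypothetical, not the study's data):

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters
    scoring the same items into categories."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    categories = sorted(set(rater_a) | set(rater_b))
    # Observed agreement: proportion of items both raters scored identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement, from each rater's marginal frequencies.
    p_e = sum(
        (rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories
    )
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two raters scoring 10 exam-technique items
# as performed (1) or not performed (0).
a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
b = [1, 1, 0, 1, 1, 1, 1, 0, 1, 1]
print(round(cohens_kappa(a, b), 2))  # → 0.74
```

Note that κ discounts the agreement expected by chance: here the raters agree on 9 of 10 items, yet κ is 0.74 rather than 0.9.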

Other Measurements. We measured several other factors we believed could impact the results of the OSCE scores. All residents had audiometry tests to screen for hearing defects, and completed a brief survey to assess demographic characteristics and previous cardiovascular or musical training. We also evaluated stethoscope quality based on reported acoustic performance. The residency program provided in-training exam (ITE) scores as an additional comparative measure.

Statistical Analysis

Practical constraints limited our sample size to the number of residents on the wards during the study period. Based on this fixed sample size the study had 80% power to detect a 12% difference between groups. This difference is equivalent to about 4 additional examination techniques being performed (2 per patient), or the correct identification of 4 additional key findings (2 per patient)—a clinically important difference in our judgment.
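The power statement above can be reproduced approximately with the standard normal-approximation formula for a two-sided, two-sample comparison. The sketch below assumes roughly 25 residents per group and an SD of 15 percentage points; the SD is illustrative, since the paper does not report the value it used:

```python
from math import sqrt
from statistics import NormalDist

def min_detectable_diff(n_per_group, sd, alpha=0.05, power=0.80):
    """Smallest true between-group difference detectable with the given
    per-group n, via the normal approximation for two independent means."""
    z = NormalDist().inv_cdf
    return (z(1 - alpha / 2) + z(power)) * sd * sqrt(2 / n_per_group)

# With ~25 residents per group and an assumed SD of 15 percentage points,
# the detectable difference lands near the 12% the study reports.
print(round(min_detectable_diff(25, 15.0), 1))  # → 11.9
```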

We analyzed postintervention scores for the 2 outcome skills (examination technique and correct identification of key findings) with 2 separate general linear models. For examination technique we considered postgraduate year, previous cardiovascular training, and general medical knowledge (ITE score) as potential confounders. None of these factors had an independent effect on the association between teaching method and outcomes, so they were not included in the final model. For key findings we proposed the same 3 factors as potential confounders plus hearing acuity, stethoscope quality, and musical training. When tested, only stethoscope quality had an effect and was included as a covariate. Both models controlled for preintervention score as a covariate and for the specific patients examined as a random effect. Differences between the 3 groups are reported as adjusted mean differences in the percentage of correct items on the OSCEs. Dichotomous outcomes were compared using the χ2 test. We calculated all statistical tests using STATA, version 8 (STATA Corp, College Station, Tex). All P values are 2-sided.
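The adjustment described above (postintervention score regressed on pretest score plus group indicators) can be sketched on synthetic data. This simplified OLS version omits the random patient effect and the stethoscope covariate the authors included, and all numbers below are simulated, not the study's:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic residents: pretest scores plus group effects mimicking the design
# (0 = control, 1 = CD, 2 = DP), with simulated gains of 0, 10, and 12 points.
n = 25
groups = np.repeat([0, 1, 2], n)
pre = rng.normal(50, 10, size=3 * n)
true_gain = np.array([0.0, 10.0, 12.0])[groups]
post = 5 + 0.8 * pre + true_gain + rng.normal(0, 5, size=3 * n)

# Design matrix: intercept, pretest covariate, and dummies for CD and DP.
X = np.column_stack([
    np.ones(3 * n),
    pre,
    (groups == 1).astype(float),
    (groups == 2).astype(float),
])
beta, *_ = np.linalg.lstsq(X, post, rcond=None)

# beta[2] and beta[3] are the adjusted mean differences vs. control,
# which should recover values near the simulated gains of 10 and 12.
print(f"CD vs control: {beta[2]:.1f}  DP vs control: {beta[3]:.1f}")
```

Controlling for the pretest score in this way is what lets the between-group differences be reported as "adjusted" mean differences rather than raw posttest gaps.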

RESULTS

Participation

Of the 80 residents enrolled in the study, 75 completed both the pre- and postintervention OSCEs (94% follow-up). The 5 residents lost to follow-up came from all 3 groups and did not differ from participating residents on baseline characteristics.

Baseline Performance

There were no differences between the 3 groups on any baseline measure: gender, year of training, hearing, prior cardiovascular or musical training, stethoscope quality, or ITE score. Scores on the preintervention OSCEs for examination technique and identification of key findings were also similar across the 3 groups.

Postintervention Performance

Overall. Figure 2 shows that after the structured teaching both intervention groups significantly improved their technical examination skills compared with the control group. After adjusting for preintervention scores, there was an increase of 12% (95% confidence interval [CI] 6% to 19%) in measured skills for the DP versus control group (P=.001). This translates into the correct performance of an additional 4 examination skills (an average of 2 per patient) by each resident who received the DP teaching. The adjusted increase for the CD versus control group was 10% (95% CI 4% to 17%) (P=.004), corresponding to each resident in the CD group correctly performing an additional 3 examination skills across the 2 patients. The difference between the CD and DP groups was not significant.

Figure 2.  Mean scores (95% confidence interval) for examination technique by teaching method. Scores are adjusted for pretest scores and patients, with possible values from 0% to 100% (proportion of examination techniques correctly performed).

However, the improvements in identifying key findings were more modest. There was no significant improvement for DP versus control and the adjusted increase was 5% (95% CI 2% to 9%) for CD versus control (P=.046). This difference corresponds on average to the correct reporting of 2 additional key findings (average of 1 clinical finding per patient) by each resident who received the CD teaching (Fig. 3).

Figure 3.  Mean scores (95% confidence interval) for identification of key findings by teaching method. Scores are adjusted for pretest scores, patients, and stethoscope quality, with possible values from 0% to 100% (proportion of key findings correctly identified).

Individual Examination Techniques

Residents' abilities to perform specific examination maneuvers varied. They uniformly performed poorly in correctly assessing the jugular venous pulse (JVP) (11% overall at baseline, 35% overall postintervention) and in auscultating in the sitting position at end expiration (11%, 14%). In contrast, residents almost uniformly identified the PMI location correctly (93%, 99%). Their abilities to demonstrate assessment for a right parasternal heave (43%, 47%) and auscultation in the left lateral decubitus position (39%, 49%) were intermediate. Measuring the JVP and auscultating in the left lateral decubitus position improved significantly in the DP group postintervention (P=.04 for JVP; P<.01 for left lateral decubitus auscultation).

DISCUSSION

We have shown that a structured program of bedside teaching of the cardiac exam does produce an improvement in the performance of the correct examination techniques, creating the optimal conditions in which to elicit and interpret key examination findings. However, our enthusiasm for this success is tempered by the relatively modest improvement in the identification of key findings. Our results show that a short intensive course will increase residents' technical skills so that they perform about 65% of maneuvers correctly (or an increase of 3 to 4 correctly performed examination techniques from their baseline). Disappointingly, this improvement does not translate into a clinically significant increase in the recognition of key findings (only residents who had received CD teaching identified, on average, an additional 2 key findings). In fact, a striking observation is that like other investigators,1,7 we have shown that residents' overall performance is poor—our study demonstrates the detection of key findings rising from only 40% to 45% of those potentially identifiable (Fig. 3). Interestingly, we found that stethoscope quality was a significant (P=.02) independent predictor of residents' score on the correct identification of key findings.

This should not come as a surprise, as perceptual interpretations are a more challenging aspect of the cardiac exam than rote performance of maneuvers.18 However, being able to identify key findings is what really matters clinically. We think our findings of inadequate skills in this area may help explain why trainees perform so poorly in diagnosing cardiac conditions using a variety of outcome measures—cardiac simulators,3,4 computer training programs,19 audiotapes,1 or real patients.8 If a resident does not recognize a loud P2 or detect a right ventricular heave, it is unlikely they will be able to diagnose pulmonary hypertension at the bedside. Perceptual interpretations of key findings have been shown to be directly influenced by the clinical context and framework the learner is considering,20,21 i.e., you see or hear what you are looking or listening for. It would have been interesting to see if providing the residents with some history or other clues at the OSCE stations would have improved the success of the teaching intervention.

There are some limitations to our study. First, the trial was not randomized, as it was difficult to randomize residents already established in a firm-based structure; however, it was a controlled, single-blinded study with no measurable differences between groups at baseline. Second, we used real patients and 2 OSCE stations to measure outcomes; although the stations were standardized, using different patients meant the key findings differed between the pre- and posttests. Unlike simulators or computers, actual patients (although more realistic) make it harder to ensure uniformity and reproducibility of clinical findings. Third, our gold standard for key findings was a consensus of experienced clinicians rather than phonocardiograms or echocardiographic findings, because visual, palpable, and acoustic phenomena were all measured. In addition, the teaching totaled only 6 hours, which may have been insufficient. Finally, we did not assess the durability of the intervention effect.

There are some grounds for optimism. We see our study as a first step in understanding the continuum from acquiring technical skills to identifying key findings, interpreting their meaning and, ultimately, arriving at a diagnosis. Ours is the first study to show that a short bedside teaching intervention can significantly improve technical skills of residents and suggests that 1 teaching method has the potential to improve the identification of key findings. Why was the innovative CD method possibly better than traditional DP in this regard? The CD method of teaching may leave learners less susceptible to some of the pitfalls that lead to errors in identifying and interpreting findings.22 For example, CD learners might be less likely to stick with their initial impressions (anchoring heuristic) and be less vulnerable to premature closure. On the other hand, they could be more open to alternative perspectives (framing effects) and less intimidated by authority (blind obedience). These findings need to be tested in different clinical settings and in other institutions to see if they are reproducible. Clearly, more work remains to improve the teaching of pertinent physical examination skills and assure their relevance for future trainees.23,24

Acknowledgments

We would like to thank Drs. Peter Hart, Maurice Lemon, and Craig Siegel for teaching the residents; Sharon Fung, our study nurse, for her assistance in implementing the study; and the patients and housestaff of the John H. Stroger Jr. Hospital of Cook County who participated in the study.

Financial support: This study was performed as part of a Faculty Development Program in Clinical Epidemiology and Research sponsored by the Collaborative Research Unit of John H. Stroger Jr. Hospital of Cook County and Rush Medical College.

REFERENCES

  1. Mangione S. Cardiac auscultatory skills of physicians-in-training: a comparison of three English-speaking countries. Am J Med. 2001;110:210–6.
  2. Li JT. Assessment of basic physical examination skills of internal medicine residents. Acad Med. 1994;69:296–9.
  3. St Clair EW, Oddone EZ, Waugh RA, Corey GR, Feussner JR. Assessing house staff diagnostic skills using a cardiology patient simulator. Ann Intern Med. 1992;117:751–6.
  4. Oddone EZ, Waugh RA, Samsa G, Corey R, Feussner JR. Teaching cardiovascular examination skills: results from a randomized controlled trial. Am J Med. 1993;95:389–96.
  5. Mangione S, Nieman LZ, Gracely E, Kaye D. The teaching and practice of cardiac auscultation during internal medicine and cardiology training. A nationwide survey. Ann Intern Med. 1993;119:47–54.
  6. Dupras DM, Li JT. Use of an objective structured clinical examination to determine clinical competence. Acad Med. 1995;70:1029–34.
  7. Mangione S, Nieman LZ. Cardiac auscultatory skills of internal medicine and family practice trainees. A comparison of diagnostic proficiency. JAMA. 1997;278:717–22.
  8. Favrat B, Pecoud A, Jaussi A. Teaching cardiac auscultation to trainees in internal medicine and family practice: does it work? BMC Med Educ. 2004;4:5.
  9. Wiener S, Nathanson M. Physical examination. Frequently observed errors. JAMA. 1976;236:852–5.
  10. Mangione S, Burdick WP, Peitzman SJ. Physical diagnosis skills of physicians in training: a focused assessment. Acad Emerg Med. 1995;2:622–9.
  11. Wray NP, Friedland JA. Detection and correction of house staff error in physical diagnosis. JAMA. 1983;249:1035–7.
  12. Goetzl EJ, Cohen P, Downing E, Erat K, Jessiman AG. Quality of diagnostic examinations in a university hospital outpatient clinic. Ann Intern Med. 1973;78:481–9.
  13. Paauw DS, Wenrich MD, Curtis JR, Carline JD, Ramsey PG. Ability of primary care physicians to recognize physical findings associated with HIV infection. JAMA. 1995;274:1380–2.
  14. Reilly BM. Physical examination in the care of medical inpatients: an observational study. Lancet. 2003;362:1100–5.
  15. Wilkerson L, Irby DM. Strategies for improving teaching practices: a comprehensive approach to faculty development. Acad Med. 1998;73:387–96.
  16. Bruffee KA. Collaborative Learning: Higher Education, Interdependence, and the Authority of Knowledge. 2nd ed. Baltimore: Johns Hopkins University Press; 1998.
  17. Feinstein A. Clinical Judgment. Baltimore: Williams and Wilkins; 1967:18, 321–49.
  18. Conn RD. Cardiac auscultatory skills of physicians-in-training: comparison of three English-speaking countries. Am J Med. 2001;111:505–7.
  19. Stern DT, Mangrulkar RS, Gruppen LD, Lang AL, Grum CM, Jude RD. Using a multimedia tool to improve cardiac auscultation knowledge and skills. J Gen Intern Med. 2001;16:763–9.
  20. Bordage G. Why did I miss the diagnosis? Some cognitive explanations and educational implications. Acad Med. 1999;74:S138–43.
  21. LeBlanc VR, Brooks LR, Norman GR. Believing is seeing: the influence of a diagnostic hypothesis on the interpretation of clinical features. Acad Med. 2002;77:S67–9.
  22. Redelmeier DA. The cognitive psychology of missed diagnoses. Ann Intern Med. 2005;142:115–20.
  23. Mangione S, Peitzman SJ. Physical diagnosis in the 1990s: art or artifact. J Gen Intern Med. 1996;11:490–3.
  24. Reilly BM, Smith CA, Lucas BP. Physical examination: bewitched, bothered and bewildered. Med J Aust. 2005;182:375–6.