Moving Beyond Confidence and Competence: Educational Outcomes Research in Emergency Medicine

Authors

  • Chad Kessler, MD, MHPE

  • John H. Burton, MD


  • The authors have no disclosures or conflicts of interest to report.

  • Supervising Editor: David C. Cone, MD.

This edition of the Council of Emergency Medicine Residency Directors (CORD)/Clerkship Directors in Emergency Medicine (CDEM) supplement represents the third installment of this education-focused collaboration with Academic Emergency Medicine (AEM). The editors agree that these have been a remarkable 3 years for educators in emergency medicine (EM), with the supplement capturing many of the efforts, interests, and research within the field. In this respect, the supplement represents a valuable and maturing product for academicians in EM.

Maturity is a timely issue to consider for education research in EM. Specifically, the question of “So what, who cares?” merits a brief editorial. “So what?” reigns supreme in academic work, prompting perennial reflection on the value of published works such as the AEM CORD/CDEM supplement.

So what if residents prefer simulation? Do they actually learn more this way, and do patients benefit? So what if postgraduate learners engage in asynchronous learning? Are they able to apply what they learn when treating patients in the emergency department (ED)? So what if learners know more about a given topic after receiving a well-designed educational intervention? Does this translate to better patient care? Who cares about resident attitudes or differences in site curricula, particularly in the context of a survey? How do these attitudes and differences affect the lives and outcomes of the next ED patient? So what? Who cares?

Projects that assess physician skill acquisition and knowledge have value. This value is limited, however, and becomes significant only as part of a larger effort to demonstrate differences in clinical patient outcomes. In this edition of the CORD/CDEM supplement, Biese et al.1 investigate the effect of an educational intervention on geriatric health. After assessing residents’ attitudes and testing their knowledge, the researchers demonstrated that the number of inappropriately placed urinary catheters decreased significantly.1 Another recent study investigated the effect of education and training on the prevention of catheter-associated bloodstream infections.2 Warren et al.2 assessed knowledge acquisition and retention and then demonstrated that infection rates decreased significantly in the intervention group. As a direct result of the educational intervention, projected savings were estimated at $100,000 to $1,000,000.2 These studies are examples of education research that extends beyond confidence and competence to address something more meaningful and reproducible for the patients in our care.

We believe that better-trained physicians can improve the quality of care and the health of patients. We have also long assumed that educational intervention correlates with patient benefit. While this seems logical, we have been far too complacent in assuming, rather than demonstrating, that benefit. The medical education literature consistently falls short with respect to patient outcomes: “The research enterprise in medical education has been primarily focused on educational, rather than clinical, outcomes.”3,4 The “So what, who cares?” questions remain unanswered.

In one review of medical education research, only four publications among 600 medical education articles were found to assess patient clinical outcomes. The remaining articles, some 99%, assessed common education endpoints such as student satisfaction and knowledge acquisition.5

Why is there such a dearth of clinical outcomes in medical education research? In short, pursuing clinical outcomes in education research is difficult, really difficult. Inexperienced researchers in medical education often lack like-minded colleagues to consult and financial resources to support their work. While the educational intervention is usually a clear independent variable, determining “important” clinical outcomes for assessment can be a daunting task. Traditional methodologies and background research are typically lacking for EM medical education projects. Furthermore, research networks and funding opportunities are uncommon.

Despite these obstacles, the challenge of moving beyond confidence and competence cannot be ignored. Our next step in EM is clinical outcomes-based medical education research that assesses critical gaps in patient safety, medical errors, and resource utilization.6 Researchers must embrace a mission to incorporate clinical measurement methods and patient outcomes assessment when structuring educational interventions.

Having the vision to take that next leap in medical education maturation is crucial to our specialty. Educators and researchers must view patient-relevant endpoints as essential. This vision will lead to logical and predictable changes in research design and methodology. To facilitate outcomes research, we must employ, and perhaps develop, valid and reliable tools for measuring relevant outcomes. Such tools and data should be sufficiently free from random error and confounders that they actually measure what we purport them to measure.7 Sound methodology and structural content validity must be sought, with reliability and generalizability kept in mind. Statistical methods, such as generalizability theory, need to be further explored to deal with the spurious events and confounding variables that plague clinical education research.6
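For readers less familiar with generalizability theory, a minimal sketch of its core quantity may be useful (standard G-theory notation; the facets named here are illustrative). The dependability of a measurement for relative decisions is summarized by the generalizability coefficient

\[
E\rho^{2} = \frac{\sigma^{2}_{p}}{\sigma^{2}_{p} + \sigma^{2}_{\delta}},
\]

where \(\sigma^{2}_{p}\) is the variance attributable to the persons being measured (e.g., learners) and \(\sigma^{2}_{\delta}\) is the relative error variance pooled across measurement facets such as raters, cases, and occasions. Estimating these variance components makes explicit how much of an observed score reflects true differences among learners rather than measurement noise.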

Not every study can (or should) assess mortality endpoints. There are other tangible outcomes (e.g., patient satisfaction) and process measures (e.g., length of stay in the ED) that can be studied, with demonstrated or logical relevance to patient outcomes. At the health care provider level, educational training may help to improve relationships and communication. In the community or health care setting, we can ask whether our interventions reduce health care costs or resource utilization. Data on successful educational interventions can prioritize educational and institutional objectives, which can further improve practices and outcomes.8

As education research discussions and efforts become more patient-centered, editors at journals interested in publishing these works will adapt as well. The published case series or curriculum survey of today is a “So what?” in the rejection bin tomorrow. Evolution in the publication of academic efforts predictably occurs in both the hunted and the hunters, one might opine.

Study outcomes must move beyond confidence and competence to incorporate clinical endpoints that directly address whether education-based questions and enhancements affect patient morbidity and mortality. Academic emergency medicine reviewers and editors will expect more and better with regard to methods and endpoints in this body of work. The maturing published work within the CORD/CDEM AEM education supplement suggests that academicians are indeed evolving to the next level in their queries.

The authors thank Albert Vein for his thoughtful research and review of this work.
