
ACADEMIC EMERGENCY MEDICINE 2011; 18:60–67 © 2011 by the Society for Academic Emergency Medicine

Abstract

This review aims to summarize the current literature on the effects of direct clinical observation of residents in emergency departments (EDs) on learners, patients, and departmental functioning. A systematic literature search was conducted in Medline and ERIC, covering the years 1980–2009. Keywords were used to identify postgraduate medical staff working in the ED; direct observation of these trainees by supervising staff; and reports of outcomes relating to Kirkpatrick's levels of reaction, learning, behavior, and institutional change. From an initial 11,433 abstracts and titles, 193 full-text articles were retrieved for further study. Application of inclusion and exclusion criteria yielded seven that were relevant to the topic. These studies comprised a range of methods: descriptive studies, qualitative evaluation, cohort studies, and a cross-sectional survey. Learner reaction was very enthusiastic. Positive changes in behavior due to feedback provided during direct observation were suggested by two studies. A single study evaluated trainees' perceptions of patient outcomes and noted that thorough assessments and improved management decisions may come at the expense of slower patient throughput and the diversion of senior staff from direct patient care. Three studies noted the resource-intensive nature of direct observation. Direct observation of clinical practice may be useful in ED education; however, further research is required to evaluate its effects.

"If musicians learned to play their instruments as physicians learn to interview patients, the procedure would consist of presenting in lectures or maybe in a demonstration or two the theory and mechanisms of the music-producing ability of the instrument and telling him to produce a melody. The instructor, of course, would not be present to observe or listen to the student's efforts, but would be satisfied with the student's subsequent verbal report of what came out of the instrument."

--George Engel, after visiting 70 medical schools in North America1

Emergency departments (EDs) are important places of learning for junior doctors. A recent Australian study examining intern activities suggests that in an 8-week period, on average, an intern would see more than 200 patients, perform approximately 300 procedures, and consult senior ED staff on almost 700 occasions.2 When senior ED staff are consulted, a junior doctor’s management plan is often altered.3

Traditionally, planned education for emergency medicine (EM) residents has consisted of tutorials and lectures. Even with protected teaching time (where residents are excused from clinical duties), attendance at teaching is variable due to shift work, holidays, and study leave. This leads to difficulty in ensuring uniform coverage of core topics and to concerns that learners are becoming disengaged.4

During a clinical shift, teaching "on the floor" usually takes the form of ad hoc case presentations,5 with a resident seeing a patient independently, followed by discussion of the case with a senior clinician, either one to one6 or in a forum such as a "board round,"7 where cases are presented to a group of colleagues. However, in the author's experience, it is difficult to obtain information on particular aspects of a resident's interaction with a patient. Thoroughness of history and physical examination, empathy, communication skills, procedural skills, and decision-making at the bedside all remain unseen, yet are greatly important for patient care.

The importance of these skills has recently been acknowledged by the Australasian College for Emergency Medicine (ACEM), resulting in alterations to in-training assessment forms.8 These are completed at the end of each clinical rotation (3- to 6-month period) and are used to assess a range of competencies. They include knowledge and basic skills, clinical judgment, practical skills, professional relationships and communication, ability to perform under stress and different workloads, sense of responsibility and work ethic, and various other aspects of performance.9 It would appear that assessment of specific factors, such as "interaction with patients and families," "compassion/empathy," and "gentleness of technique," would be impossible without direct observation of practice.

Previous work has demonstrated that senior medical staff will often alter patient management plans determined by residents. Sacchetti et al.3 found that one in every 25 patients had a “major” change from the residents’ proposed plan of care after review of the patient by an attending emergency physician (EP). These changes included altered disposition, detection of unsuspected pathology, or a marked revision of intended treatment. Another 33% of all patients had minor changes made to their management after review by an EP.3

The difference between management plans determined by junior and senior physicians may have significant implications. Wyatt et al.10 found that actual survival of patients admitted to four Scottish hospitals with traumatic injuries was significantly higher for those managed by an emergency consultant than for those managed by a junior doctor. This was despite the patients managed by EPs having more severe injuries and a higher statistical likelihood of death.

As well as being associated with improved patient outcomes, increasing seniority of ED clinical staff is associated with improved ED efficiency. This has recently been demonstrated by Harvey et al.,11 who found that a residents' strike was associated with reduced ED waiting times and more rapid decisions for admission and discharge.

Direct observation is emerging as a potentially important educational technique in EM. Deficiencies in performance of clinical skills or procedures can be identified and appropriate feedback given.12 Potential benefits include provision of higher-quality care, improved patient safety, and enhanced confidence of junior medical staff. However, published surveys indicate that many EM residents are rarely observed in their day-to-day work.13

Emergency departments are often chaotic, overcrowded places, with large numbers of patients suffering undifferentiated illnesses, some with critical illness. Supervising EPs are subject to multiple distractions and are busy overseeing junior doctors and nursing staff and managing the “flow” of the department, as well as their “own” patients.

Given these constraints, it is likely that some models of clinical supervision have not been fully applied in the ED setting. Medical staff working in other less hectic environments, such as the operating room or the intensive care unit, may have different experiences of direct supervision. It may be, however, that some methods of direct supervision of junior staff—initially used in a more controlled setting—have some applicability to postgraduate education in EM.

Primary and Secondary Review Aims


The primary aim of this review was to summarize the current literature on the effects of direct clinical observation of residents in EDs on learners, patients, and departmental functioning. Secondary aims were to 1) describe the characteristics of, and learner reaction to, published models of direct observation that have been applied to postgraduate medical education in EDs; 2) identify other models of direct observation that have been applied to postgraduate medical education in selected areas of acute medical training outside of EM (anesthesia, intensive care); 3) develop recommendations for the current practice of direct observation in postgraduate EM training; and 4) suggest areas for further research.

Methods


Study Design

This was a literature review using the Medline and ERIC databases.

Search Strategy and Sources of Papers

All English-language articles published in peer-reviewed journals from 1980 to 2009 were included in the search. Table 1 presents an overview of the search terms used. The reference lists of articles generated by the search were consulted to identify additional papers not discovered during the initial search. Titles and abstracts of all papers identified through the database searches were reviewed to ascertain whether they related to direct observation of clinical practice in the setting of EM, anesthesia, or intensive care. The full text of selected articles was retrieved whenever possible. The inclusion and exclusion criteria were then applied, leading to the final selection of articles.

Table 1. Search Terms Used

| Population/Participants | Intervention | Outcomes |
| --- | --- | --- |
| trainee* OR | Mini-CEX OR | react* OR |
| registrar* OR | DOPS OR | knowledge OR |
| resident* OR | SDOT | skill* OR |
| intern or interns OR | observ* OR | attitud* OR |
| HMO* OR | feedback OR | learn* OR |
| hospital medical officer* OR | superv* OR | teach* OR |
| SHO* OR | teach* OR | behav* OR |
| senior hospital officer* | direct or bedside OR | result* OR |
| "physicians" OR | bed-side OR | organisat* OR |
| "graduate medical education" OR | immediate OR | organizat* OR |
| "medical services" OR | clinical | chang* |
| "medicine" | "clinical experience" OR | |
| casualty OR | "practicum supervision" OR | |
| emergency OR | "clinical teaching health professions" OR | |
| emergency* OR | "work experience programs" OR | |
| A&E OR | "experiential learning" OR | |
| accident and emergency | "graduate medical education" OR | |
| anaesthetics OR | "internship programs" OR | |
| anesthesia OR | "medical education" OR | |
| anaes* OR | "competence" OR | |
| anes* OR | "evaluation" OR | |
| theat* | "job performance" OR | |
| intensive care OR | "personnel evaluation" OR | |
| ICU OR | "professional personnel" | |
| intensive care unit OR | | |
| ITU | | |
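Conceptually, the three columns of Table 1 combine as the intersection (AND) of three OR-groups: population AND intervention AND outcomes. The review does not publish its exact query strings, so the following Python sketch is purely illustrative (term lists are abridged, and real database interfaces differ in syntax); it shows how such a boolean string might be assembled:

```python
# Illustrative only: not the authors' actual Medline/ERIC query syntax.
# Each Table 1 column is an OR-group; the full search ANDs the three groups.
population = ["trainee*", "registrar*", "resident*", "intern", "interns",
              "emergency*", "anesthesia", "intensive care"]   # abridged
intervention = ["Mini-CEX", "DOPS", "SDOT", "observ*", "feedback",
                "superv*", "bedside"]                         # abridged
outcomes = ["react*", "knowledge", "skill*", "attitud*", "learn*",
            "behav*", "organizat*", "chang*"]                 # abridged

def or_group(terms):
    """Join terms into a parenthesized OR-group, quoting multiword phrases."""
    quoted = [f'"{t}"' if " " in t else t for t in terms]
    return "(" + " OR ".join(quoted) + ")"

query = " AND ".join(or_group(g) for g in (population, intervention, outcomes))
print(query)
```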

Selection Parameters

Studies involving postgraduate medical staff working and training in the ED, ranging from interns (in their first postgraduate year) to senior EM residents, were included. To allow valid comparison with the training of EM residents, other areas of medicine with similar exposure to resuscitation and stabilization were selected. Although all areas of acute hospital medicine have some relevance to EM, this review selected postgraduate junior medical staff working in anesthesia and intensive care; doctors in these areas frequently apply knowledge and skills that closely overlap with those used in the ED.

Studies examining direct observation of medical students, specialist physicians, or postgraduate medical education in areas outside emergency, intensive care, or anesthesia were excluded. Additionally, studies examining postgraduate education of other health care professionals (e.g., nursing, physiotherapy) were excluded.

Interventions reporting the effects of a supervising physician directly observing clinical practice by junior medical staff were selected for this review. Studies of clinical supervision or other educational interventions that did not involve direct observation of practice were excluded.

Kirkpatrick’s model of educational outcomes14 was used to classify and analyze outcomes. The model describes four nonhierarchical levels of learning (Figure 1) and has been widely used in other reviews of educational interventions. Studies that reported on prevalence of direct observation but did not refer to any educational outcomes were excluded from the review.

Figure 1. Kirkpatrick's model of educational outcomes.
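To make the classification concrete, here is a minimal, hypothetical sketch of mapping outcome statements onto Kirkpatrick's four levels. The example statements are invented for illustration and are not drawn from the included studies:

```python
# Kirkpatrick's four (nonhierarchical) levels, as used to classify outcomes.
KIRKPATRICK_LEVELS = {
    1: "Reaction (learner satisfaction)",
    2: "Learning (knowledge, skills, attitudes)",
    3: "Behavior (change in clinical practice)",
    4: "Results (institutional change, patient outcomes)",
}

# Hypothetical outcome statements, each tagged with the level it would map to.
reported_outcomes = [
    ("Learners rated the observed shift positively", 1),
    ("Post-session testing showed improved knowledge", 2),
    ("Subsequent shifts showed more complete documentation", 3),
    ("ED waiting times fell after program rollout", 4),
]

for statement, level in reported_outcomes:
    print(f"Level {level} - {KIRKPATRICK_LEVELS[level]}: {statement}")
```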

Included Study Designs

In the part of the review examining direct observation in the ED, all study designs were included to obtain a comprehensive understanding of the application of direct observation in this setting. For studies exploring direct observation in the operating room and intensive care unit, the review was restricted to those that included outcome data at Kirkpatrick's levels 2, 3, or 4. Although learner reaction (Kirkpatrick's level 1) is important, it does not directly relate to learning. Therefore, it may not be sufficient to justify the application of a relatively scarce educational resource: an EP. The use of specialist physicians to observe and educate residents is unlikely to be revenue-neutral, and managers may require evidence that the additional cost leads to changes in knowledge, resident behavior, or patient outcomes. Additionally, the expectations and reactions of EM learners, residents who have chosen a field with a broad range of presentations and acuity, may differ from those of trainees working in other clinical areas. Therefore, studies of postgraduate trainees in anesthesia and intensive care were excluded if they reported data related only to Kirkpatrick's level 1.
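As a minimal sketch of the selection rule just described (the function name and setting labels are invented for illustration; the review applied these criteria by hand, not in software):

```python
def include_study(setting: str, kirkpatrick_levels: set) -> bool:
    """Sketch of the design-inclusion rule: ED studies qualify at any
    Kirkpatrick level; anesthesia/intensive care studies qualify only
    if they report outcomes beyond learner reaction (levels 2-4)."""
    if setting == "ED":
        return True
    if setting in ("anesthesia", "intensive care"):
        return bool(kirkpatrick_levels & {2, 3, 4})
    return False  # other settings were excluded from this review

# A reaction-only anesthesia study is excluded...
print(include_study("anesthesia", {1}))  # False
# ...but an ED study reporting only learner reaction is retained.
print(include_study("ED", {1}))          # True
```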

Quality Assessment

For the purposes of quality assessment, the studies are separated into the following groups: 1) qualitative and descriptive/observational studies, 2) cross-sectional surveys, and 3) cohort studies.

Qualitative and Descriptive/Observational Studies.  Qualitative and descriptive/observational studies were assessed using a critical review form for qualitative studies developed by Letts et al.15 Review items include study purpose, need for the study, study design, sampling, data collection and analysis, overall rigor, and conclusions/implications.

Cross-sectional Surveys and Cohort Studies.  The quality analysis of cross-sectional surveys and cohort studies was based upon the relevant checklists from an international collaborative group, STrengthening the Reporting of OBservational studies in Epidemiology (STROBE).16,17 These checklists address specific aspects of title, abstract, introduction, methods, results, discussion, and funding sources.

Results


Data Extraction

A total of 193 full-text articles were retrieved, and the selection criteria outlined above were applied. No additional original articles were identified by examination of the reference lists of the retrieved full-text articles (Figure 2). An overview of the seven articles that met all requirements for inclusion is presented in Table 2.12,18–23

Figure 2. Flow diagram of search yield.

Table 2. Overview of Articles Describing Effects of Direct Observation of Trainees

| Article | Summary of Study | Observers | Feedback Provided to Trainee | Learner Reaction |
| --- | --- | --- | --- | --- |
| Benenson and Pollack, 2003 (18) | Prospective observational study of emergency trainee performance in actual death notifications. | Trained evaluators (ED-employed patient representatives involved in caring for bereaved families) used a checklist to assess trainees. | Unsatisfactory performance prompted constructive feedback from supervising doctors. | Not reported. |
| Celenza and Rogers, 2006 (19) | Qualitative evaluation of the introduction of "clinical teaching shifts" to a West Australian ED. | Consultant and registrar formally allocated to teaching and learning roles for a 5-hour period. No specific data collected during the period of direct observation. | Opportunistic teaching and feedback provided at the bedside, determined by case mix. | Learners were apprehensive prior to commencement; however, a positive response was reported at the end of the program. Sessions were perceived as too long (2–3 hours suggested). |
| Cydulka et al., 1996 (12) | Descriptive study of the introduction of formal direct observation and evaluation of trainees for up to 4 hours during a clinical shift, four times each academic year. | EP (either residency coordinator or faculty advisor) used a checklist to assess trainee performance. | After the observed patient encounter, the trainee and supervisor discussed the clinical findings and developed a treatment plan. Immediate feedback provided to the trainee. | Example: "I was really dreading this day … when you were coming to follow me around, but as it turned out—nervous as I was—I really enjoyed the time, attention, and teaching." |
| Dorfsman and Wolfson (20) | Program of structured direct observation of EM residents during clinical shifts in the ED. | Three different EPs acted as observers over an 18-month period, in 4- to 5-hour shifts following and directly observing an ED resident; checklist filled in by observers during the clinical encounter. | Strengths and weaknesses noted, with protected time for feedback at the end of the session. | Perceived as extremely helpful and nonthreatening. Residents requested further teaching shifts and suggested adding shifts in which residents shadow an EP, seeing how senior staff manage their time efficiently. |
| Jouriles et al., 2002 (21) | Secondary data analysis of direct observation data forms completed over an 8-year period. | EPs used a checklist-based scoring system to assess the interpersonal skills of ED residents. | Not reported. | Positive: "considered to be a valuable experience." |
| Lloyd et al., 2000 (22) | Descriptive study of the use of direct observation of SHO–patient consultations, followed by feedback from a more senior doctor. | Senior ED trainees who had been specifically trained in giving feedback directly observed consultations by ED senior house officers; a purpose-designed feedback chart was used to record observations. | Verbal feedback and a copy of the feedback chart provided to the SHO after the consultation. | SHOs felt that they had benefited from the teaching experience and welcomed the feedback they received. |
| Shayne et al., 2002 (23) | Descriptive study of the introduction of protected clinical teaching time and a bedside clinical evaluation instrument in an EM training program. | Clinical evaluation exercise (CEE) providing a "snapshot" of a trainee's performance in a clinical encounter; data gathered through direct observation and recorded using a specific evaluation form. | Formal written and oral feedback from the supervising EP. | Not reported. |

SHO = senior house officer.

Quality Assessment

The quality of the included studies varied considerably. Common weaknesses of the descriptive and qualitative studies were the lack of an identified theoretical perspective, no mention of whether sampling continued until redundancy was achieved, and the lack of a clear decision trail. The cross-sectional survey and cohort studies were of higher quality.

Effects of Direct Observation on EM Residents, Patients, and EDs

An overview of the results for the four papers relating to the primary aim is presented in Table 3.12,18,19,21 The available evidence suggests that direct observation of clinical practice is beneficial to trainees' learning: positive changes in knowledge, skills, or attitudes were cited in all studies. However, these benefits were not demonstrated by examination or test scores; they were based on trainee self-report.

Table 3. Reported Effects of Direct Observation of Clinical Practice in the ED

| Paper | Effects on Learning (Knowledge, Skills, Attitudes) | Effects on Behavior of Emergency Trainees | Effects on Patients | Effects on the ED |
| --- | --- | --- | --- | --- |
| Benenson and Pollack, 2003 (18) | NA | Feedback after direct observation led to improved performance in those initially assessed as "unsatisfactory." | NA | NA |
| Celenza and Rogers, 2006 (19) | Trainees reported learning clinical reasoning, clinical knowledge, and professional and communication skills. Obstacles identified included the busy department, interruptions, and sessions perceived as "too long."* | NA | More rapid decision-making and management decisions. Senior "second opinion." More thorough assessment. Patients feeling better cared for. Prolonged waiting times for other patients.* | Slower patient processing.* Resource-intensive ("nonclinical" use of senior staff).* |
| Cydulka et al., 1996 (12) | Opportunity for extended discussion and teaching at the bedside. Immediate corrective advice for identified problems/deficiencies. Identification of the underlying cause of "slow" or "underperforming" trainees. | Increased attention to detail with regard to clinical assessment and preparation for procedures. | NA | Resource-intensive use of EPs.* |
| Jouriles et al., 2002 (21) | Direct observation may be useful to assess trainees' interpersonal skills but needs further evaluation. Raw scores of trainees' performance did not discriminate between those with poor or good interpersonal skills;* standardization of these scores provided more accurate information. | NA | NA | NA |

NA = no information reported. *Negative effects.

Two studies report changes in behavior, as evidenced by repeated observation of practice demonstrating altered performance. Neither study evaluated whether these changes in behavior were sustained.

Only one study attempted to assess the effect on patient care. The perceived effect of the program on patient care was determined from the viewpoint of the trainees and supervisors involved. It was thought that a more thorough, careful assessment and plan were put in place, but at the expense of decreased efficiency (leading to prolonged waiting times for other patients). This was not supported by quantitative data. Three of the five studies highlighted the resource-intensive nature of direct observation of clinical practice: the use of senior staff to supervise and teach rather than to deliver patient care directly.

Published Models of Direct Observation in the ED

Table 4 summarizes the four papers that describe characteristics of various models of direct observation in the ED.12,18,19,23 The published models mostly rely on senior EPs to supervise postgraduate EM trainees. One study, examining the death notification skills of trainees, used ED-employed patient representatives specifically involved in the care of bereaved families.

Table 4. Overview of Published Models of Direct Observation

| Paper | Intervention | Data Collection |
| --- | --- | --- |
| Benenson and Pollack, 2003 (18) | Trained evaluators (ED-employed patient representatives involved in caring for bereaved families). | Checklist filled in by observers during the clinical encounter. |
| Celenza and Rogers, 2006 (19) | Consultant and registrar formally allocated to teaching and learning roles for a 5-hour period. Opportunistic teaching at the bedside, determined by case mix. No specific data collected during the period of direct observation. | Learner and teacher reactions assessed using a questionnaire. |
| Cydulka et al., 1996 (12) | EP (either residency coordinator or faculty advisor). After the observed patient encounter, the trainee and supervisor discussed the clinical findings and developed a treatment plan. Immediate feedback provided to the trainee. | Checklist filled in by observers during the clinical encounter. |
| Shayne et al., 2002 (23) | Clinical evaluation exercise (CEE) providing a "snapshot" of a trainee's performance in a clinical encounter. Data gathered through direct observation, with immediate feedback provided to the trainee. | Checklist filled in by observers during the clinical encounter. |

Three of the five studies reported the use of (and provided examples of) checklists, which were filled in by the observer during the clinical encounter. One example of such a checklist can be found in Shayne et al.23 Although some learners were initially apprehensive, all studies that measured learner reaction report an enthusiastic response to direct observation.

Direct Observation in Anesthesia and Intensive Care

A single paper was identified that examined the effect of the presence of a consultant anesthetist on patient outcomes in the setting of emergency intubation.24 This was a prospective cohort study of 322 consecutive patients who required emergency tracheal intubation by anesthesia trainees. Instances where a consultant anesthetist was present were compared with instances where there was no direct senior supervision. Respiratory therapists assisting with the intubation recorded data on patient demographics, clinical management, and immediate complications. Additional data on patient outcomes were extracted from the patients' medical records.

There were no differences between the groups at baseline. Supervision of anesthesia trainees by a consultant anesthetist was associated with a significant decrease in the rate of immediate complications (6.1% vs. 21.7%, p < 0.0001), particularly aspiration (0.9% vs. 5.8%, p = 0.037). There were no differences in patient outcomes at 28 days. The paper does not report any information on learning experiences, changes in trainee behavior, or changes in organizational practice.

Discussion


When compared to interns and residents, attending EPs are more efficient at patient care,11 and their presence has been associated with improved patient outcomes.10 A similar association of senior staff presence with improved patient outcomes has been noted for emergency airway management by anesthesia trainees.24 Direct supervision of residents has high face validity—they are observed performing tasks in a clinical environment.

This review identified a small number of studies describing the effects of direct observation as an educational intervention in the ED. The papers in this review comprise descriptive and observational studies, qualitative studies, one survey, and two cohort studies. No study fulfilled all the criteria in the relevant checklist for methodologic rigor. All reported learning outcomes are "positive," based on the self-report of participants. The lack of more definitive information on the effects of learning through direct observation of clinical practice is a weakness of the current literature in this area.

None of the studies were able to demonstrate a sustained change in behavior. This may be due to the difficulty in attributing behavior change to a single educational intervention. Additionally, it has been recognized for many years that people, once they are aware they are being observed, intentionally or unintentionally modify their behavior. This “Hawthorne effect”25 may result in trainees attempting to perform at their “best,” rather than practice at their usual level. It has been suggested that by making direct observation “routine” and repetitive, this effect may be minimized.6 Emergency clinicians have previously raised concerns about the emphasis in the medical education literature on learner reaction at the expense of the “correct” end points of quality patient care or clinical outcome measures.26

Direct observation and bedside supervision appear to have conflicting effects on the ED as a whole—individual patient care may improve due to earlier involvement of senior staff and improved decision-making, but departmental efficiency may be reduced due to the diversion of senior staff to direct clinical supervision and the decreased speed with which patients are seen. This resulting delay in care may be detrimental to patients.

Despite this, over time there may be a greater increase in efficiency and clinical skills for those trainees who are provided with higher levels of supervision and feedback. However, there is no perfect measure for these aspects of performance, and results may be confounded by multiple other factors, such as trainee motivation, didactic teaching, and clinical experiences. It remains to be seen whether an “up-front investment” in trainees results in better patient outcomes.

All studies that mentioned effects on the ED highlighted the high cost of direct observation in terms of senior clinician time, which comes at the expense of either direct patient care or other administrative tasks. When a significant investment of clinician time is required for minimal proven clinical benefit, and given the current financial state of many hospitals, managers are unlikely to write a blank check to continue a program on purely educational grounds.

Future research in this area must attempt to identify clinically relevant outcomes that can be attributed to the educational intervention. This is likely to be difficult—patient outcomes are influenced by multiple health systems and clinical teams, premorbid health conditions, and other factors.27 Despite this difficulty, it is important to demonstrate positive effects on patient care or ED processes to ensure ongoing support for successful educational programs in a resource-poor environment.

Models of Direct Clinical Supervision in the ED

Direct clinical supervision has the potential to positively affect patient outcomes and trainee development, especially when combined with focused feedback.28 The published models of direct clinical supervision range from checklist-based assessment and specific feedback to a "clinical teaching shift" geared toward enhancing the learning experience of trainees in a more fluid way.

All published models were positively received by trainees, although initial learner apprehension was reported. Where apprehension was noted, it was quickly replaced by appreciation for the individualized attention and the excellent learning opportunities that direct observation offered. None of the identified studies compared different models of direct clinical supervision, so at this stage no recommendations can be made on a particular model; this could be a focus for future research.

Future Research

There is little published research on the effects of direct observation in the ED, and there are several potential avenues for further research. First, we should examine effects other than learner reactions. It is important to collect supplementary data to determine whether direct observation in the ED has a positive effect on learning, trainee behavior, patient outcomes, or ED processes. Positive results in these areas may increase the likelihood of wider adoption of this educational method. Second, comparisons of different models of direct observation (checklist-based assessment and feedback, opportunistic teaching, or a combination) are needed, to determine the optimal method of delivery. Third, comparisons between direct observation and feedback by peers and by supervisors are also needed.

Limitations


The literature search and review were performed by a single author and included only English-language articles. It is possible that some important articles were missed during the review process; however, attempts were made to minimize this by checking the reference lists of retrieved papers and by using two separate database searches (ERIC and Medline).

The search terms and inclusion criteria were strictly applied. This may have led to underinclusion of other studies that are potentially relevant to EM. However, the aim of this review was to provide information pertinent to the implementation and effects of direct observation in EM. A comprehensive review of the use of direct observation in postgraduate medical education has been published elsewhere and found that reports on validity and description of educational outcomes were scarce.29

Conclusions


This article summarizes the current literature on the effects of direct clinical observation of residents in EDs on learners, patients, and departmental functioning. This educational method is resource-intensive, requiring senior staff to be free of a patient load and able to directly observe and supervise an intern or resident. Many papers identify a positive learner reaction, which may be used to justify the introduction of such a program. Additional outcomes include a perception of more thorough clinical assessment and improved management decisions, but these are achieved at the expense of slower overall patient processing. In a health care environment where resources are insufficient to meet all clinical priorities, further research is required to determine whether such programs have positive effects on learning, ED efficiency, or, most importantly, measurable patient outcomes.

References
