This study was presented at the annual meeting of the Society for Academic Emergency Medicine, New Orleans, LA, May 2009, and the annual meeting of the Pediatric Academic Societies, Baltimore MD, May 2009.
Assessment of a Training Curriculum for Emergency Ultrasound for Pediatric Soft Tissue Infections
Article first published online: 11 FEB 2011
© 2011 by the Society for Academic Emergency Medicine
Academic Emergency Medicine
Volume 18, Issue 2, pages 174–182, February 2011
How to Cite
Marin, J. R., Alpern, E. R., Panebianco, N. L. and Dean, A. J. (2011), Assessment of a Training Curriculum for Emergency Ultrasound for Pediatric Soft Tissue Infections. Academic Emergency Medicine, 18: 174–182. doi: 10.1111/j.1553-2712.2010.00990.x
This study was funded by the Nicholas Crognale Chair for Emergency Medicine, The Children’s Hospital of Philadelphia, Philadelphia, PA. We are grateful to Sonosite, Inc. for use of their ultrasound machine for the duration of the study.
Supervising Editor: Rob Reardon, MD.
- Issue published online: 11 FEB 2011
- Received August 1, 2010; revision received October 22, 2010; accepted November 3, 2010.
Objectives: The objective was to evaluate a training protocol for pediatric emergency physicians (EPs) learning emergency ultrasound (EUS) for the evaluation of skin and soft tissue infections (SSTIs) by assessing technical ability and interrater reliability.
Methods: Pediatric emergency medicine (EM) fellows and attending physicians completed a 1-day training course taught by an expert emergency sonologist. After the course, EPs performed proctored examinations on patients with SSTIs until they reached predefined performance criteria, after which they performed independent EUS examinations. All EUS examinations were recorded using still images and video clips that were reviewed and rated by the expert sonologist on four technical measures and combined into a composite score. The expert’s opinion regarding the presence or absence of an abscess was also compared to the study sonologist’s opinion and analyzed for interrater reliability.
Results: Seven EPs performed 107 EUS examinations. The mean (±SD) composite score for the evaluation of technical ability for the first EUS was 3.3 ± 0.14 (on a 4-point scale), indicating a high level of quality following the training course. There was a small amount of improvement in the quality score (0.015, 95% confidence interval [CI] = 0.0003 to 0.03) with each consecutive EUS examination. The interrater reliability between the sonologist and the expert for the presence of an abscess as measured by the kappa statistic was 0.80 (95% CI = 0.63 to 0.97), indicating substantial agreement.
Conclusions: After a brief training program, pediatric EPs can perform technically successful emergency EUS examination of SSTIs, with excellent agreement with an expert sonologist.
Bedside emergency ultrasound (EUS) is increasingly used as an adjunct to the clinical examination of emergency department (ED) patients. The Accreditation Council for Graduate Medical Education has made EUS a mandatory core competency in emergency medicine (EM) residency training.1 The American College of Emergency Physicians (ACEP) states that EUS requires that emergency physicians (EPs) become “knowledgeable in the indications for ultrasound, … competent in image acquisition and interpretation, and able to integrate the findings appropriately in the clinical management of patients.”2 According to a survey from 2003, 95% of EM residency programs incorporate EUS training as part of their curriculum.3 By comparison, pediatric training in EUS is less widespread. A survey of 46 pediatric emergency medicine (PEM) fellowship programs in 2006 indicated that 43% do not use EUS in their ED, and only 33% of PEM fellowships have a formal 2- to 4-week EUS rotation.4
ACEP recognizes that as a relatively new tool in EM, many practicing EPs may not have received residency training in the use of EUS. For this reason, the 2008 guidelines include EUS training recommendations for such physicians, including the endorsement of a single-day course format for physicians seeking to learn a single or multiple applications of EUS.2 However, despite the wide range of experts used to develop the consensus for the ACEP guidelines, there are very limited data regarding what comprises an adequate level of proficiency, how to assess it, and how much training and experience is needed to obtain it.
One application for EUS has been for the evaluation of skin and soft tissue infections (SSTIs) such as cellulitis and abscess, which are responsible for a large and increasing number of ED visits.5 Soft tissue EUS has been demonstrated to improve the accuracy of differentiating abscess from cellulitis in adult patients.6 At this time, in addition to the absence of educational guidelines focused on the needs of pediatric EPs and other EUS-naïve physicians learning EUS, there are, to our knowledge, no published studies of training curricula for soft tissue EUS for either adult or pediatric EPs. The objective of this study was to evaluate a training protocol for pediatric EPs performing EUS for SSTIs by assessing 1) their technical ability for performing soft tissue EUS and 2) the interrater reliability with an expert sonologist.
This was a study to assess an educational intervention as part of a larger prospective cohort study to evaluate emergency EUS use for SSTIs. The protocol received approval from the hospital’s institutional review board.
Study Setting and Population
The study was conducted at an urban tertiary care pediatric hospital with an annual ED census of approximately 90,000 patients. Enrollment took place from June 2008 through July 2009.
Seven PEM fellows and attending physicians served as study subjects for the educational intervention. Three had never used an EUS machine, three had brief experience using emergency EUS for the focused assessment with sonography for trauma (FAST) examination in adult trauma patients, and one had used EUS to assess bladder volume for catheterization.
Once the intervention was complete, the fellows and attendings performed EUS on patients between 2 months and 18 years old with a suspected SSTI. If a patient had more than one lesion at the time of the ED visit, up to three lesions were evaluated. Patients could only be recruited for the study once.
Study physicians completed a 6-hour training course (Table 1) developed and taught by an expert emergency sonologist with more than 10 years’ experience in the practice and teaching of EUS (AJD). The course consisted of lectures in a didactic interactive format, review of numerous video clips of normal and abnormal soft tissue EUS examinations, and hands-on scanning practice in which students were challenged to demonstrate optimized images of normal soft tissue anatomy. A pretest and a posttest, each including questions assessing both cognitive learning and image interpretation, were administered before and after the course, respectively.
| Session | Duration | Content |
|---|---|---|
| Pretest | 15 minutes | Multiple choice/fill-in questions |
| Physics | 30 minutes | Fundamentals of US wave physics; pulse-echo principle; angle of beam; acoustic impedance and tissue density; attenuation: absorption and scatter; transducer frequency: effect on resolution and penetration |
| Instrumentation | 45 minutes | Knobology; gain/attenuation; power, depth, and magnification; image orientation; real-time clips and images to save; image display |
| Soft tissue US | 1.5 hours | Normal skin and soft tissue anatomy; images of soft tissue pathology: cellulitis, abscess; pitfalls including lymph nodes, nerves, thrombus, foreign body |
| Proctored practice session | 2 hours | Ultrasound examinations performed on healthy student volunteers; refinement of technique; pitfalls; tips for maximum utilization |
| Review | 45 minutes | Review of video clips and cases |
| Posttest | 15 minutes | Multiple choice/fill-in questions; US scanning and identification of structures on volunteers |
| Bedside proctored examinations | | Minimum of five supervised ultrasound examinations on ED patients |
Following the training, all study physicians were proctored for a minimum of five consecutive bedside examinations on patients presenting to the ED with an SSTI. Senior-level EM residents and the study principal investigator, all of whom had satisfied the ACEP guidelines for residency-based EUS education,2 served as proctors for the bedside examinations. For each patient, the study physician performed a brief clinical examination followed by a standardized EUS examination under the direct observation of the proctor, who was instructed to refrain from giving the student any input or guidance. When the examination was complete, the student recorded his or her interpretation of the examination on the data sheet, after which the proctor gave immediate feedback on the EUS examination. Before being certified to perform soft tissue EUS independently, study physicians were required to “pass” (i.e., receive a rating of optimal or adequate; scale: optimal, adequate, minimally acceptable, unacceptable) on all aspects of the examination, including choice of transducer, adjustment of gain and depth, scanning technique, and interpretation, for 80% of their examinations; i.e., the study sonologist was allowed to “fail” only one scan out of five. If the study physician “failed” two or more scans, he or she was required to perform another five examinations. This process was continued until the study physician attained an 80% “pass” rate. For both proctored and independent EUS examinations, study physicians immediately documented their diagnoses (definitely no abscess, probably no abscess, uncertain, probable abscess, definite abscess) on data collection sheets. In addition, they were asked to assess the presence of factors that may have limited their ability to perform or interpret the EUS examination, such as an uncooperative patient, the location of the lesion, or suspected pain from the examination.
Overall, our training curriculum was comparable in length and design to other focused training programs described in the literature7–13 and was consistent with the length and format recommended by ACEP.2
Study physicians recorded video clips and still images (with caliper measurements if an abscess was identified) in two orthogonal planes on every lesion. These images were later reviewed by the expert emergency sonologist for technical measures as well as for diagnostic interpretation of the EUS (scale as above: definitely no abscess, probably no abscess, uncertain, probable abscess, definite abscess). The expert sonologist did not have access to clinical information about the patient except for the location of the lesion being scanned. In addition, the expert was blinded to the diagnostic impression of the study physician. We provided periodic feedback to the study physicians on their performance through video and image review sessions based on the expert’s comments and suggestions. Our methods of determining competency were developed in keeping with the ACEP policy statement on emergency EUS including “traditional testing, observation of bedside skills, and overreading of images by experienced sonologists.”2
For all sonographic studies, a SonoSite MicroMaxx EUS machine was used with 6- to 13-MHz and 5- to 10-MHz linear array transducers, and a 2- to 5-MHz curved array transducer (SonoSite Inc., Bothell, WA). Sonologists were encouraged through their training and feedback sessions to make informed choices regarding transducer selection and image optimization. However, the default settings were such that if the sonologist activated the linear array transducer at the commencement of the examination, the “superficial imaging” preset was automatically activated. If the general purpose 2- to 5-MHz curved array transducer was chosen, the “general abdominal” preset was activated.
Assessment of Technical Ability. An expert emergency sonologist (AJD) evaluated every EUS examination on the basis of four technical measures (Table 2), each rated as 1—unacceptable, 2—minimally acceptable, 3—adequate, or 4—optimal. This scale was selected by the expert sonologist as a reasonable and simple method for rating scans. Figures 1 through 3 contain sample images representative of technically optimal and unacceptable or minimally acceptable examinations as determined by the expert. Each EUS study received a composite technical score based on the mean of each of these measures. This composite score was based on which measures were applicable to the scan (e.g., if the imaging was of a cellulitis without evidence of a fluid collection, caliper settings would not be applicable).
1. Gain settings: optimized with anechoic structures appearing black
2. Depth settings: adjusted to include superficial muscle fascia and/or sufficient to demonstrate posterior acoustic enhancement
3. Caliper settings (if applicable): demonstration of three orthogonal dimensions measured in two planes
4. Systematic scanning through two orthogonal planes in real time
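The composite scoring described above (the mean of whichever technical measures apply to a given scan) can be sketched as follows; the measure names and ratings here are illustrative, not study data.

```python
def composite_score(ratings):
    """Mean of the technical measures applicable to one scan.

    `ratings` maps measure name -> score on the 1-4 scale, or None
    when a measure does not apply (e.g., calipers for a cellulitis
    with no fluid collection to measure).
    """
    applicable = [score for score in ratings.values() if score is not None]
    return sum(applicable) / len(applicable)

# Hypothetical cellulitis scan: calipers not applicable.
scan = {"gain": 4, "depth": 3, "calipers": None, "systematic_scanning": 3}
print(round(composite_score(scan), 2))  # -> 3.33, mean of the three applicable scores
```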
Interrater Reliability. The expert sonologist’s interpretation of the lesion in the images as being an abscess was compared to that of the study physician to determine interrater reliability. We grouped ratings into clinically relevant categories such that ratings of “definitely no abscess” or “probably no abscess” were considered “no abscess,” and ratings of “probable abscess” or “definite abscess” were considered “abscess.” If the physician was unsure of the diagnosis based on his or her examination, a rating of “uncertain” was designated.
We performed data analyses using Stata 10.0 (StataCorp LP, College Station, TX). The Wilcoxon matched-pairs signed-rank test was used to compare pre- and posttest results. We used linear regression with generalized estimating equations14 to determine the mean improvement in technical ability with each EUS performed. We specified an exchangeable correlation matrix and robust (Huber-White) standard errors.15,16 We tested the statistical significance of the regression coefficients using the Wald test. The internal consistency of our assessment of technical ability was assessed using Cronbach’s alpha. Cohen’s kappa statistic17 was used to analyze interrater reliability. The kappa statistic was interpreted as poor agreement (κ < 0.00), slight agreement (κ = 0.00 to 0.20), fair agreement (κ = 0.21 to 0.40), moderate agreement (κ = 0.41 to 0.60), substantial agreement (κ = 0.61 to 0.80), and near perfect agreement (κ = 0.81 to 1.00).18 To assess the robustness of our findings to within-patient correlation, we repeated the analysis randomly selecting one lesion per patient. Descriptive statistics, including point estimates with 95% confidence intervals (CIs), medians with interquartile ranges (IQRs), and means with standard deviations (SDs), were used where appropriate. Statistical significance was defined a priori as p ≤ 0.05.
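The agreement analysis described here (collapsing the five-point diagnostic scale into three categories, computing Cohen’s kappa, and reading it against the agreement bands) can be sketched as follows; the rater labels are hypothetical, not the study data.

```python
# Collapse the five-point diagnostic scale into the three clinically
# relevant categories used for the reliability analysis.
GROUPING = {
    "definitely no abscess": "no abscess",
    "probably no abscess": "no abscess",
    "uncertain": "uncertain",
    "probable abscess": "abscess",
    "definite abscess": "abscess",
}

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    categories = set(rater_a) | set(rater_b)
    chance = sum(
        (rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories
    )
    return (observed - chance) / (1 - chance)

def interpret(kappa):
    """Agreement bands as used in the paper (poor through near perfect)."""
    if kappa < 0:
        return "poor"
    label = "slight"
    for cutoff, name in [(0.21, "fair"), (0.41, "moderate"),
                         (0.61, "substantial"), (0.81, "near perfect")]:
        if kappa >= cutoff:
            label = name
    return label

# Hypothetical raw ratings from a study physician and the expert:
physician = ["definite abscess", "probable abscess",
             "definitely no abscess", "probably no abscess"]
expert = ["definite abscess", "probably no abscess",
          "definitely no abscess", "definitely no abscess"]
a = [GROUPING[r] for r in physician]
b = [GROUPING[r] for r in expert]
k = cohens_kappa(a, b)
print(round(k, 2), interpret(k))  # -> 0.5 moderate
```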
A total of seven physicians (four pediatric EM fellows, three pediatric EPs) underwent the training course. The median written test score improved from 54% (IQR = 50%–54%) on the pretest to 86% (IQR = 78%–92%) on the posttest (p = 0.03). For the proctored training period, five of the seven study physicians were certified after performing five examinations, and two required 10 proctored examinations. The remaining examinations analyzed were unproctored. We evaluated a total of 108 lesions in 95 patients during the study period. The mean (±SD) patient age was 7.6 (±6) years (range = 4 months–18 years). Twenty-eight percent of lesions were located on the leg, and 25% were on the buttock. One EUS examination was inadvertently not saved and was not available for expert review; therefore, 107 examinations were evaluated for technical ability and interrater reliability.
Assessment of Technical Ability
Cronbach’s alpha as a measure of the internal consistency of our assessment of technical ability was 0.7. The mean (±SD) composite score for the first scan was 3.3 (±0.14) and improved by 0.015 (95% CI = 0.0003 to 0.03) with each additional scan performed. Table 3 summarizes the expert ratings for each technical measure as well as the composite score. There was no difference between the mean composite score during the proctored period and the unproctored period (difference = −0.11, 95% CI = −0.36 to 0.14). The mean composite score for EUS performed when the child was deemed cooperative by the study physician was 3.50 and decreased to 3.01 for those performed on uncooperative children (difference = −0.49, 95% CI = −0.79 to −0.19).
| Technical Measure | Mean for First Scan (95% CI) | Average Improvement With Each Examination (95% CI) | p-value for Trend in Improvement |
|---|---|---|---|
| Systematic scanning (n = 105)* | 3.1 (2.7–3.5) | 0.023 (0.0005 to 0.05) | 0.05 |
| Depth (n = 107) | 3.6 (3.4–3.9) | 0.008 (−0.009 to 0.02) | 0.38 |
| Gain (n = 107) | 3.5 (3.3–3.8) | 0.005 (−0.01 to 0.02) | 0.57 |
| Caliper measurements (n = 94)† | 3.4 (2.9–3.8) | −0.020 (−0.06 to 0.02) | 0.38 |
| All examinations, composite score (n = 107) | 3.3 (3.0–3.5) | 0.015 (0.0003 to 0.03) | 0.05 |
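The Cronbach’s alpha used to gauge the internal consistency of the technical measures follows the standard formula; a minimal sketch is shown below, with hypothetical score columns rather than the study data.

```python
import statistics

def cronbach_alpha(items):
    """Cronbach's alpha for k equal-length item-score columns.

    `items` is a list of columns, one per technical measure, each
    holding that measure's scores across scans.
    """
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    item_variance = sum(statistics.pvariance(col) for col in items)
    total_variance = statistics.pvariance(totals)
    return (k / (k - 1)) * (1 - item_variance / total_variance)

# Two hypothetical measures scored across four scans:
print(round(cronbach_alpha([[3, 4, 3, 4], [3, 4, 4, 4]]), 2))  # -> 0.73
```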
The interrater reliability for the diagnosis of abscess, for examinations in which both the study sonologist and the expert rated a lesion as either probable or definite (abscess or no abscess), was 0.80 (95% CI = 0.63 to 0.97), with a percentage agreement of 95% (95% CI = 88% to 98%). The overall kappa when examinations rated as “uncertain” were included was 0.56 (95% CI = 0.36 to 0.73), with a percentage agreement of 84% (95% CI = 76% to 90%). One EUS examination was deemed “inadequate” and therefore uninterpretable by the expert sonologist. To avoid overestimating the reliability, we assumed that the study physician and expert were not in agreement for this lesion; therefore, the examination, rated by the study physician as an abscess, was assumed to be rated as “uncertain” by the expert. Both analyses for interobserver reliability were calculated based on the data shown in Tables 4A and 4B.
Tables 4A and 4B. Study physician interpretation versus expert sonologist interpretation (cross-tabulations).
There were five discordant examinations where the study sonologist and expert disagreed as to the presence or absence of an abscess (Table 4A). Two examinations involved the expert diagnosing the lesion as not being an abscess and the study sonologist diagnosing the lesion as an abscess. These were both “failed” proctored examinations in which the study sonologists incorrectly measured areas that were not true abscesses (normal soft tissue in one and a lymph node in the other). In the other three examinations, the study physicians did not identify a collection of fluid within an area of cellulitis that was deemed by the expert to be an abscess.
As study physicians performed more scans, interrater reliability remained stable, without clear improvement or worsening (p = 0.9). In addition, the interrater reliability during the proctored and unproctored periods was 0.74 and 0.84, respectively, further supporting the maintenance of skill.
When we randomly selected one lesion per patient, our results for technical ability as well as interrater reliability were not substantively different, suggesting that correlation among multiple lesions within a single patient did not materially affect our findings.
Several studies have demonstrated the utility of EUS for pediatric patients in the ED.19–24 Therefore, it is important that pediatric EPs be appropriately trained in its use. Ours is the first study to evaluate emergency EUS training for pediatric EPs. Performing EUS requires three types of competency. First, cognitive learning is required to understand the physics, principles, applications, and clinical significance of ultrasonography. This kind of knowledge is usually imparted through traditional media such as lectures and textbooks. Second, visual pattern recognition is required. This can also be attained through books and lectures, but is primarily repetitive and nonverbal. The third competency is psychomotor and requires ability in three-dimensional conceptualization as well as dexterity in manipulating the probe and applying cognitive and visual knowledge for image optimization. Our study sought to measure the global ability of novice pediatric EPs with an EUS application that requires mastery and integration of all three competencies.
Overall, physicians demonstrated good technical ability in their acquisition of EUS images. The four technical measures evaluated (Table 2) were sufficient to measure technical ability, as indicated by a Cronbach’s alpha of 0.7.25 While no one, to our knowledge, has studied the success of a brief didactic session and training period for pediatric soft tissue EUS, our findings regarding an educational intervention for a single-application EUS are not novel. In one study evaluating the ability of medical students to perform right upper quadrant EUS examinations after a 1.5-hour curriculum, the authors measured the quality of EUS images obtained on a 0–4 scale.26 They found a similar mean image quality score of 3.0–4.0 among those with minimal to no prior EUS experience, corroborating our finding that, for a limited application, novices can easily and effectively learn EUS. Several other studies have similarly demonstrated the success of novice sonologists in performing EUS for a focused indication.7–10,12,27–31 There was a modest improvement in technical ability in our study with each additional examination performed following training. Although this improvement was marginal, it is notable that any improvement occurred, given the potential for decay in ability with increasing time since an educational intervention.32–35 Our study analyzed examinations performed up to 14 months following the initial training, and similarly, multiple studies have demonstrated skill retention up to 18 months following a focused EUS training course.12,36
Our data suggest that EUS for SSTI in uncooperative children is more technically challenging. Therefore, physicians may want to consider methods to promote cooperation, such as distracting techniques, child life specialists, if available, and in some cases, pharmacologic anxiolysis.
We found that the interrater reliability was high and comparable to a study evaluating the reliability of surgeons compared to radiologists for the FAST examination following a training course.27 Our level of reliability was further maintained beyond the training period, throughout the duration of the study. To avoid overestimating the reliability, and to more closely simulate actual clinical conditions, we also assessed the reliability when we included examinations in which either the study physician or the expert was “uncertain.” Not surprisingly, this resulted in lower agreement. Physician uncertainty may occur for several reasons. Among novices, it can arise from any of the common reasons for sonographic misdiagnosis: poor gain, depth, or frequency adjustments; inexperience with common pitfalls (such as lymph nodes, soft tissue thickening, tendons, muscles, or other soft tissue masses); or the inherent uncertainty engendered by relative lack of experience. With increasing experience, sonologists tend to overcome these impediments; however, the natural history of SSTIs, a proportion of which evolve from cellulitis to abscess, may still lead to uncertainty, since often there are no clear-cut criteria separating the two. Other reasons for uncertainty may be purely technical impediments, such as a pediatric patient who is unable to cooperate with the examination, or an infection located in an inaccessible area such as the gluteal cleft or on an irregular bony area such as the hand that resists easy sonographic scanning. It is also possible that a clinician performing the EUS may be swayed into a sonographic diagnosis or “certainty” by the clinical findings, in contrast to a reviewer, whose judgment is based solely on the images.
Of note, this study did not seek to determine the accuracy of EUS or of the physicians studied in the diagnosis of SSTIs. As we wanted to assess the training curriculum, we chose to evaluate agreement between the study physician and the expert. It is possible that the study physicians were accurate at times when they disagreed with the expert, or that both physicians agreed, but were inaccurate in their diagnosis.
Currently, there is no standardized training program for EPs (pediatric or otherwise) to learn soft tissue emergency EUS. A study by Squire et al.,6 in which EPs evaluated the accuracy of EUS for diagnosing soft tissue infections, used a 30-minute didactic and hands-on training session for EM residents and faculty with experience in emergency EUS. Studies evaluating the training of EPs and surgeons in the FAST examination typically involve an 8- to 10-hour training course that includes didactic and hands-on sessions, followed by 15–25 proctored examinations.7,12,37,38 However, the FAST examination has more components, with more technically challenging sonographic windows, and a wider variety of potential pathological findings than the soft tissue EUS examination. Therefore, it seems likely that a simple focused examination such as EUS for SSTI would require a more limited training curriculum, such as the one used in our study. In fact, since only 4.25 hours of the training session were dedicated to soft-tissue training in our curriculum, it may be reasonable to assume that for those who already possess basic knowledge and experience with EUS, a shortened training day may be sufficient to learn soft tissue EUS and other examinations as new applications.
A wide range of physicians who do not currently have EUS experience, including internists, family physicians, and pediatricians, all of whom are managing increasing numbers of soft tissue infections, potentially could use EUS in their practices. Our study demonstrates that EUS for this indication is a skill that is easily learned. Furthermore, we believe that by enabling novice sonologists to attain a level of 95% agreement with an expert sonologist, this curriculum provides students with the training needed to become competent in soft tissue EUS.
Our technical measurements were on a four-point scale that has been used in the past,11,26,28 but this may represent too narrow a scale to detect finer variation in technical ability. Furthermore, since the criteria for grading the images on each of the four technical measures were not explicitly defined, it is possible that the expert applied them inconsistently. It is also possible that the modest improvement in technical ability reflects a ceiling effect, as the mean composite score was already high (>3 of 4) immediately following the training course, leaving little room for improvement. In addition, for our analysis we grouped the five possible ratings of presence or absence of abscess available to both the study and expert sonologists into three clinically relevant categories (“no abscess,” “abscess,” and “uncertain”). This grouping may have lost some subtle distinctions indicated by the practitioners; however, it captures the three key diagnostic categories that underpin subsequent management decisions.
Our expert did not have access to clinical information, which could affect interpretation. Had our expert sonologist had the same opportunity to perform a physical examination as the study sonologists, our overall measure of interrater reliability may have been higher. In addition, the expert’s interpretation was based entirely on the examinations recorded by the study physician and not on an independent EUS performed by the expert. If the depth was not set correctly, or if the study physician did not scan in the appropriate location, for example, this would have biased our results toward agreement in cases when the study physician felt there was no abscess. However, the soft tissue EUS is performed on a limited area of clinical interest and we required video clips in two planes, so as to minimize this bias. Despite the fact that the expert was blinded to the study physician’s interpretation, complete blinding was not possible, as the expert was reviewing still images with caliper settings in the case of an abscess identified by the study physician and images without calipers in the case of no abscess identified by the study physician. Although this would bias our results toward higher agreement, there were several cases when the expert disagreed with the presence of an abscess as defined by calipers, and conversely, our expert identified areas of fluid collection on video clips that were unidentified by the study physician.
During the course of the study, on two occasions, study physicians reviewed selected video clips and images and received feedback from the expert sonologist. It is possible that examinations performed soon after each of these sessions were technically better and more reliable than those performed farther out. We also included both proctored and unproctored examinations in our analysis, raising the possibility that the proctored examinations were unduly influenced by the experience of the proctor. As noted under Methods, we made every attempt to avoid this by instructing proctors to observe passively rather than instruct during proctored examinations, and students were required to complete their data sheets prior to receiving feedback from the proctor. However, it is possible that in some cases proctors became actively involved in proctored examinations. Even so, there were no significant differences in the mean composite score or the interrater reliability between the proctored and unproctored periods, suggesting that students, after completing their proctored period, performed at least as well as they had under the influence of their proctors.
We chose to study a small number of physicians in our study, as this was our first experience with EUS. We therefore wanted to maintain quality control and felt that a smaller group of study physicians could be more tightly regulated. Had we evaluated a larger number of study physicians, our results may have been different. In addition, the small number of physicians did not allow for the evaluation of physician-level factors, such as time since residency training or level of clinical experience, ability to learn three-dimensional concepts, openness to learning new skills, and aptitude in hand–eye coordination, which might affect technical ability and reliability. As our study involved seven pediatric EPs who practice in a tertiary care setting, our results may not be generalizable to other settings.
After a 1-day didactic and practical training course, followed by a short period of supervision, novice pediatric EPs can develop technical proficiency in performing soft tissue emergency EUS and demonstrate excellent agreement in their interpretation of soft tissue emergency EUS findings when compared to an expert emergency sonologist.
The authors thank Jeremy Kahn, MD, MSc, for his statistical advice, and Martin Pusic, MD, and Frances Nadel, MD, MSCE, for their thoughtful review of the manuscript.
- 1 Accreditation Council for Graduate Medical Education. ACGME Emergency Medicine Guidelines. Available at: http://www.acgme.org/acWebsite/RRC_110/110_guidelines.asp#plan. Accessed Nov 9, 2010.
- 2 American College of Emergency Physicians. ACEP emergency ultrasound policy statement, 2008. Available at: http://www.acep.org/WorkArea/DownloadAsset.aspx?id=32878. Accessed Nov 9, 2010.
- 15 Huber PJ. The behavior of maximum likelihood estimates under nonstandard conditions. In: Proceedings of the Fifth Berkeley Symposium on Mathematical Statistics and Probability. Berkeley, CA: University of California Press, 1967, pp 221–33.
- 25 Nunnally JC, Bernstein IH. Psychometric Theory, 3rd ed. New York, NY: McGraw-Hill, 1994.
- 36 Assessment of knowledge retention and the value of proctored ultrasound exams after the introduction of an emergency ultrasound curriculum. BMC Med Educ. 2007; 7:e40.