ACADEMIC EMERGENCY MEDICINE 2011; 18:1177–1185 © 2011 by the Society for Academic Emergency Medicine

Abstract

Objectives:  The objective of this study was to assess the association between the performance of practicing paramedics on a validated cognitive exam and their field performance, assessed on a simulated emergency medical services (EMS) response.

Methods:  This was an observational study of paramedics from a single-tiered, urban, advanced life support EMS agency. A high-fidelity simulated response to a medical emergency on environmentally realistic sound stages, and the cognitive portion of the national paramedic certification exam, were each assessed as pass or fail. Participants were randomly assigned to one of six simulations designed by the agency’s educational staff, medical director, and representatives from the National Registry of EMTs to be equivalently difficult. Simulations were pilot tested to assess content and face validity. Each participant was classified as failing a simulation scenario if his or her score was more than one standard deviation (SD) below the population mean.

Results:  There were 107 paramedics who participated in the study. Participants reported a median of 7.7 years of service (interquartile range [IQR] = 4.1 to 12.8 years). Simulation scores were normally distributed. Ninety-two (86.0%) participants received a passing score for the simulation and 77 (72.0%) passed the cognitive exam. There were 70 (65.4%) individuals who passed both the simulation and the cognitive exam, eight (7.5%) who failed both the simulation and the cognitive exam, 22 (20.6%) who passed the simulation but failed the cognitive exam, and seven (6.5%) who failed the simulation but passed the cognitive exam. There was a significant association between passing the cognitive exam and passing the simulation (chi-square p-value = 0.02).

Conclusions:  This study simultaneously assessed cognitive knowledge and simulated field performance. Utilization of these measurement techniques allowed for the assessment and comparison of field performance and cognitive knowledge. Results demonstrated an association between a practicing paramedic’s performance on a cognitive examination and field performance, assessed by a simulated EMS response.

Each year emergency medical services (EMS) personnel transport millions of patients to hospitals throughout the United States.1 EMS professionals perform a multitude of urgent and clinically invasive interventions while caring for a diverse patient population.2,3 It is expected that competent, safe, and effective care be delivered by these professionals 24 hours a day, 7 days a week, throughout their career. Therefore, understanding the dynamics of continued competency among EMS professionals is an important undertaking. However, the assessment of continued competency within the field of medicine is a difficult task.4

The concept of competency includes many domains, including psychomotor, cognitive, and affective. Research regarding educational factors and personal characteristics that affect entry level competency within EMS has been conducted.5–9 However, there is very limited research assessing the continued competency of EMS professionals. A majority of the continued competency literature focuses on skill degradation, knowledge retention, and educational interventions.10–14 Research that assesses the cognitive domain of continued competence within EMS is scant.15 Most importantly, there are no known studies that assess the complex interplay between cognitive competency and paramedic field performance.

Defining the association between cognitive competency and performance extends into many professions, but is especially important in medicine where incompetent care may adversely affect patient outcome. The assurance of continued competency is particularly relevant to EMS professionals due to the autonomous nature of their work. While every EMS professional must have medical oversight from a physician, there is typically little oversight on the scene of an emergency. Unfortunately, universal consensus does not exist regarding the definition of a well-rounded practitioner, an individual who is adroit with respect to rapid diagnosis and clinical care. Within the EMS community, the question exists of whether cognitive competency, demonstrated through standard testing methods, is correlated with paramedic field performance. The assessment of continued competence through the observation of field performance is impractical due to the extraordinary time commitment and resource utilization that would be needed to observe individuals in enough settings to render an adequate judgment. However, simulation provides for reliable and repeatable methods for measuring competency and may serve as a surrogate measure for field performance.16 Recently, high-fidelity simulation using scenario-based cases has been introduced into the medical field to assess critical thinking skills in unison with performance.17 Comparing the results of an individual’s performance on a simulated patient encounter with his or her level of cognitive competence may further elucidate the concept of professional competence within EMS. The objective of this study was to determine if there was an association between a practicing paramedic’s performance on a validated cognitive examination and his or her field performance as assessed on a simulated EMS response.

Methods


Study Design

This was an observational educational study of currently certified paramedics working for the Mecklenburg EMS Agency in Charlotte, North Carolina, and was conducted between January 25, 2010, and March 2, 2010. This study received institutional review board approval from Carolinas Medical Center. All participants completed an informed consent form and participation was entirely voluntary, with no penalty for those who chose not to participate.

Study Setting and Population

The EMS agency serves a resident population of over 800,000 and a working population of approximately 1 million people in and around the city of Charlotte, North Carolina. This requires a peak deployment of 45 ambulances and results in an average yearly volume of over 94,000 calls, resulting in approximately 70,000 transports per year. All ambulances are advanced life support capable and staffed with at least one paramedic. At the time of the study, there were approximately 350 certified providers employed by the agency.

All personnel at the agency associated with EMS operations must maintain certification through the North Carolina Office of EMS (NCOEMS). Currently, to obtain certification, candidates must complete an NCOEMS-approved educational program and complete the state certification exam. The North Carolina exam uses a written multiple-choice test and a series of practical exercises to demonstrate skill proficiency. EMS professionals practicing in North Carolina are not required to pass the national certification examination to demonstrate entry-level competence.

To maintain their certification, all personnel must comply with current NCOEMS recertification guidelines. Currently the state of North Carolina requires a minimum of 24 hours of continuing education per year, with recertification every 4 years. Continuing education is provided to all credentialed employees through the agency’s Emergency Medical Education and Simulation Center. To fulfill continuing education requirements, employees receive monthly didactic education. In addition to lectures, scenario-based simulations have been used as a part of routine continuing education since 2007.

Simulation is performed using state-of-the-art, high-fidelity human patient simulators (METI ECS, Sarasota, FL) on environmentally realistic sound stages. This allows crews to perform as if they were on the scene of a real EMS response. Evaluators observe performances via live audio and video feed from a separate room. Paramedics are instructed to converse with the high-fidelity simulator to elicit any pertinent historical information as an actor’s voice is transmitted through the mouth of the simulator. Other historical information, such as current medications, is placed in the patient room for the paramedics to find as directed by the patient. The simulators used in this study are also capable of blinking their eyes and dilating or constricting their pupils; they have palpable pulses and audible breath sounds and can simulate seizure activity by shaking.

At the time of its inception, the Emergency Medical Education and Simulation Center was the first of its kind to use this method of simulation in the prehospital setting. Following each simulation, a debriefing session is conducted. These debriefings are facilitated by a member of the training staff who has received instruction in simulation debriefing techniques. Digital video recordings of the simulations are reviewed to illustrate both positive and negative aspects of performances and to facilitate learning.

Study Protocol

This study coincided with a scheduled, biannual continuing education cycle involving simulation. Approximately 2 months before starting the simulation cycle, all agency paramedics were informed about an upcoming research project investigating cognitive knowledge and performance on a simulated EMS response. Paramedics were asked to indicate their interest in participating when they scheduled a time for their mandatory simulation. A stipend of $100 was offered as an incentive to participate in the study. To be eligible to participate, individuals had to possess North Carolina state paramedic certification, be employed as a paramedic within Mecklenburg County, North Carolina, and agree to direct and lead all patient care activities during the simulation.

During simulation, participants performed a simulated EMS response in crews of two or three. Prior to starting the simulation, paramedics who did not previously indicate an interest in the project were again asked if they would like to participate. Paramedics willing to participate were reminded that they must be the lead paramedic during the simulation. Paramedics not participating in the research project were asked to perform as a competent partner during the simulation. Partners were instructed to avoid the temptation to take over the lead if they witnessed a poor performance.

Following the simulation, participants completed the cognitive portion of the national paramedic certification exam. Participants also provided basic demographic information on a standardized questionnaire. Study participation concluded once an individual completed the cognitive examination.

Measurements

This study used two measurement tools to assess cognitive competency and field performance: the cognitive portion of the national paramedic certification exam distributed by the National Registry of EMTs (NREMT), and a simulated EMS response. All crews participating in the scheduled simulation cycle, regardless of research participation, were randomly assigned to one of six simulations. To control for potential unrecognized bias, block randomization was used to assign paramedics to simulation scenarios. There were 126 potential time periods in which a paramedic could register for and complete a simulation; these were divided into 21 equal blocks of six, with the order of the six scenarios randomized within each block. Paramedics were then assigned to the next available scenario on the schedule based on the time period they requested when registering. The randomization scheme was not revealed to the individuals scheduling paramedics for their simulation. Randomization was revealed to the simulation operators 1 day prior to the next round of simulations, to allow for appropriate set-up.
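The block randomization scheme described above can be sketched as follows. This is a minimal illustration, not the agency's actual scheduling software; the scenario names are taken from Table 1.

```python
import random

# The six scenarios from Table 1.
SCENARIOS = [
    "Abdominal pain/MI", "Diabetic seizure", "Hyperkalemia",
    "Pediatric asthma", "Postpartum seizure", "Sepsis",
]

def build_schedule(n_blocks=21, seed=None):
    """Assign one scenario to each of n_blocks * 6 time periods.

    Within every block of six consecutive periods, each scenario appears
    exactly once, in random order, so scenario frequency stays balanced
    across the simulation cycle.
    """
    rng = random.Random(seed)
    schedule = []
    for _ in range(n_blocks):
        block = SCENARIOS[:]
        rng.shuffle(block)          # randomize scenario order within the block
        schedule.extend(block)
    return schedule

# 21 blocks of six yield the 126 assignable time periods used in the study.
schedule = build_schedule(seed=1)
```

A paramedic then simply receives whichever scenario sits at the time period he or she requested, which keeps the assignment concealed from the schedulers.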

Simulations were designed during a 2-day preproject planning conference by the study investigators and a committee consisting of the agency’s educational staff, medical directors, and representatives from the NREMT. Scenarios were constructed to be equivalently difficult and to discriminate among differing skill levels. Table 1 provides a brief description of each scenario as well as the possible point totals.

Table 1.  Description of Scenarios Presented During Simulation

Scenario: Abdominal pain/MI (48 possible points)
  Patient description: 42-year-old patient involved in minor MVC 2 hours prior to call. Aching abdominal pain, history of cardiac risk factors.
  Expected treatment: Rapid assessment, rule out trauma to abdomen, rule in high potential for MI based on PVCs and LBBB on 12-lead ECG.

Scenario: Diabetic seizure (51 possible points)
  Patient description: 19-year-old male patient with seizure history. No diabetes history. Called for weakness following seizure. Blood glucose level is low for him. Patient accidentally took diabetic medication.
  Expected treatment: Rapid assessment, recognition of incorrect medication, treatment of hypoglycemia.

Scenario: Hyperkalemia (50 possible points)
  Patient description: 64-year-old male patient, extensive history (including end-stage renal failure/dialysis), profound weakness.
  Expected treatment: Rapid assessment, recognition of ECG changes and presentation consistent with hyperkalemia, treatment of hyperkalemia.

Scenario: Pediatric asthma (48 possible points)
  Patient description: 8-year-old male patient, asthma attack at school while playing football. Following treatment of asthma, patient develops SVT.
  Expected treatment: Rapid assessment, recognition and treatment of asthma. Differentiation of SVT vs. anxiety/albuterol effect. Treatment of SVT.

Scenario: Postpartum seizure (54 possible points)
  Patient description: 28-year-old female patient, 1 week postpartum with severe headache and elevated blood pressure; seizure develops while on scene.
  Expected treatment: Rapid assessment, recognition of preeclampsia. Treatment of eclampsia during or after seizure.

Scenario: Sepsis (52 possible points)
  Patient description: Elderly patient with extensive history, weak, fever, indwelling Foley catheter.
  Expected treatment: Rapid assessment; correctly identify sepsis as diagnosis; appropriately treat patient based on presentation.

ECG = electrocardiogram; LBBB = left bundle branch block; MI = myocardial infarction; MVC = motor vehicle crash; PVC = premature ventricular contraction; SVT = supraventricular tachycardia.

Scoring of scenarios was conducted by two agency education and quality specialists who were trained at the paramedic level and have received additional training regarding conducting and evaluating medical simulation. To facilitate scoring of scenarios, checklists were created during the simulation development process (see Data Supplement S1, available as supporting information in the online version of this paper). Discrete tasks were awarded single points while tasks requiring multiple steps were awarded points for successful completion of each step. However, if an individual incorrectly completed a critical step, further points for that multistep task were not awarded. There were no preidentified critical failures that would have led to scenario failure regardless of the final score.
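The scoring rule described above (single points for discrete tasks, per-step points for multistep tasks, and forfeiture of a multistep task's remaining points after a failed critical step) could be implemented along these lines. The task structure and task names here are hypothetical illustrations, not the study's actual checklists.

```python
def score_checklist(tasks, completed):
    """Score one simulation checklist.

    tasks: a list where a discrete task is {"name": ..., "points": 1} and a
    multistep task is {"name": ..., "steps": [{"name", "points", "critical"}]}.
    completed: set of task/step names the lead paramedic performed correctly.
    """
    total = 0
    for task in tasks:
        if "steps" not in task:                 # discrete task: single point
            if task["name"] in completed:
                total += task["points"]
            continue
        for step in task["steps"]:              # multistep task: per-step points
            if step["name"] in completed:
                total += step["points"]
            elif step.get("critical"):
                break   # critical step failed: no further points for this task
    return total

# Hypothetical example: a discrete vital-signs check plus a multistep 12-lead task.
tasks = [
    {"name": "obtain vital signs", "points": 1},
    {"name": "12-lead ECG", "steps": [
        {"name": "attach leads", "points": 1, "critical": True},
        {"name": "interpret tracing", "points": 2, "critical": False},
    ]},
]
```

With `completed = {"obtain vital signs", "interpret tracing"}`, the failed critical step "attach leads" forfeits the interpretation points as well, so the score is 1.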

All simulations were pilot-tested by two separate crews 1 week prior to starting implementation. No material changes to the scenarios were made based on the results of those pilot tests, and the scoring of the simulations appeared to have face validity. The crews performing the pilot testing were excluded from study participation.

Participants’ cognitive knowledge was assessed using the cognitive portion of the national paramedic certification examination. The NREMT cognitive examinations have been used in past research to assess the continued cognitive competency of EMTs.15 Currently, nationally registered EMS professionals can also recertify by taking this adaptive exam in lieu of documented hours of continuing education.18 At the time of this study, the timed, computer-adaptive cognitive examination consisted of 80 to 150 scored items from six subsections (airway management, cardiology, medical, operations, obstetrics/pediatrics, and trauma). The adaptive item-delivery algorithm matches the difficulty level of each item to the ability level of the participant. The examination is scored using item response theory, which produces a measure of cognitive ability called a theta score.

The primary outcome variable for the cognitive examination was whether or not the participant received a passing score. At the time of this study, individuals had to earn a theta score of at least 1.18 to pass the national paramedic certification exam. A minimum passing score allows the assumption that the individual possessed, at the time of the examination, the minimum core knowledge expected of an entry-level paramedic. Performance on the cognitive examination was then compared to performance on the simulated EMS response. Performance on the simulated EMS response was reported as the percentage of the total possible points achieved. A consensus review using developed checklists was conducted by two agency education and quality specialists during and immediately after the simulation. Discrepancies between the two reviewers were arbitrated by the principal investigator using tape review if necessary. Participants were deemed to have failed the simulation if their score was more than one standard deviation (SD) below the mean simulation score for the study population. Individuals were further classified as passing both assessments, failing both assessments, or passing one assessment and failing the other.
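The cross-classification just described can be sketched as follows. The scores in the test below are hypothetical; the 1.18 theta cutoff is the passing standard stated above.

```python
from statistics import mean, stdev

def classify(sim_scores, theta_scores, theta_pass=1.18):
    """Cross-classify participants on the two assessments.

    A simulation score more than one SD below the population mean is a
    failure; a theta score of at least theta_pass passes the cognitive exam.
    """
    cutoff = mean(sim_scores) - stdev(sim_scores)   # one sample SD below the mean
    counts = {"passed_both": 0, "failed_both": 0,
              "sim_only": 0, "cognitive_only": 0}
    for sim, theta in zip(sim_scores, theta_scores):
        sim_pass = sim >= cutoff
        cog_pass = theta >= theta_pass
        if sim_pass and cog_pass:
            counts["passed_both"] += 1
        elif not sim_pass and not cog_pass:
            counts["failed_both"] += 1
        elif sim_pass:
            counts["sim_only"] += 1
        else:
            counts["cognitive_only"] += 1
    return counts
```

Note that `stdev` is the sample standard deviation, which matches the population-based cutoff described in the text.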

Other independent variables collected from each participant included race, sex, years of service at the agency (analyzed as quartiles), initial paramedic training institution (Mecklenburg EMS or other), certification as an EMS instructor (yes or no), and job description, categorized as paramedic, crew chief, field training officer (FTO), or supervisor.

Data Analysis

Study population characteristics were analyzed using descriptive statistics, including frequencies, means, 95% confidence intervals (CIs), and SDs, as well as medians and interquartile ranges (IQRs). Simulation and cognitive performance were analyzed as both dichotomous and continuous variables. Inferential statistics were calculated on these data for exploratory purposes. Associations between simulation performance, cognitive performance, and population characteristics were assessed using chi-square or Fisher’s exact tests for categorical data, as appropriate, and unpaired t-tests or Mann-Whitney tests for continuous data, depending on normality. Finally, simulation and cognitive exam performance were compared based on whether individuals passed or failed either assessment, followed by a correlational analysis of the continuous measures for each assessment technique. All data analysis was conducted using Stata v10.1 (StataCorp, College Station, TX).
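For a 2×2 table such as pass/fail on each assessment, the chi-square test reduces to a closed form. The sketch below uses the standard Pearson formula (not the authors' Stata code) with the counts later reported in the Results, and it reproduces the reported p-value of approximately 0.02.

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic (1 df, no continuity correction)
    for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

def p_value_1df(stat):
    """Upper-tail p-value for a chi-square statistic with 1 df:
    P(X > x) = erfc(sqrt(x / 2))."""
    return math.erfc(math.sqrt(stat / 2))

# Counts from the Results: 70 passed both, 7 passed cognitive exam only,
# 22 passed simulation only, 8 failed both.
stat = chi2_2x2(70, 7, 22, 8)
p = p_value_1df(stat)   # approximately 0.02, matching the reported p-value
```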

Results


During the study period there were 142 paramedics employed at the Mecklenburg EMS Agency, with 113 (79.5%) agreeing to participate. Six individuals were excluded from analysis due to missing or incomplete data, and a complete case analysis was performed on 107 paramedics (75.4% of the system’s population of paramedics). The mean (±SD) age of the study population was 35.6 (±7.4) years. The majority of participants were male (77.6%) and white (87.9%). Participants reported a median of 7.7 years of service (IQR = 4.1 to 12.8 years). Years of service were categorized into quartiles and are displayed in Table 2 along with other population characteristics.

Table 2.  Characteristics of the Study Population

Variable                             n       %
Study population                   107     100
Sex
  Male                              83    77.6
  Female                            24    22.4
Race
  White                             94    87.9
  Minority                          13    12.1
Years of service
  Less than 4                       26    24.3
  4 to 7.5                          27    25.2
  >7.5 to 13                        30    28.1
  More than 13                      24    22.4
Main job description               105    98.1
  Field paramedic                   29    27.6
  Crew chief                        46    43.8
  FTO                               20    19.1
  Supervisor                        10     9.5
EMS instructor certification
  No                                99    92.5
  Yes                                8     7.5
Initial EMS training institution
  Mecklenburg EMS                   31    29.0
  Other                             76    71.0

FTO = field training officer.

There were 77 (72.0%) individuals who received a passing score on the cognitive examination. The mean (±SD) theta score for all participants was 1.57 (±0.59). Figure 1 displays the mean theta scores and 95% CIs by collected population characteristics. The mean simulation performance score for all participants over all scenarios was 62.7% (95% CI = 60.7% to 64.6%; SD ±0.10). Figure 2 displays the overall distribution of simulation scores. Figure 3 displays the mean simulation scores and 95% CIs by scenario. The only statistically significant difference noted when comparing the mean simulation scores was for the comparison of the pediatric asthma scenario and the sepsis scenario (p < 0.01). Overall, 92 (86.0%) participants received a score above the predetermined passing standard on the simulated EMS response. Figure 4 displays the mean simulation scores and 95% CIs by collected population characteristics.

Figure 1.  Mean national paramedic certification exam theta scores and 95% CIs by collected population characteristics. FTO = field training officer.

Figure 2.  Overall frequency of simulation scores.

Figure 3.  Mean simulation scores and 95% CIs by scenario. *Difference in mean scores statistically significant (p < 0.01). MI = myocardial infarction.

Figure 4.  Mean simulation performance scores and 95% CIs by collected population characteristics. FTO = field training officer.

Comparison of Simulation and Cognitive Exam Performance

There were 70 (65.4%) individuals who passed both the simulation and the cognitive exam, eight (7.5%) who failed both, 22 (20.6%) who passed the simulation but failed the cognitive exam, and seven (6.5%) who failed the simulation but passed the cognitive exam. When comparing cognitive exam results to simulation performance, there was a significant association between passing the cognitive exam and passing the simulation (p = 0.02). Table 3 displays performance on both the simulation and the NREMT exam by collected population characteristics. Age was also assessed with respect to performance on both assessments, and no significant association was found.

Table 3.  Performance on Both the Simulation and the NREMT Exam by Collected Population Characteristics

Population Characteristic     Passed Both,     Failed Both,    Passed Cognitive      Passed Simulation
                              n = 70 (65.4%)   n = 8 (7.5%)    Only, n = 7 (6.5%)    Only, n = 22 (20.6%)
Sex*
  Female                      15 (62.5)        1 (4.2)         1 (4.2)               7 (29.1)
  Male                        55 (66.3)        7 (8.4)         6 (7.2)               15 (18.1)
Race
  Minority                     5 (38.4)        2 (15.4)        3 (23.1)              3 (23.1)
  White                       65 (69.1)        6 (6.4)         4 (4.3)               19 (20.2)
Years of experience
  <4                          15 (57.7)        2 (7.7)         4 (15.4)              5 (19.2)
  4 to 7.5                    22 (81.5)        0 (0)           0 (0)                 5 (18.5)
  >7.5 to 13                  19 (63.4)        3 (10.0)        1 (3.3)               7 (23.3)
  >13                         14 (58.4)        3 (12.5)        2 (8.3)               5 (20.8)
Main job description
  Paramedic                   19 (65.5)        1 (3.5)         3 (10.3)              6 (20.7)
  Crew chief                  30 (65.2)        4 (8.7)         3 (6.5)               9 (19.6)
  FTO                         14 (70.0)        1 (5.0)         0 (0)                 5 (25.0)
  Supervisor                   6 (60.0)        2 (20.0)        1 (10.0)              1 (10.0)
EMS instructor
  No                          64 (64.6)        7 (7.1)         7 (7.1)               21 (21.2)
  Yes                          6 (75.0)        1 (12.5)        0 (0)                 1 (12.5)
Initial paramedic training program
  Mecklenburg EMS             20 (64.5)        0 (0)           3 (9.7)               8 (25.8)
  Other                       50 (65.8)        8 (10.5)        4 (5.3)               14 (18.4)

Results are reported as n (%). FTO = field training officer; NREMT = National Registry of Emergency Medical Technicians.
*Fisher’s exact p ≤ 0.05.

When comparing the continuous measures of the cognitive exam and simulation scores, the correlation fell just short of significance at the standard unadjusted Type I error rate (Spearman’s rho = 0.18, p = 0.053). However, assessments of correlation are greatly influenced by outliers.19 There were three outliers: two individuals who performed well on the cognitive exam but poorly on the simulation, and one who performed well on the simulation but poorly on the cognitive exam (Figure 5). When these outliers were removed, a significant correlation was noted between the overall cognitive exam score and the simulation score (Spearman’s rho = 0.21, p = 0.03). It is important to note, however, that the Spearman’s correlations with outliers (0.18) and excluding outliers (0.21) were close in magnitude.
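The sensitivity of Spearman's rho to a handful of outliers can be seen in a small sketch. The data below are hypothetical, chosen only to show the effect, and the classic d² formula used here assumes no tied values.

```python
def spearman_rho(xs, ys):
    """Spearman rank correlation via the classic d^2 formula:
    rho = 1 - 6 * sum(d^2) / (n * (n^2 - 1)).
    Assumes no tied values, which suffices for this illustration."""
    n = len(xs)

    def ranks(values):
        order = sorted(range(n), key=lambda i: values[i])
        r = [0] * n
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r

    rx, ry = ranks(xs), ranks(ys)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

# Five perfectly concordant pairs give rho = 1.0 ...
base_x, base_y = [1, 2, 3, 4, 5], [10, 20, 30, 40, 50]
# ... but a single discordant outlier drags rho down to 1/7.
out_x, out_y = base_x + [6], base_y + [0]
```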

Figure 5.  Correlation between cognitive exam scores and simulation scores with highlighted outliers. Correlation analysis performed using Spearman’s rho.

Discussion


To the best of our knowledge, this is the first study within the EMS profession to simultaneously assess cognitive competency and simulated field performance using high-fidelity simulation and a validated cognitive exam. Results from this study indicate that there is an association between a paramedic’s performance on a simulated EMS response and his or her performance on a cognitive exam. Individuals who performed well on the cognitive exam were significantly more likely to perform well on the simulation assessment. These results lend credence to the argument that cognitive knowledge translates to field performance and vice versa. However, it is possible that these results simply indicate that individuals who are “good test takers” perform equally well on written and simulated examinations.

Simulation has become a standard in the assessment of entry level competency in medical education.20 It has also been shown to be useful in teaching new skills to experienced providers or for refresher training on skills that are not often used. The simulations used in this study asked paramedics to perform as they would in the field and assessed each individual’s ability to reason through a problem while being provided real-time information. When using simulation in this respect, the observation of patient assessment and treatment may allow for the assessment of both paramedic field performance and cognitive competency. This may partially explain the association between cognitive performance and simulated field performance.

While simulation may be an effective means for assessing two domains of competency simultaneously, it may not yet be efficient, especially in EMS. While high-fidelity simulation centers have been constructed in a handful of EMS systems, they are not currently available to the majority of EMS systems and training institutions in the country.

While the use of simulation for assessing competency may provide many benefits, it remains a much more complex task than administering a validated cognitive examination. Although the actual process of creating a valid cognitive exam is expensive and time-consuming, it is a process that does not have to be undertaken by individual agencies or even individual states. As an example, this study used the paramedic cognitive exam developed by the NREMT. While the NREMT expends significant resources to develop this exam, individuals or agencies spend a fraction of that cost to take the exam and obtain results regarding cognitive competence. Cognitive testing does have an associated cost, but that financial burden can be spread over a greater number of users and is far smaller than that of building a simulation infrastructure. The results of this study suggest that success on a valid and reliable EMS certification exam correlates with a passing score on a single simulated patient encounter. Further study is required to determine how well a certification exam predicts simulated patient care, and how well simulated care predicts actual, competent field performance.

Individually, assessments of competency using simulation or cognitive testing have limitations that make it difficult to obtain or interpret an answer to the complex continued competency question. However, the weaknesses of each design may be overcome by using a multiple assessment technique.16 Multiple assessments may also be beneficial in increasing the confidence that an individual’s performance is indicative of competence.16 In this study there was an association between performance on the simulation and the cognitive exam, indicating that those who do well on one should do well on the other. With multiple assessments, we have the benefit of identifying individuals who did well on both assessments or did poorly on both. This information may increase the confidence in educators that those who did poorly on both assessments truly need remediation, while those who did well may benefit from more challenging work.

Investigating the continued competence of EMS professionals, and paramedics in particular, is a relatively new field with limited research results. Assessing the continued competence of these practicing health care providers is necessary for ensuring the safety of the public. Determining the most appropriate and efficient means to measure continued competency will require a multitude of differing study designs and an investment in this type of research by the EMS community. This study highlights the ability of two different types of measurement, simulation and cognitive examination, to provide insight into the continued competency of EMS professionals. Future research should focus on assessing more diverse and larger groups, replicating simulation methods in other EMS systems, and further assessing the utility of using the NREMT paramedic certification examination as a valid assessment for continued cognitive competency.

Limitations


This study has several limitations that may threaten the generalizability of these results to other EMS systems. Paramedics included in this analysis worked for an EMS agency that has used a simulation center for 4 years as part of its annual continuing education. These paramedics may have performed better on the simulated EMS response than equally trained paramedics who have never had the experience of training in a simulator. Conversely, the cognitive examination was administered in a low-stakes testing environment, which may have led to underperformance on this evaluation tool.

While the simulated calls were all developed to be equivalently difficult, the nature of the scenarios did not include treatment of low-frequency, high-acuity patients such as cardiac arrest or ST-segment elevation myocardial infarction care. So while individuals were assessed equally across six different scenarios, the scenarios were not the most difficult cases possible.

Measurement error during the simulation assessment may be another source of error. This study analyzed the performance of the lead paramedic self-identified prior to beginning the simulation. While every effort was made to assess only the performance of the lead paramedic, simulation was conducted in realistic crews of two or three. The educators scoring the simulation made judgments regarding whether critical components of each scenario were accomplished by the lead paramedic or if they were accomplished only because a secondary crew member completed the task without receiving direction from the lead paramedic. Secondary crew members were asked to perform as they would in a subordinate role in the field, and lead paramedics were not penalized if they delegated critical tasks to other crew members. While it may be ideal to assess paramedics individually, the reality is that EMS requires teamwork and interpersonal interactions. Therefore, while some measurement error may have occurred, assessing individuals as part of a team is more realistic.

Another source of measurement error may have arisen from correlating a simulation-based test of a single pathophysiologic content area with a comprehensive cognitive examination. This correlation may have been affected by participants who had adequate general knowledge but a deficit in the specific disease process tested in the simulation, or who had a working knowledge of patient care protocols but inadequate general knowledge.

Conclusions

This study of paramedics from an urban EMS agency indicated that there was an association between a practicing paramedic’s performance on a cognitive examination and field performance assessed by a simulated EMS response. Such assessments using currently practicing paramedics provide valuable information for stakeholders interested in tailoring EMS education to meet the needs of their workforce.

Acknowledgments

The authors thank the men and women of the Mecklenburg EMS Agency for their continued participation in prehospital research.

Supporting Information

Data Supplement S1. Abdominal pain/MI.

The document is in PDF format.

Filename: ACEM_1208_sm_DataSupplementS1.pdf; Format: PDF; Size: 127K; Description: Supporting info item.

Please note: Wiley Blackwell is not responsible for the content or functionality of any supporting information supplied by the authors. Any queries (other than missing content) should be directed to the corresponding author for the article.