Academic Emergency Medicine 2011; 18:S54–S58 © 2011 by the Society for Academic Emergency Medicine
Objectives: The purpose of this study was to determine the effect of an automated procedure logging (APL) system on the number of procedures logged by emergency medicine (EM) residents. Secondary objectives were to assess the APL’s effect on completeness and accuracy of procedure logging and to measure resident compliance with the system.
Methods: This was a before-and-after study conducted at a university-affiliated, urban medical center, with an annual emergency department census of >130,000. The EM residency is a 4-year, Residency Review Committee (RRC)-accredited program with 12 residents per year. We developed software to electronically search and abstract resident procedures documented in the electronic medical record (EMR) and automatically export them into a Web-based residency management system. We compared the mean daily number of procedures logged for two 6-month periods: October 1, 2009, to March 31, 2010 (pre-APL), and October 1, 2010, to March 31, 2011 (post-APL), using a two-sample t-test. We also generated a random sample of 231 logged procedures from both the pre- and post-APL time periods to assess for completeness and accuracy of data transfer. Completeness and accuracy in the pre- and post-APL periods were compared using Fisher’s exact test. Aggregate resident compliance with the system was also measured.
Results: The mean daily number of procedures logged increased by 168% (10.0 vs. 26.8, mean difference = 16.8, 95% confidence interval [CI] = 15.4 to 18.2, p < 0.001) after the implementation of APL. Procedures logged with the APL system were more complete (76% vs. 100%, p < 0.001) and more accurate (87% vs. 99%, p < 0.001). Most residents (42/48, 88%) used APL to log at least 90% of procedures. Only 4% of procedures eligible for automation were logged manually in the post-APL period.
Conclusions: There was a significant increase in the daily mean number of procedures logged after the implementation of APL. Recorded data were more complete and more accurate during this time frame. This innovative system improved resident logging of required procedures and helped our assessment of Accreditation Council for Graduate Medical Education (ACGME) Patient Care and Practice-Based Learning Competencies for individual residents.
Health information technology (HIT) is a rapidly growing sector of the health care field. Electronic medical records (EMRs) are being widely adopted in an effort to improve care and efficiency. The federal government has set forth a series of “meaningful use objectives” as part of an effort to both encourage and regulate the use of HIT.1,2 Research in the field has informed these regulations, including ways to use HIT to improve documentation. For instance, computerized physician order entry has been shown to reduce medication errors,3 and electronic charting enables clinical decision support, which has demonstrated improvements in patient care.4
Medical education also has the potential to benefit from advances in electronic documentation and HIT. In recent years, there has been increasing pressure on resident physicians, residency training programs, and employers to document the procedural training, experience, and competency of residents. Accurate recording of these data is required by the Accreditation Council for Graduate Medical Education (ACGME); however, the lack of documentation of the procedural experience of trainees is a frequent source of Residency Review Committee (RRC) citations,5 and the core competency–based evaluation of resident procedural experience has proved challenging. In addition, potential employers often request that graduating residents provide documentation of procedures performed during their training as part of the hiring and credentialing process.
The method by which residents and training programs track procedural experience varies widely (e.g., paper logs, personal card systems or spreadsheets, or electronic residency management systems such as we use), and handwritten, Web-based, and personal digital assistant (PDA)-based software systems have been plagued by a lack of reliability and accuracy, as well as poor resident compliance.6–9 One study found that only 60% of procedures actually performed were logged when entry was manual.7
This study sought to examine the effect of a novel, automated procedure logging (APL) system on the number of procedures logged by our emergency medicine (EM) residents. We hypothesized that automating the resident procedure logging process would result in an increase in the number of procedures logged, as well as improve the completeness and accuracy of procedure logging documentation.
This was a before-and-after study of resident procedures logged during two 6-month time periods: October 1, 2009, to March 31, 2010 (pre-APL), and October 1, 2010, to March 31, 2011 (post-APL). This study was deemed exempt from informed consent requirements by the Boston University Medical Center Institutional Review Board.
Study Setting and Population
The study was conducted at Boston Medical Center, an urban academic emergency department (ED) with a postgraduate year (PGY) 1–4 EM residency program enrolling 12 residents per year. The hospital is a Level I trauma center, and the ED has an annual census in excess of 130,000 visits. The ED EMR is Picis ED Pulsecheck (Wakefield, MA), and all training programs at Boston Medical Center rely on New Innovations (NI; Uniontown, OH) for residency management support (i.e., case logging, evaluations, monitoring conference attendance, duty hours, and general personnel tracking).
We developed APL software, which enables seamless communication between our ED EMR and NI. The software retrieves procedural data identified in the EMR and automatically exports them into the procedure logs of individual residents. To record a procedure prior to the implementation of APL, residents were required to log in to NI, in addition to documenting the procedure in the EMR. This extra step, which interrupted normal work flow patterns, was seen as a critical obstacle to the successful and accurate documentation of procedural experience.
The APL uses structured query language (SQL) to access the EMR database and generate a procedure list. Specifically, the process relies on three distinct, yet overlapping and complementary search strategies, which aim to ensure a complete and accurate data set: 1) while documenting a procedure in the EMR, residents can “flag” the procedure by using a drop-down menu added to the EMR procedure documentation screen (Figure 1); 2) current procedural terminology (CPT) codes located in a patient chart are matched to the residents who performed the procedures; and 3) a list of unstable patients who required resuscitation is generated using a combination of billing and visit data. Patient encounters are deemed “resuscitations” if they used one of three critical care/trauma bays in the ED or if the attending physician indicated on the chart that the patient should be billed for critical care time. Resuscitations are categorized as trauma or medical using natural language processing of the chief complaint, diagnosis, and admitting service. For example, trauma resuscitations are identified by chief complaints and/or diagnoses such as “burn,” “fall,” “fracture,” “gunshot wound,” “motor vehicle accident,” “pedestrian struck,” or “stab wound.”
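The keyword-based trauma/medical categorization in strategy 3 can be illustrated with a minimal sketch. This is written in Python for brevity (the actual APL was written in Java); the function and field names are assumptions, and the keywords are those listed above, whereas the real system also weighs the admitting service.

```python
# Illustrative sketch of the trauma vs. medical resuscitation
# categorization described above. Keywords come from the examples in
# the text; field handling is an assumption for illustration only.

TRAUMA_KEYWORDS = {
    "burn", "fall", "fracture", "gunshot wound",
    "motor vehicle accident", "pedestrian struck", "stab wound",
}

def categorize_resuscitation(chief_complaint: str, diagnosis: str) -> str:
    """Return 'trauma' if any trauma keyword appears in the chief
    complaint or diagnosis text; otherwise return 'medical'."""
    text = f"{chief_complaint} {diagnosis}".lower()
    if any(keyword in text for keyword in TRAUMA_KEYWORDS):
        return "trauma"
    return "medical"
```

A substring match like this is the simplest form of the natural language processing described; a production system would also normalize abbreviations (e.g., "MVA") and consult the admitting service.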
Prior to the APL development, residents were required to use the EMR procedure-charting screens to document procedures for the medical record and facilitate billing. However, the EMR did not store this information in its relational database. As a result, there was no ability to link these data to individual resident procedure logs, requiring residents to separately log procedures in NI. To minimize the effect on resident workflow, a new and simple drop-down menu was added to the existing EMR procedure charting screen (Figure 1). This enables residents to simultaneously complete charting and log procedural data. A checkbox allows residents to indicate whether they performed, supervised, or observed the procedure. Entries in this new procedure logging section are stored in the EMR database, allowing the APL to generate reports on all procedures recorded.
Procedure data sets from each of the above methods are combined, and duplicates are automatically removed. Data are transmitted securely via the internet to NI. Any errors that occur during importation are reported via an automatically generated e-mail to the system administrator. The APL was written in the Java programming language and was designed to run on all major platforms, including Windows, Mac OS, and Linux.
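The combine-and-deduplicate step can be sketched as follows. The key fields (resident, medical record number, date, procedure type) are an assumption based on the required data elements described later in this paper; the actual APL key is not specified.

```python
# Sketch of merging the three search strategies' procedure lists and
# removing duplicates before export to New Innovations. The dedup key
# fields are assumptions for illustration.

def merge_procedure_lists(*sources):
    """Combine procedure records from multiple sources, keeping one
    record per (resident, MRN, date, procedure type) combination."""
    seen = set()
    merged = []
    for source in sources:
        for record in source:
            key = (record["resident"], record["mrn"],
                   record["date"], record["procedure"])
            if key not in seen:
                seen.add(key)
                merged.append(record)
    return merged
```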
The procedure logs of all EM residents were analyzed from the pre- and post-APL time periods. All data from the pre-APL period came from logs manually entered by residents; post-APL data were a combination of procedure logs automatically generated via the new software and any logs manually entered into NI during that time period. The mean daily number of procedures performed in each period was calculated and the average number of procedures logged per day was compared pre- and post-APL, using a two-sample t-test.
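The primary comparison is a standard two-sample t-test on the daily procedure counts. A pooled-variance sketch is below; the study's analysis was performed in SAS, and whether a pooled or Welch version was used is not stated.

```python
import math

# Illustrative pooled-variance two-sample t statistic for comparing
# daily procedure counts pre- vs. post-APL (the study used SAS).

def two_sample_t(a, b):
    """Return (t statistic, mean difference) for samples a and b."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)  # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)  # pooled variance
    se = math.sqrt(sp2 * (1 / na + 1 / nb))
    return (mb - ma) / se, mb - ma
```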
A secondary outcome compared the pre- and post-APL procedure logs for completeness and accuracy. Completeness was defined as inclusion of all of the following required data elements: date, medical record number, birthdate, and procedure type. Accuracy was defined as the absence of errors in all of the required elements. Records were excluded from the pre-APL sample of logs analyzed for completeness and accuracy if they were performed during off-service rotations (e.g., obstetrics), in an ED other than our university-affiliated one (e.g., a community ED in which the residents rotate), or in a simulation laboratory, as there was no way for the investigators to assess the accuracy of data recorded. From preliminary data we estimated the proportion of complete records pre-APL to be 0.77 (95% confidence interval [CI] = 0.70 to 0.83). To be conservative, we used the lower bound of the CI, 0.70, to determine an appropriate sample size. It was determined that 231 records pre- and post-APL were needed to detect a 10% increase from 0.70 with 80% power. To detect a 10% change from the baseline accuracy rate of 87%, we needed 112 records pre- and post-APL. The sample size needed for accuracy was lower than that needed for completeness and, as a result, the sample size for completeness was used for analysis. A random sample of 231 procedure logs was generated from both the pre- and post-APL time periods for a total of 462 records, and the completeness and accuracy of the records were reviewed by a single investigator (TS). A second trained investigator (AW), using explicit criteria for coding, independently reviewed a random sample of 10% of these records to determine inter-rater reliability. A kappa coefficient was subsequently calculated. We compared accuracy and completeness of pre- and post-APL procedure logs using Fisher’s exact test.
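The completeness criterion, i.e., the presence of all four required data elements, can be expressed directly. The field names below are assumptions for illustration; the paper names the elements but not the record layout.

```python
# Sketch of the completeness check: a log is complete only if all four
# required data elements are present and nonempty. Field names assumed.

REQUIRED_FIELDS = ("date", "mrn", "birthdate", "procedure_type")

def is_complete(record):
    """Return True if every required data element is present."""
    return all(record.get(field) not in (None, "")
               for field in REQUIRED_FIELDS)
```

Accuracy, by contrast, required checking each element against the source chart, so it cannot be computed from the log alone.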
For all analyses, we considered those procedures with specific recommended numbers found in the ACGME guidelines,10 as well as several other procedures that the authors felt were important to EM training (Table 1). Vaginal deliveries and cricothyrotomies were excluded from analysis due to curriculum changes in the training program that occurred between the two study periods, preventing a valid and appropriate comparison.
Table 1. Mean Daily Number of Procedures Logged, Pre- and Post-APL

| Procedure Name | Pre-APL (mean/day) | Post-APL (mean/day) | % Increase |
| --- | --- | --- | --- |
| Pediatric medical resuscitation* | 0.11 | 0.52 | 375 |
| Adult medical resuscitation* | 1.69 | 7.03 | 317 |
| Adult trauma resuscitation* | 0.92 | 3.77 | 309 |
| Pediatric trauma resuscitation* | 0.14 | 0.32 | 127 |
| Incision and drainage | 0.77 | 1.41 | 83 |
| Casting and splinting | 0.21 | 0.32 | 55 |
| Central venous access* | 0.64 | 0.95 | 48 |
Finally, we examined resident compliance with the APL system. Resident compliance was defined as the percentage of residents who used APL to log at least 90% of their ED procedures in the post-APL period. All data analysis was performed using SAS 9.2 (SAS Institute, Inc., Cary, NC). Results are reported with 95% CIs and p-values.
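The compliance measure defined above, i.e., the fraction of residents whose APL-logged procedures constitute at least 90% of their total logged procedures, can be sketched as follows. The data layout (counts per resident) is an assumption; the analysis itself was performed in SAS.

```python
# Sketch of the resident compliance calculation: the fraction of
# residents who used APL for at least 90% of their logged procedures.
# Input layout (resident -> (apl_logged, total_logged)) is assumed.

def compliance_rate(residents):
    """Return the fraction of residents meeting the 90% APL threshold."""
    compliant = sum(1 for apl, total in residents.values()
                    if total and apl / total >= 0.9)
    return compliant / len(residents)
```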
Residents logged an average of 168% more procedures per day during the postautomation period: 10.0 versus 26.8 (mean difference = 16.8, 95% CI = 15.4 to 18.2, p < 0.001). Postautomation procedure logs were more likely to be complete (76% vs. 100%, p < 0.001) and accurate (87% vs. 99%, p < 0.001). Errors in the preautomation group included incorrect medical record number (16), procedure date (7), date of birth (4), and procedure type (3). Errors in the postautomation group consisted of EMR data entry errors made by residents, resulting in incorrect procedure type (2), and an incorrect categorization of a resuscitation by the APL (1). Inter-rater reliability was excellent for both completeness and accuracy, with perfect agreement between the two reviewers for completeness and only one record being discordant for accuracy (κ = 0.85, 95% CI = 0.55 to 1.00).
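The inter-rater reliability reported above is Cohen's kappa for two raters making binary judgments. A minimal sketch is below (illustrative only; the study's analyses were performed in SAS, and the CI calculation is omitted).

```python
# Illustrative Cohen's kappa for two raters' binary (0/1) judgments,
# as used for the inter-rater reliability analysis. Sketch only.

def cohens_kappa(rater1, rater2):
    """Return Cohen's kappa for two equal-length lists of 0/1 codes."""
    n = len(rater1)
    po = sum(a == b for a, b in zip(rater1, rater2)) / n  # observed agreement
    p1 = sum(rater1) / n
    p2 = sum(rater2) / n
    pe = p1 * p2 + (1 - p1) * (1 - p2)  # chance agreement
    if pe == 1:
        return 1.0
    return (po - pe) / (1 - pe)
```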
The number of active residents (48) was the same during the pre- and post-APL periods; ED volume was also similar between the two periods. Most residents (42 of 48, 88%) used APL to log at least 90% of procedures. Only 4% (174 of the 4,249) of procedures eligible for automation were logged manually in the post-APL period.
We report a significant increase in the daily average number of procedures logged after the implementation of APL, suggesting that the new system facilitated resident procedure logging. In addition to increasing the number of procedures logged, APL significantly improved the completeness and accuracy of the procedural data entered. Resident compliance with the new system was high, enabling the residency leadership to gain a much more accurate understanding of the true procedural experience of residents. This novel software solution to a problem common to many residency training programs addresses concerns that resident procedure logs may not accurately reflect true practice. While the EM RRC mandates that the procedural experience of residents be assessed, tracked, and monitored, there is no specific requirement as to how these data are obtained or recorded. Although previous studies have discussed the benefits of computerized tracking of resident procedures, PDA- and Web-based tracking systems are suboptimal in that they require residents either to carry (and learn to use) another device or to access another software application through an additional log-in. Studies of these systems report poor compliance.6–9 In contrast, our compliance rate suggests residents successfully integrated APL into their normal work flow.
To our knowledge, APL represents the first reported attempt to automate and integrate procedure logging with EMR documentation. APL is unique in that it minimally alters the typical resident work flow, eliminates duplicate data entry, and requires very little active resident participation for the successful logging of a performed procedure. As increasing demands are placed on the time of both residents and residency leadership, any interventions that facilitate operational efficiency are of value to the field of EM.
This study was performed at a single institution, which may limit the generalizability of its findings. The software was specifically developed to enable communication between Picis ED Pulsecheck and NI. The ability to replicate the process with other EMRs and procedure logging applications is unknown; however, the system has the ability to export the procedure data into an Excel spreadsheet, perhaps enabling an interface with other procedure logging applications. Other training programs could collaborate with their own information technology departments to develop comparable integrated applications. It should be noted that 280 EDs nationwide use Picis ED Pulsecheck11 and 67% of EM training programs use NI.12 Procedures that are performed on off-service rotations, in our simulation laboratory, or in other EDs in which the residents rotate (e.g., a community ED affiliate that uses a different EMR) are not captured by APL. Although two identical 6-month periods were chosen for comparison and there was no appreciable change in patient volume, there is a small possibility that the measured change in procedure documentation was not a result of APL, but rather reflected a change in the frequency of performed procedures. Finally, as the study periods involve 2 academic years, the residents pre- and post-APL were not exactly the same (12 graduates were replaced by 12 new interns). However, there is no reason to suspect that the total number of procedures would change from year to year.
There was a significant increase in the daily mean number of procedures logged after the implementation of automated procedure logging. Recorded data were also more complete and more accurate during this time frame. This innovative system improved resident logging of required procedures and helped our assessment of the ACGME Patient Care and Practice-Based Learning competencies for individual residents. Other training programs might benefit from a similar system.
The authors would like to acknowledge Mark B. Mycyk, MD, and James A. Feldman, MD, MPH, for their guidance in the preparation and review of the manuscript and Layla Rahimi, BA, for administrative support.