Keywords:

  • emergency care information systems;
  • public health surveillance

Abstract


The North Carolina Disease Event Tracking and Epidemiologic Collection Tool (NC DETECT) is a near-real-time database of emergency department (ED) visits automatically extracted from hospital information system(s) in the state of North Carolina. The National Hospital Ambulatory Medical Care Survey (NHAMCS) is a retrospective probability sample survey of visits to U.S. hospital EDs. This report compares data from NC DETECT (2006) with NHAMCS (2005) ED visit data to determine if the two data sets are consistent.

Proportions, rates, and confidence intervals (CIs) were calculated for ED visits by age and gender; arrival method and age; expected source of payment; disposition; hospital admissions; NHAMCS top 20 diagnosis groups and top five primary diagnoses by age group; International Classifications of Disease, 9th revision, Clinical Modification (ICD-9-CM) primary diagnosis codes; and cause of injury.

North Carolina DETECT captured 79% of statewide ED visits. Twenty-eight of every 100 North Carolina residents visited a North Carolina ED that reports to NC DETECT at least once in 2006, compared with 20% nationally. Twenty-seven percent of ED visits in North Carolina had private insurance as the expected payment source, compared with 40% nationwide. The proportion of injury-related ED visits in North Carolina was 25%, compared with 36.4% nationally. Rates and proportions of disease groups were similar.

Similarity of NC DETECT rates and proportions to NHAMCS provides support for the face and content validity of NC DETECT. The development of statewide near-real-time ED databases is an important step toward the collection, aggregation, and analysis of timely, population-based data by state, to better define the burden of illness and injury for vulnerable populations.

The North Carolina Disease Event Tracking and Epidemiologic Collection Tool (NC DETECT)1 is the Web-based, near-real-time early event detection and public health surveillance system of the North Carolina Division of Public Health. It receives data on at least a daily basis from emergency departments (EDs) throughout the state of North Carolina. It is the most comprehensive near-real-time statewide ED database in the United States. Web-based reports are easily accessed by, and specialized alerts are provided to, a wide variety of local and state clinical, administrative, and health policy users.

The National Hospital Ambulatory Medical Care Survey (NHAMCS)2 is part of the ambulatory component of the National Healthcare Survey and is a retrospective national probability sample survey of visits to U.S. hospital outpatient clinics and EDs. NHAMCS produces national estimates of summary descriptive measures about U.S. EDs from a manually abstracted probability sample of visits. Its purpose is to provide a retrospective national overview, and trend analysis, of ED visit characteristics.

The purpose of this study is to determine if a statewide near-real-time ED database, which uses secondary data from hospital administrative and clinical systems, has face and content validity and consistency of measures when compared to a national survey about U.S. EDs. We compared ED visit data from NC DETECT (2006) with that of NHAMCS (2005 Advance Data Set, released June 29, 2007)2 and, using NHAMCS as the reference standard, determined if the two data sets were consistent.

Methods


Study Design

This was a descriptive study. The study was exempted by the University of North Carolina School of Medicine Institutional Review Board and approved by the North Carolina Division of Public Health.

Study Setting and Population

North Carolina DETECT and NHAMCS are each comprehensive data sets with different methodologies for data collection, abstraction, and validation (Table 1). North Carolina has a comprehensive data collection effort called NC DETECT, with data from the EDs in North Carolina, the Carolinas Poison Center, the North Carolina Prehospital Medical Information System, the Piedmont Wildlife Center, and the North Carolina State College of Veterinary Medicine Laboratories. For this study, we used data received by NC DETECT from EDs. The state of North Carolina legislated collection of visit-level data from all 24/7, acute care, hospital-based EDs in the state for timely public health surveillance3 beginning January 1, 2005. All ED data collected under this mandate are owned by the North Carolina Division of Public Health.

Table 1. Comparison of NC DETECT and NHAMCS

Data source
  NC DETECT: 24/7 acute care hospital EDs in North Carolina, including free-standing 24/7 acute care EDs operated by hospitals. Near-real-time data extracted from hospital information systems; an estimated 79% of records reported in 2006 (69/112 EDs on 1/1/06; 83/112 on 7/1/06; 90/112 on 12/31/06).
  NHAMCS: 24/7 EDs with ≥6 hospital beds; national sample of representative hospitals in 50 states and the District of Columbia. Retrospective probability sample survey. 367 U.S. hospitals surveyed for 4 weeks every 15 months; manual medical record abstraction to collect data.

Exclusions
  NC DETECT: Military, VA, and federal hospitals.
  NHAMCS: Military, VA, and federal hospitals, and hospitals with <6 beds.

Scope
  NC DETECT: Near-real-time and near-population-based acute illness and injury for the state of North Carolina.
  NHAMCS: Retrospective national survey estimate; cannot be used for state, regional, or zip code estimates.11

Nonreporting hospitals
  NC DETECT: Nonreporting hospitals account for an estimated 21% of ED visits (62%–80% of North Carolina EDs were included over the study period, capturing 79% of estimated statewide ED visits).
  NHAMCS: 458 hospitals selected for 2005 NHAMCS; 386 eligible EDs; 352 (91%) participated. Sample weights inflated for survey nonresponse.

Reporting hospitals with some empty data fields
  NC DETECT: Record must have at least patient ID, visit ID, arrival date, and time. No imputation of empty data fields.
  NHAMCS: Periodic log review to encourage completeness. Some responses imputed for empty data fields.

Data collection
  NC DETECT: Hospitals standardize data to DEEDS5 (Data Supplement S8) prior to transmission; data validated and received securely every 12 hours in HL-7 format to a secure server; payer 90% complete; disposition 85% complete.
  NHAMCS: Data abstracted from the medical record onto a patient record form modeled after the National Ambulatory Medical Care Survey Patient Record form (Data Supplement S7) by hospital employees; in 2005, 35% required U.S. Bureau of the Census workers to abstract the hospital record onto the AMCS form.

Data elements
  NC DETECT: DEEDS5-compliant where applicable (see Data Supplement S8); limited to data elements approved by the North Carolina state legislature. Must already be electronically captured, with no additional data collection required.
  NHAMCS: Reason for visit classification19 (see Data Supplement S7). No restrictions on data elements collected.

Injury definition
  NC DETECT: For Data Supplement S6, primary ICD-9-CM diagnosis code in the 800–999 range. For Table 4, 1) any one of 11 ICD-9-CM codes in the range 800–999, OR 2) any one of five E-codes recorded.
  NHAMCS: For Data Supplement S6, primary ICD-9-CM diagnosis code in the 800–999 range. For Table 4, 1) any of three reasons for visit in the injury module, OR 2) any of three diagnoses in the ICD-9-CM range 800–999, OR 3) any E-code recorded.

Medical coding
  NC DETECT: Done by each hospital for its own operational purposes; at 12 weeks, 86% of visits had ≥1 diagnosis, averaging 3.5 diagnosis codes/visit; at 24 hours, 89% of visits have a chief complaint, and at 14 days, 100% do; 94% of ICD-9-CM codes 800–999 have an E-code; of those with an E-code, 15% have no diagnosis code 800–999.
  NHAMCS: Reason for visit and diagnoses coded from free text by NCHS-contracted medical coders; reason for visit coded using a reason for visit classification for ambulatory care;19 10% of visits had an independent coding and verification procedure; NCHS converts free-text cause of injury to an E-code.

Quality control
  NC DETECT: Validation business rules specified for each data element; edits for code ranges, duplications, and inconsistencies.
  NHAMCS: Computer and clerical edits for code ranges and inconsistencies; keying error rates tracked.

Confidentiality
  NC DETECT: Patient or institutional identifiers available only to authorized users; account number and medical record number received but encrypted and stored separately; permissions and secure passwords allow data review by individual institutions and public health officers for their own institutions.
  NHAMCS: No patient or institutional identifiers; ED weighting to calculate department-based estimates.

Training
  NC DETECT: Assistance provided as needed to allow hospitals to meet mandated requirements for ED data transmission.
  NHAMCS: Induction of hospital participants and a “train-the-trainer” hierarchy for local abstractors.

Abbreviations: AMCS = Ambulatory Medical Care Survey; CC = chief complaint; DEEDS = Data Elements for Emergency Department Systems; E-code = external cause of injury code; ED = emergency department; HL-7 = Health Level 7; ICD-9-CM = International Classification of Diseases, Ninth Revision, Clinical Modification; NC DETECT = North Carolina Disease Event Tracking and Epidemiologic Collection Tool; NCHS = National Center for Health Statistics; NHAMCS = National Hospital Ambulatory Medical Care Survey.

The data source for the NHAMCS 2005 Emergency Department Summary is a sample survey designed to statistically represent national data on ED visits from 50 states and the District of Columbia. NHAMCS is conducted by the National Center for Health Statistics and collects data on physician office visits and outpatient department visits, as well as ED visits. Data collection is authorized by the Public Health Service Act.4 Information is publicly available about 1.5 years after data collection.

Study Protocol

Data from NC DETECT were obtained by querying the NC DETECT ED record database for records of all ED visits from reporting hospitals in the state of North Carolina between January 1 and December 31, 2006. EDs in North Carolina are mandated to transmit data at least once every 24 hours to a central data aggregator. Hospitals standardize their data where applicable to the Data Elements for Emergency Department Systems (DEEDS).5 NC DETECT downloads data via HTTPS from the central data aggregator twice daily. These data are processed and loaded into NC DETECT’s database using an extraction, transformation, and loading tool (Pervasive Software, Austin, TX). For purposes of this study, ED data were extracted from the NC DETECT database using structured query language and the results were output to a delimited flat text file. This file was then imported into SAS (SAS Institute, Cary, NC) for analysis. NC DETECT summary measures were then calculated and compared with 2005 U.S. national estimates.2 Comparisons were made for the following parameters: ED visits; ED visits by age and gender; ED visits by arrival method and age; ED visits by expected source of payment; ED visits by disposition; hospital admissions from the ED by age group; NHAMCS cluster groupings of top 20 diagnosis groups; NHAMCS cluster groupings by age and top five primary diagnoses; ED visits by International Classification of Diseases, 9th revision, Clinical Modification (ICD-9-CM) primary diagnosis code clusters; and ED visits by coded cause of injury. Data elements were not compared if either data system did not collect the element or if comparisons were not possible due to inherent data definition or collection differences. Reporting hospitals may submit records with some empty data fields; these are noted as such in the tables presented.
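
To make the tabulation step concrete, the sketch below mimics, in Python rather than SAS, the kind of summary calculation applied to the delimited extract described above. The records, column names, and age groupings are illustrative stand-ins (not actual NC DETECT or DEEDS element names), and in practice the frame would come from the flat-file import.

```python
import pandas as pd

# In practice this frame would come from the delimited flat-file extract
# described above, e.g. pd.read_csv("nc_detect_extract.txt", sep="|");
# the records and column names here are illustrative stand-ins.
visits = pd.DataFrame({
    "age": [4, 34, 71, 52, 19, None],
    "sex": ["F", "F", "M", "F", "M", "F"],
})

visits["age_group"] = pd.cut(
    pd.to_numeric(visits["age"], errors="coerce"),
    bins=[0, 15, 25, 45, 65, 200],
    labels=["<15", "15-24", "25-44", "45-64", "65+"],
    right=False,
)

# Percentages by age group and sex over records with non-missing values
# (empty fields are tabulated separately in the paper's tables).
counts = (visits.dropna(subset=["age_group", "sex"])
                .groupby(["age_group", "sex"], observed=True)
                .size())
print((100 * counts / counts.sum()).round(1))
```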

Data Analysis

Percentages, rates, and confidence intervals (CIs) were calculated for both data sets. CIs for NHAMCS were calculated from the published standard errors.2 The estimated number of actual statewide ED visits in North Carolina for 2006 was calculated from the average visits/day for each ED in North Carolina in 2006; all hospitals in North Carolina with EDs are required to provide such basic descriptive information as part of the legislative mandate.3 We summed these averages to obtain the anticipated visits/day if 100% of North Carolina EDs had reported and multiplied by 365 to obtain an estimated 3,768,215 visits/year. By this method, the estimated total number of statewide ED visits (3,768,215) would have been obtained if 100% of hospitals had reported, so the ED visits actually reported (2,976,890) represent 79% of total statewide visits. We scaled the 2006 NC DETECT rates upward by dividing crude rates by 0.79, the estimated proportion of total state ED visits captured, to account for ED visits made to nonreporting hospitals. In this fashion, rates better approximate the actual annual statewide rates of ED visits. We computed North Carolina rates based on the 2006 North Carolina population.6 CIs were calculated for NC DETECT percentages and rates using the method of Daly.7 NHAMCS uses a sampling scheme, whereas NC DETECT data represent the entire ED visit experience of the state; CIs for both data sets therefore reflect the relative precision of each data set rather than statistical significance.
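
The scaling arithmetic can be written out as in the short sketch below. This is an illustration, not the authors' SAS code: the 2006 North Carolina population value is a rough placeholder for the official estimate cited in reference 6, and the exact Poisson limits shown are one standard way of obtaining numerator limits for the substitution method of Daly,7 not necessarily the authors' exact computation.

```python
from scipy.stats import chi2

reported_visits = 2_976_890       # ED visits received by NC DETECT in 2006
statewide_estimate = 3_768_215    # sum of average visits/day for all NC EDs x 365
capture = reported_visits / statewide_estimate       # ~0.79 of statewide visits

nc_population_2006 = 8_824_000    # placeholder; substitute the official estimate (ref 6)

# Crude NC DETECT rate, then scale upward to account for nonreporting hospitals.
crude_rate = 100 * reported_visits / nc_population_2006    # visits/100 persons/yr
scaled_rate = crude_rate / capture                         # ~42.7 visits/100 persons/yr

def poisson_limits(count, alpha=0.05):
    """Exact Poisson limits for a count; dividing them by the population
    gives substitution-method limits for the corresponding rate."""
    lower = 0.0 if count == 0 else chi2.ppf(alpha / 2, 2 * count) / 2
    upper = chi2.ppf(1 - alpha / 2, 2 * (count + 1)) / 2
    return lower, upper

lo, hi = poisson_limits(statewide_estimate)
print(f"{scaled_rate:.1f} (95% CI {100 * lo / nc_population_2006:.1f}, "
      f"{100 * hi / nc_population_2006:.1f}) visits/100 persons/year")
```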

Definitions for the following key terms were already in place and were the same in NC DETECT as in NHAMCS, but are stated here for clarity as follows: ED visit = ED encounter resulting in capture of patient demographics or ED registration; ED patient = a unique person registering for care in an ED at least once during the reporting period, represented by one or more ED visits; and admission = hospital admission: patient is admitted from the ED to an inpatient hospital bed. “Admission” excludes observation unit placement, transfers, or deaths in the ED. Diagnosis code groups were assembled by applying NHAMCS diagnosis codes to NC DETECT raw data. We manually developed crosswalks to NC DETECT data. We used no abstraction tools, did not merge or separate fields, and made no alterations to the data set.
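
A minimal sketch of how these definitions separate visits, patients, and admissions is shown below. The records and column names are hypothetical, and uniqueness is evaluated within each reporting hospital in this sketch because the patient identifiers NC DETECT receives (account and medical record numbers, Table 1) are hospital-assigned.

```python
import pandas as pd

# Hypothetical visit-level records (column names are illustrative only).
visits = pd.DataFrame({
    "hospital_id": ["A", "A", "A", "B", "B"],
    "patient_id":  ["p1", "p1", "p2", "p7", "p8"],
    "disposition": ["discharged", "admitted", "admitted",
                    "observation", "discharged"],
})

# ED visit: every registration counts once.
ed_visits = len(visits)

# ED patient: a unique person with >=1 visit during the reporting period;
# uniqueness is checked per hospital because identifiers are hospital-assigned.
ed_patients = visits.drop_duplicates(["hospital_id", "patient_id"]).shape[0]

# Admission: admitted from the ED to an inpatient bed; observation stays,
# transfers, and ED deaths are excluded by definition.
admissions = (visits["disposition"] == "admitted").sum()

print(ed_visits, ed_patients, admissions)   # 5 4 2
```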

We categorized NC DETECT data using the NHAMCS clusters of ICD-9-CM codes, to compare proportions of ED visits falling into the NHAMCS top 20 primary diagnosis groups. We calculated proportions for NC DETECT based on records with entries in the relevant fields, with a separate tabulation for empty data fields, unless otherwise specified. Proportions and CIs are presented in tables where appropriate.
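
As an illustration of this crosswalk step, the sketch below assigns a primary ICD-9-CM code to an NHAMCS cluster using code ranges of the kind listed later in Table 3. Only a few clusters are shown, the range handling is simplified (whole three-digit categories include their subdivisions, four-digit bounds are matched exactly), and the column name dx1 is a hypothetical placeholder.

```python
import pandas as pd

# A few of the NHAMCS top-20 clusters from Table 3, as ICD-9-CM code ranges.
CLUSTERS = {
    "Abdominal pain":                   [("789.0", "789.0")],
    "Chest pain":                       [("786.5", "786.5")],
    "Contusion, intact skin surface":   [("920", "924")],
    "Acute URI, excluding pharyngitis": [("460", "461"), ("463", "466")],
}

def in_range(code, low, high):
    """True if a numeric ICD-9-CM code falls in [low, high]. A whole-category
    upper bound such as 924 covers all of its 4th/5th-digit subdivisions;
    a 4th-digit bound such as 786.5 is matched exactly in this sketch."""
    try:
        value = float(code)
    except (TypeError, ValueError):
        return False              # empty field or non-numeric (V/E) code
    hi = float(high) + (0.9999 if "." not in high else 0.0)
    return float(low) <= value <= hi

def cluster_of(code):
    for name, ranges in CLUSTERS.items():
        if any(in_range(code, lo, hi) for lo, hi in ranges):
            return name
    return "All other diagnoses"

# Proportions are computed over records with a primary diagnosis present.
dx1 = pd.Series(["789.0", "786.50", "921.1", None, "486"], name="dx1")
shares = dx1.dropna().map(cluster_of).value_counts(normalize=True) * 100
print(shares.round(1))
```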

Results


About 28 of every 100 North Carolina residents visited an ED at least once in 2006, compared to the U.S. average of 20% in 2005 (Table 2). The proportions of ED visits by age and gender are consistent in both data sets (Data Supplement S1, available as supporting information in the online version of this paper). The highest number of ED visits in both was for females aged 25–64, at 28.6% and 26.5% of the total in NC DETECT and NHAMCS, respectively.

Table 2. ED Visit Summary

Measure | NC DETECT 2006 | NHAMCS 2005
Total ED visits, hospitals reporting*† | 2,976,890 | 33,605 visits sampled and extrapolated to 115,323,000
Average visits per reporting ED | 37,445 | 30,388
Visit rate, all EDs | 42.7 visits/100 persons/year | 39.6 visits/100 persons/year
At least one ED visit/year | 27.8 per 100 persons visited the same ED one or more times | 20% of U.S. population visited the same ED one or more times
% Female (95% CI) | 55.4 (55.3, 55.5) | 53.9 (53.1, 54.7)
% Male (95% CI) | 44.5 (44.5, 44.6) | 46.1 (45.3, 46.9)
% Ambulance transport (95% CI) | 13.7 (13.7, 13.7) | 15.5 (14.3, 16.7)
% Self-pay (95% CI) | 23.4 (23.4, 23.5) | 16.7 (15.1, 18.3)
% Private insurance (95% CI) | 27.1 (27.0, 27.1) | 39.9 (37.5, 42.3)
% Admitted to hospital (95% CI) | 12.6 (12.5, 12.6) | 12.0 (10.8, 13.2)
% Injury visits (95% CI) | 25.0 (24.9, 25.0) | 36.4 (35.4, 37.4)

*79% of North Carolina ED visits reported to NC DETECT in 2006.
†Total ED visits, 3,768,215, scaled to North Carolina 2006 population.
CI = confidence interval; ED = emergency department; NC DETECT = North Carolina Disease Event Tracking and Epidemiologic Collection Tool; NHAMCS = National Hospital Ambulatory Medical Care Survey.

For about 18% of visits, the data field “arrival method” is empty in the NC DETECT data set (Data Supplement S2, available as supporting information in the online version of this paper), and this proportion is about equally distributed among all age groups. For those 65 years and older, the proportion of ambulance arrivals, 34.9% in NC DETECT and 36.4% in NHAMCS, is about twice that of all other age groups and is similar to that reported by Shah et al.8 in their review of NHAMCS 1997–2000 data.

There are differences in insurance status between the North Carolina and the U.S. data (Data Supplement S3, available as supporting information in the online version of this paper). For 27% of ED visits in North Carolina, private insurance is the expected source of payment, while in the United States, the proportion is 39.9%. In North Carolina, 23.4% of ED visits are characterized as “self-pay” or “no insurance,” and in the United States, it is 16.7%. The proportion of visits covered by Medicaid, State Children's Health Insurance Program (SCHIP), or Medicare is similar, about 43% for NC DETECT and 44% for NHAMCS. In North Carolina, for nearly half of ED visits, the expected source of payment is from federal or state sources, and the proportion of ED visits covered by private insurance (27%) is similar to that of self-pay/no insurance (23.4%).

Proportions of hospital admissions, arrival method for visits resulting in admission, and admissions by age group are similar in both data sets (Data Supplement S4, available as supporting information in the online version of this paper). Abdominal pain and chest pain were in the top three primary diagnosis groups in both data sets. The burden of disease to NC EDs represented by abdominal and chest pain is over 260,000 ED visits per year (Table 3).

Table 3. Top 20 Primary Diagnosis Groups, Comparing NC DETECT and NHAMCS

Diagnosis Cluster | ICD-9-CM Codes | NC DETECT,* Billing Codes, % of Nonmissing Data (95% CI) | NHAMCS, Abstracted Codes, % of Visits, No Missing Data Reported (95% CI)
Contusion, intact skin surface | 920–924 | 3.1 (3.1, 3.2) | 4.2 (3.8, 4.6)
Abdominal pain | 789.0 | 4.6 (4.5, 4.6) | 4.0 (3.6, 4.4)
Chest pain | 786.5 | 4.2 (4.2, 4.3) | 3.8 (3.4, 4.2)
Acute URI, excluding pharyngitis | 460–461; 463–466 | 3.4 (3.4, 3.5) | 3.7 (3.3, 4.1)
Open wound excluding head | 874–897 | 2.4 (2.4, 2.4) | 3.5 (3.1, 3.9)
Spinal disorder | 720–724 | 3.6 (3.5, 3.6) | 2.5 (2.3, 2.7)
Cellulitis and abscess | 681–682 | 1.7 (1.7, 1.8) | 2.3 (2.1, 2.5)
Sprains, strains excluding ankle and back | 840–844; 845.1; 848 | 1.5 (1.5, 1.5) | 2.2 (2.0, 2.4)
Sprains, strains neck and back | 846, 847 | 2.2 (2.2, 2.2) | 2.2 (2.0, 2.4)
Fractures, excluding lower limb | 800–819 | 1.7 (1.7, 1.7) | 2.0 (1.8, 2.2)
OM and Eustachian tube disorders | 381–382 | 1.4 (1.4, 1.4) | 1.9 (1.5, 2.3)
Open wound of head | 870–873 | 1.6 (1.6, 1.6) | 1.9 (1.7, 2.1)
Rheumatism excluding back | 725–729 | 2.2 (2.2, 2.2) | 1.6 (1.4, 1.8)
UTI, unspecified | 599.0 | 1.5 (1.5, 1.5) | 1.6 (1.4, 1.8)
Asthma | 493 | 1.3 (1.3, 1.3) | 1.5 (1.3, 1.7)
Chronic bronchitis, unspecified | 490–491 | 1.0 (1.0, 1.0) | 1.4 (1.2, 1.6)
Superficial injuries | 910–919 | 1.1 (1.0, 1.1) | 1.4 (1.2, 1.6)
Unspecified viral/chlamydial infection | 079.9 | 0.9 (0.9, 0.9) | 1.4 (1.2, 1.6)
Acute pharyngitis | 462 | 1.5 (1.5, 1.5) | 1.4 (1.2, 1.6)
Heart disease, excluding ischemic | 391–392.0; 393–398; 402; 404; 415–416; 420–429 | 1.8 (1.7, 1.8) | 1.4 (1.2, 1.6)
All other diagnoses | — | 57.2 (57.1, 57.2) | 54.0 (53, 55)

*NC DETECT data are percentages of records containing a primary ICD-9-CM code, because NHAMCS reported no missing data fields. For NC DETECT, 15.7% of ED visits were missing a primary ICD-9-CM diagnosis code.
CI = confidence interval; ICD-9-CM = International Classification of Diseases, Ninth Revision, Clinical Modification; NC DETECT = North Carolina Disease Event Tracking and Epidemiologic Collection Tool; NHAMCS = National Hospital Ambulatory Medical Care Survey; OM = otitis media; URI = upper respiratory infection; UTI = urinary tract infection.

Data Supplement S5 (available as supporting information in the online version of this paper) lists the top five NHAMCS ad hoc cluster groups by age group. The ordering by age group is similar when NC DETECT proportions are calculated by using all records, as well as by using only those records with complete data fields.

Data Supplement S6 (available as supporting information in the online version of this paper) lists ED visits by primary ICD-9-CM diagnosis code cluster. In both data sets, the top two categories are “Ill-defined signs and symptoms” and “Injury/poisoning.” “Ill-defined signs and symptoms” (780–799) is a heterogeneous collection of disorders. For example, it includes syncope and collapse (780.2), fever (780.6), fussy infant (780.91), headache (784.0), palpitations (785.1), shock without mention of trauma (785.5), cardiogenic shock (785.51), septic shock (785.52), and dyspnea (786.0).

Table 4 lists injury-related visits, defined broadly as any ED visit that has any ICD-9-CM final diagnosis code in the 800–999 range, or any coded cause of injury (ICD-9-CM external cause of injury codes [E-codes]). The most common unintentional injuries in both data sets were falls, motor vehicle traffic collisions, and being struck by objects or persons.

Table 4. Distribution of Injury-related Visits, NC DETECT vs. NHAMCS

Category | NC DETECT Visits | NHAMCS Visits
All injury-related visits* | 25.0 (24.9, 25.0) | 36.4 (35.4, 37.4)
Injury-related visits missing an E-code | 9.8 (9.7, 9.9) | 17.8 (16.0, 19.6)
Unintentional injuries† | 78.2 (78.1, 78.3) | 67.7 (65.9, 69.5)
  Falls | 21.3 (21.2, 21.4) | 20.8 (19.6, 22.0)
  Motor vehicle traffic | 13.2 (13.1, 13.2) | 10.2 (9.4, 11.0)
  Struck against or struck accidentally by objects or persons | 9.2 (9.1, 9.2) | 7.9 (7.3, 8.5)
  Overexertion and strenuous movements | 8.3 (8.2, 8.4) | 4.3 (3.7, 4.9)
  Cutting or piercing instruments or objects | 5.8 (5.8, 5.9) | 6.0 (5.4, 6.6)
  Natural and environmental factors | 4.3 (4.2, 4.3) | 4.8 (4.0, 5.6)
  Poisoning | 1.3 (1.2, 1.3) | 2.2 (1.8, 2.6)
  Foreign body | 1.6 (1.6, 1.7) | 2.2 (1.8, 2.6)
  Fire and flames, hot substance or object, caustic or corrosive material, steam | 1.1 (1.1, 1.1) | 1.4 (1.2, 1.6)
  Machinery | 0.4 (0.4, 0.4) | 0.7 (0.5, 0.9)
  Pedal cycle, nontraffic | 1.6 (1.5, 1.6) | 1.0 (0.8, 1.2)
  Motor vehicle, nontraffic | 0.1 (0.1, 0.1) | 0.8 (0.6, 1.0)
  Other transportation | 0.3 (0.3, 0.3) | 0.3 (0.1, 0.5)
  Suffocation | 0.1 (0.1, 0.1) | Not reliable
  Caught accidentally in or between objects | 1.2 (1.2, 1.2) | 1.0 (0.8, 1.2)
  Mechanism unspecified | 6.1 (6.0, 6.1) | Not reliable
  Drowning, firearms, and other mechanism | 2.3 (2.3, 2.4) | 3.7 (3.3, 4.1)
Intentional injuries (% of all injury-related visits) | 4.9 (4.8, 4.9) | 5.2 (4.6, 5.8)
  Assault
    Unarmed fight or brawl, striking by blunt or thrown object | 2.0 (2.0, 2.1) | 2.7 (2.3, 3.1)
    Cutting or piercing instrument | 0.3 (0.3, 0.4) | 0.3 (0.1, 0.5)
    Firearms, explosives, other | 1.3 (1.3, 1.4) | 1.1 (0.9, 1.3)
  Self-inflicted
    Poisoning by solids or liquids, gases, vapors | 0.8 (0.7, 0.8) | 0.7 (0.5, 0.9)
    Cutting and piercing, other, unspecified | 0.3 (0.3, 0.3) | 0.3 (0.1, 0.5)
    Other causes of violence | 0.1 (0.1, 0.1) | Not reliable
Injuries of undetermined intent | 0.4 (0.4, 0.4) | 0.6 (0.4, 0.8)
Adverse effects of medical treatment | 6.7 (6.7, 6.8) | 4.4 (3.8, 5.0)

Data are reported as % (95% CI).
*Based on the injury definition in Table 1 for Table 4.
†Data based on ICD-9-CM 800–999 with an E-code.
E-code = external cause of injury code; ICD-9-CM = International Classification of Diseases, Ninth Revision, Clinical Modification; NC DETECT = North Carolina Disease Event Tracking and Epidemiologic Collection Tool; NHAMCS = National Hospital Ambulatory Medical Care Survey.

Discussion


General Features of NC DETECT

This study demonstrates that a statewide near-real-time ED database (NC DETECT), which uses secondary data, has face and content validity and summary descriptive measures that are consistent with a trusted, mature national sample survey (NHAMCS) that estimates similar summary descriptive measures about U.S. EDs. Despite the differences in data sources, collection, abstraction, and validation, the summary measures are consistent. This is to be expected from well-designed systems, even with data obtained from a mixture of clinical and administrative sources.9

While a head-to-head comparison of both systems is not possible, the fact that NC DETECT has face and content validity with NHAMCS, and that the systems’ summary measures are consistent with each other, provides evidence that a system such as NC DETECT works and provides data of reasonable accuracy and precision.

North Carolina DETECT data represent nearly all ED visits in North Carolina in 2006, and data for 2008, with 98% of EDs reporting, will include over 4,000,000 ED visits.10 NC DETECT therefore provides the closest approximation of population-based information available about the burden of acute illness and injury for the state of NC.

While NC DETECT and NHAMCS are consistent with each other, they are by no means identical. Both NC DETECT and NHAMCS provide information about ED visits, but with different goals. NHAMCS uses a sampling scheme to produce a national snapshot of patient, hospital, and visit characteristics and provides retrospective trends over time of national ED utilization. The NHAMCS survey is confidential; individual hospitals are not identifiable and data cannot be obtained for a specific state or region.11 Statewide and regional rates and proportions may vary from national estimates, and the burden of specific types of diseases and injury, and the vulnerable populations, would be expected to vary from state to state.

A system like NC DETECT, based on near-real-time data, provides immediately available and actionable results in a more timely manner than survey-based estimates. Aggregations need to be done periodically to be useful and, depending on the measures assessed, NC DETECT can, and does, provide information on a daily, monthly, quarterly, and yearly basis.12

Because NC DETECT receives data from both hospital administrative and clinical systems and provides standardized mapping of data elements from all institutions, its method is generalizable to other state or multihospital systems. A system like NC DETECT receives data that are generated electronically as part of normal hospital operations and is scalable and interoperable.

We compared 2006 NC DETECT data with 2005 NHAMCS data because the 2005 NHAMCS data set was first publicly available in June 2007; the publication lag of NHAMCS makes a same-year comparison impossible. We used 2006 NC DETECT data because an average of 79% of North Carolina ED visits were captured in 2006, the first year in which NC DETECT achieved reasonably comprehensive data reporting.

NC DETECT rate estimates were scaled upward to approximate data from nonreporting hospitals (see Methods). Assumptions for determining the rate denominator, i.e., the population estimate, differ for NC DETECT and NHAMCS. For the NC population, NC DETECT uses North Carolina demographic data,6 which are the midyear estimates of population including institutionalized and military populations. To calculate visit rates by age and gender, NHAMCS uses an estimate of the civilian noninstitutionalized population. Prisoners, the active military, and nursing home residents are excluded.13,14 For the numerator of rate calculations, NC DETECT uses all reported ED visits without exclusions, scaled upward to account for records from nonreporting hospitals. NHAMCS excludes records when more than half of data elements are empty.15

In Table 4, “Distribution of Injury-related Visits, NC DETECT vs. NHAMCS,” the injury definition varies between the two data sets (Table 1). For NC DETECT, 86.7% of visits with an injury diagnosis code 800–999 contain an E-code; for NHAMCS, 82.2%. Regardless of definition, the proportion of injury visits identified through NC DETECT in North Carolina is lower than NHAMCS. It should be noted that NHAMCS data collection identifies a much larger proportion of ED visits as injuries (25% in NC DETECT, 36.4% in NHAMCS) when the definition of injury-related visit is expanded16,17 (Tables 1 and 4). Reasons for the differences in injury-related ED visits are unknown, but could be due to coding or classification error, empty data fields, or fewer injuries resulting in ED visits in North Carolina than in the United States as a whole. This requires further investigation.
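
The NC DETECT side of the Table 4 definition (any of up to 11 diagnosis codes in the 800–999 range, or any recorded E-code) and the narrower Data Supplement S6 definition (primary diagnosis in 800–999) can be sketched side by side as below. The column names are hypothetical placeholders, and the NHAMCS reason-for-visit criterion is omitted because NC DETECT does not collect that element.

```python
import pandas as pd

def is_injury_code(code):
    """True if an ICD-9-CM diagnosis code falls in the injury range 800-999."""
    try:
        return 800 <= float(code) < 1000
    except (TypeError, ValueError):
        return False               # empty field or non-numeric code

def injury_flags(visits, dx_cols, ecode_cols, primary_col):
    """Return the narrow (Data Supplement S6) and broad (Table 4, NC DETECT
    side) injury flags; column names are hypothetical placeholders for the
    up-to-11 diagnosis and up-to-5 E-code fields per visit."""
    narrow = visits[primary_col].map(is_injury_code)
    any_dx = visits[dx_cols].apply(lambda c: c.map(is_injury_code)).any(axis=1)
    any_ecode = visits[ecode_cols].notna().any(axis=1)
    return narrow, any_dx | any_ecode

visits = pd.DataFrame({
    "dx1": ["845.00", "786.50", None],
    "dx2": [None, "920", None],
    "ecode1": [None, None, "E885.9"],
})
narrow, broad = injury_flags(visits, ["dx1", "dx2"], ["ecode1"], "dx1")
print(narrow.tolist(), broad.tolist())   # [True, False, False] [True, True, True]
```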

Some data element comparisons cannot be made between NC DETECT and NHAMCS because specific data elements are not collected in one of the systems, because of inherent data definition or collection differences, or because of a lack of standardization of data elements between systems. The most important of these are “reason for visit” or chief complaint (CC), triage acuity, CC and diagnosis clusters, and race/ethnicity.

The definition of CC has not been standardized, and CC terminology has not been adopted by standards organizations; thus, individual EDs develop their own methods for recording CC. CC is most often recorded as free text or with local terminology.18 NHAMCS calls this element “reason for visit” (Data Supplement S7, available as supporting information in the online version of this paper). CC data are then classified according to the “reason for visit classification”19 and organized into clusters that are unique to NHAMCS. While some processing and standardizing of terms is done for CC within NC DETECT for purposes of assigning syndrome classifications, no aggregation of CC data into clusters is currently performed.

“Immediacy with which patient should be seen” is categorized by NHAMCS as “immediate,” “emergent” (1–14 minutes), “urgent” (15–60 minutes), “semiurgent” (61–120 minutes), or “nonurgent” (121 minutes–24 hours). The immediate category was first used in the 2005 Emergency Department Summary.2 NC DETECT collects but does not use data on triage acuity, because there is no standardized triage system used across all EDs in North Carolina, nor is there a valid method for aggregating triage acuity measures from self-defined three-, four-, or five-level categorizations.

NHAMCS also determines rates of ED utilization by race/ethnicity. Race and ethnicity are not required data elements for the legislated collection of ED data in NC DETECT. Previous attempts to collect and use race and ethnicity data in North Carolina revealed that most EDs categorize race or ethnicity by self-report or administrative assignment. In addition, race and ethnicity data are often not captured and, when captured, are coupled at the hospital level and cannot be separated, making DEEDS compliance impossible. The accuracy of race and ethnicity data extracted directly from hospital information systems in North Carolina cannot be validated, and it is also impossible to determine whether federal standards are applied to the reporting of these data elements in administrative databases. As a result, no effort was made to include them in the legislated mandate for the collection of ED data in North Carolina.

See Data Supplement S8 (available as supporting information in the online version of this paper) for a complete list of data elements collected in NC DETECT ED data.

Limitations


The major limitations of a Web-based, near-real-time system like NC DETECT are nonreported data elements from participating hospitals, the limited set of data elements collected, nonreporting hospitals, and institutionally dependent data collection. These limitations are inherent to a system like NC DETECT. Nevertheless, the NC DETECT system provides timely and useful data with high precision for a variety of purposes.

Nonreported Data Elements from Participating Hospitals

NHAMCS imputes data elements not reported from participating hospitals, but NC DETECT captures near-real-time data and does not impute information for empty data fields. NC DETECT depends on each institution for the provision of data elements.

Empty data fields in NC DETECT average 8% for the data elements reported in this paper and 13% for all data elements collected by NC DETECT. Rate calculations would therefore yield falsely low rates of disease in specific population groups when compared to NHAMCS; consequently, percentage differences were used for most comparisons in this study. In NC DETECT, the largest proportion of empty data fields was for the element “arrival method,” which was empty for 17.7% of NC DETECT visits compared with 4.4% for NHAMCS. A multicenter study of pediatric EDs compared data element retrieval from an administrative database versus data abstraction by trained abstractors.20 In that study, mode of arrival was empty in 25.5% and E-codes in 27% of visit records from the administrative database. NC DETECT compares favorably, with the mode of arrival field empty in 17.7% of records and the E-code field empty in 11%.
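
A completeness tabulation of the kind quoted above (17.7% empty arrival method, 11% empty E-code) can be produced with a few lines, as in the sketch below; the records and column names are illustrative, and both NaN values and blank strings are treated as empty.

```python
import pandas as pd

def empty_field_report(visits, columns):
    """Percentage of visit records with an empty value for each data element.
    NaN and blank/whitespace-only strings both count as empty."""
    pct = {}
    for col in columns:
        values = visits[col]
        empty = values.isna() | (values.astype(str).str.strip() == "")
        pct[col] = round(100 * empty.mean(), 1)
    return pd.Series(pct, name="% empty")

# Illustrative records with hypothetical column names.
visits = pd.DataFrame({
    "arrival_method": ["ambulance", "", None, "walk-in"],
    "ecode1": [None, "E885.9", None, None],
})
print(empty_field_report(visits, ["arrival_method", "ecode1"]))
# arrival_method 50.0, ecode1 75.0
```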

Data Supplement S6, “Emergency department visits by primary diagnosis code clusters,” defines injury/poisoning as primary ICD-9-CM Code 800–999. For 13.1% of the NC DETECT ED visits in Data Supplement S6, there is an empty ICD-9-CM code data field. If injury visits are disproportionately lacking diagnosis codes, the proportion of injury visits would be falsely low. Clearly a future goal for NC DETECT is to encourage improved hospital injury diagnosis coding.

Systems such as NC DETECT can continue to improve data quality by educating and providing feedback to participating institutions, demonstrating both the utility of accurate information and the rewards for capturing quality information such as improved coding, achievement of quality and performance measures, and optimal use of information technology.9,21,22

Limited Data Elements Collected

Emergency department data collection in North Carolina is legislated,3 but the mandated data set is at this time somewhat more limited than the standard Health Insurance Portability and Accountability Act (HIPAA) limited data set. In addition, the mandate requires hospitals to submit only those data elements that are already electronically available within the hospital’s information system.

Nonreporting Hospitals

Over the study period, NC DETECT collected data from 62% to 80% of the EDs in North Carolina, with the number of EDs contributing data increasing throughout the collection period. Seventy-nine percent of statewide ED visits were captured, and therefore, nonreporting hospitals accounted for an estimated 21% of ED visits. NHAMCS reported 91% survey participation, and sample weights were inflated for survey nonresponse.

Institutionally Dependent Data Collection

Under the North Carolina mandate, hospitals automate a process to extract designated data elements from their electronic health information systems, translate them where necessary to DEEDS format,5 and submit them to a data aggregator, which then provides a data file to NC DETECT twice a day. While hospitals are not required to enter data manually or to collect any additional data beyond what they already capture electronically, this system does impose a burden on the hospital both to establish participation and to maintain a viable data feed. NHAMCS provides trained data abstractors to collect its data from hospital records. Under NHAMCS, hospitals likewise are not required to enter data manually or to collect additional data elements, and NHAMCS does not depend on hospitals to establish and maintain a regular data feed.

International Classification of Diseases, 9th Revision, Clinical Modification coding captured by NC DETECT was provided by each hospital as part of normal operations, while NHAMCS uses contracted medical coders who abstract information from the medical record, and there is independent coding verification. While it is likely that NHAMCS thus has less variation in coding, the proportions of disease groupings are similar in both data sets, suggesting that the variance in individual institutional coding patterns and practices is tolerable for data aggregation purposes. Owing to lack of resources, there is no independent coding and verification procedure routinely performed on NC DETECT data.

Conclusions


North Carolina DETECT, a statewide near-real-time ED database that uses secondary data from hospital administrative and clinical databases, has face and content validity and produces summary measures that are consistent with the 2005 NHAMCS. North Carolina DETECT data are timely and precise and can produce aggregations of clinical data that are useful for identifying the burden of acute illness and injury in the state. The major limitations of 2006 NC DETECT are nonreported data elements from participating hospitals, incomplete population-based data due to nonreporting hospitals, limited data elements collected, and reliance on individual institutional coding practices. A statewide, near-real-time ED database like NC DETECT is an important component for collection, aggregation, and analysis of statewide population-based data, and development of similar systems for other states should be encouraged.

Acknowledgments


The authors acknowledge and thank Catharine W. Burt, EdD, Chief, Ambulatory Care Statistics Branch, National Center for Health Statistics, Centers for Disease Control and Prevention, for her review of the draft manuscript and her insightful comments. Stephen W. Marshall, PhD, Associate Professor, Department of Epidemiology, University of North Carolina at Chapel Hill School of Public Health, was the statistical consultant.

References

1. The North Carolina Disease Event Tracking and Epidemiologic Collection Tool (NC DETECT). Available at: http://www.ncdetect.org. Accessed Nov 11, 2008.
2. Nawar E, Niska RW, Xu J. National Hospital Ambulatory Medical Care Survey: 2005 Emergency Department Summary. Advance Data from Vital and Health Statistics, US Department of Health and Human Services, Centers for Disease Control and Prevention, National Center for Health Statistics. No. 386, June 29, 2007.
3. North Carolina State Government. Emergency Department Data Reporting. Section 10.34(b), Article 22 of Chapter 130A-480 of the North Carolina General Statutes (H1414 SL 2004-124). Available at: http://www.ncga.state.nc.us/EnactedLegislation/Statutes/rtf/ByArticle/Chapter_130A/Article_22.rtf. Accessed Nov 11, 2008.
4. United States Government. Public Health Service Act, Title 42 US Code, Section 306.
5. National Center for Injury Prevention and Control. Data Elements for Emergency Department Systems, Release 1.0. Atlanta, GA: Centers for Disease Control and Prevention, 1997.
6. Office of State Budget and Management. Population Estimates and Projections—North Carolina. Available at: http://www.demog.state.nc.us. Accessed Feb 26, 2008.
7. Daly LE. Confidence limits made easy: interval estimation using a substitution method. Am J Epidemiol. 1998;147:783–90.
8. Shah MN, Bazarian JJ, Lerner B, et al. The epidemiology of emergency medical services use by older adults: an analysis of the National Hospital Ambulatory Medical Care Survey. Acad Emerg Med. 2007;14:441.
9. Aylin P, Bottle A, Majeed A. Use of administrative data or clinical databases as predictors of risk of death in hospital: comparison of models [abstract]. BMJ. 2007;334:1044.
10. Waller A, Hakenewerth A, Ising A, Tintinalli A. NC DETECT 2007 Annual Report. In progress.
11. Centers for Disease Control and Prevention. Frequently Asked Questions. Available at: http://www.cdc.gov/nchs/about/major/ahcd/faq.htm. Accessed Nov 11, 2008; and Burt C, personal communication, Oct 8, 2008.
12. Ising A, Li M, Deyneka L, Barnett C, Scholer M, Waller A. Situational awareness using web-based annotation and custom reporting [abstract]. Adv Disease Surveil. 2007;4:167.
13. Nawar E, Niska RW, Xu J. National Hospital Ambulatory Medical Care Survey: 2005 Emergency Department Summary. Advance Data from Vital and Health Statistics, National Center for Health Statistics. No. 386, June 29, 2007. Table 2, footnote 1.
14. Burt C. Personal communication, Nov 2007.
15. Nawar E, Niska RW, Xu J. National Hospital Ambulatory Medical Care Survey: 2005 Emergency Department Summary. Advance Data from Vital and Health Statistics, National Center for Health Statistics. No. 386, June 29, 2007, p 8.
16. Nawar E, Niska RW, Xu J. National Hospital Ambulatory Medical Care Survey: 2005 Emergency Department Summary. Advance Data from Vital and Health Statistics, National Center for Health Statistics. No. 386, June 29, 2007, Tables 8, 9, 10, 11.
17. Nawar E, Niska RW, Xu J. National Hospital Ambulatory Medical Care Survey: 2005 Emergency Department Summary. Advance Data from Vital and Health Statistics, National Center for Health Statistics. No. 386, June 29, 2007, Table 14, p 22.
18. Haas S, Travers D, Tintinalli J, et al. Towards vocabulary control for chief complaint. Acad Emerg Med. 2008;15:476–82.
19. Schneider D, Appleton L, McLemore T. A reason for visit classification for ambulatory care. National Center for Health Statistics. Vital Health Stat 2. 1979;(78):i–vi, 1–63.
20. Gorelick MH, Alpern ER, Alessandrini EA. A system for grouping presenting complaints: the pediatric emergency reason for visit clusters. Acad Emerg Med. 2005;12:723–31.
21. Milburn JA, Driver CP, Youngson GG, et al. The accuracy of clinical data: a comparison between central and local data collection. Surgeon. 2007;5:275–8.
22. Corrigan JD, Selassie AW, Lineberry LA, et al. Comparison of the traumatic brain injury (TBI) model systems national dataset to a population-based cohort of TBI hospitalizations. Arch Phys Med Rehabil. 2007;88:418–26.

Supporting Information


Data Supplement S1. Emergency department (ED) visits by age group and gender.

Data Supplement S2. Emergency department (ED) visits by arrival method and age group.

Data Supplement S3. Emergency department visits by expected source of payment.

Data Supplement S4. Emergency department (ED) visit disposition as percent of all ED visits.

Data Supplement S5. Top 5 National Hospital Ambulatory Medical Care Survey (NHAMCS) ad hoc cluster groups, by age group, comparing North Carolina Disease Event Tracking and Epidemiologic Collection Tool (NC DETECT) and NHAMCS.

Data Supplement S6. Emergency department visits by primary diagnosis code clusters.

Data Supplement S7. NHAMCS 2005 emergency department patient record data collection form.

Data Supplement S8. Emergency department data elements sent to NC DETECT.

Please note: Wiley Periodicals Inc. is not responsible for the content or functionality of any supporting information supplied by the authors. Any queries (other than missing material) should be directed to the corresponding author for the article.
