ABSTRACT


Objective. To identify characteristics that predict a valid rheumatoid arthritis (RA) or juvenile idiopathic arthritis (JIA) diagnosis among RA- and JIA-coded individuals in the General Practice Research Database (GPRD), and to assess limitations of this type of diagnostic validation.


Methods. Four RA and 2 JIA diagnostic groups were created with differing strengths of evidence of RA/JIA (Group 1 = strongest evidence), based on RA/JIA medical codes. Individuals were sampled from each group and clinical and prescription data were extracted from anonymized hospital/practice correspondence and electronic records. American College of Rheumatology and International League of Associations for Rheumatology diagnostic criteria were used to validate diagnoses. A data-derived diagnostic algorithm that maximized sensitivity and specificity was identified using logistic regression.


Results. Among 223 RA-coded individuals, the diagnostic algorithm classified individuals as having RA if they had an appropriate GPRD disease-modifying antirheumatic drug prescription or 3 other GPRD characteristics: >1 RA code during followup, RA diagnostic Group 1 or 2, and no later alternative diagnostic code. This algorithm had >80% sensitivity and specificity when applied to a test data set. Among 101 JIA-coded individuals, the strongest predictor of a valid diagnosis was a Group 1 diagnostic code (>90% sensitivity and specificity).


Conclusion. Validity of an RA diagnosis among RA-coded GPRD individuals appears high for patients with specific characteristics. The findings are important for both interpreting results of published GPRD studies and identifying RA/JIA patients for future GPRD-based research. However, several limitations were identified, and further debate is needed on how best to validate chronic disease diagnoses in the GPRD.


INTRODUCTION

The General Practice Research Database (GPRD) is an electronic database of primary care medical records for approximately 5% of the UK population. In addition to its large size (46 million person-years of anonymized data), its strengths include provision of comprehensive longitudinal prescribing and diagnostic information for all patients registered with participating practices, and generalizability: more than 98% of the UK population is registered with a general practitioner (GP), and GPRD practices are broadly representative of all practices in the UK (1). Patient data in the database include outcomes of practice consultations, hospitalizations and specialist referrals, key test results, and prescriptions. Practices are included in the GPRD only after they meet a range of data quality criteria and undergo ongoing evaluations (2).

The GPRD has proved invaluable for studying relatively uncommon rheumatic conditions, providing a statistically powerful data source that complements smaller, more expensive population-based studies. The database is used to investigate the epidemiology, natural history, and treatments for a range of rheumatic diseases (3–9). However, data are sparse on the diagnostic validity of arthritic disorders recorded in general practice.

Data in the GPRD are coded. Therefore, validations of GPRD diagnoses first entail compiling a list of relevant medical codes, with or without disease-appropriate therapeutic codes. Patients with these codes are sampled and the diagnosis is validated against a gold standard, such as evidence that a hospital specialist made the diagnosis or that specific diagnostic procedures were performed. External gold standard data are usually purchased from the practices where the patients are registered; questionnaires can be sent to GPs to ascertain specific diagnostic details, or copies of anonymized hospital correspondence and practice notes can be acquired. Obtaining external data is expensive; therefore, many validations have small sample sizes. Also, some validations confine patient sampling to highly selected practices and patients.

We carried out a large (n = 400) external validation exercise of GPRD diagnoses for 2 specific rheumatic disorders: rheumatoid arthritis (RA) and juvenile idiopathic arthritis (JIA). Our aims were to 1) select patients with medical codes that represented the range of RA/JIA codes in the GPRD, 2) identify GPRD characteristics among individuals with these codes that predicted a valid RA/JIA diagnosis, and 3) use information on patient selection and practice participation to assess the limitations of this type of validation in the GPRD.


MATERIALS AND METHODS

Categorizing RA and JIA patients.

All relevant medical RA and JIA codes were identified from the GPRD coding dictionary by 3 of the authors (AJH, SLT, LS) and verified by a professor of rheumatology (CC). We ascertained patients with an RA or JIA code recorded between 1987 and 2002 during their followup (the time they were registered with a practice that was contributing data to the GPRD). These patients were subdivided into those age ≥16 or <16 years at the time of their first code (potential RA and JIA cases, respectively).

For potential RA cases, each RA medical code recorded during followup was categorized into 1 of 4 diagnostic groups, with Group 1 having the strongest evidence of RA and Group 4 having the weakest. Group 1 included codes for seropositive or erosive RA, Group 2 comprised “rheumatoid arthritis” codes (such as RA of knee), Group 3 codes were for systemic manifestations of RA (e.g., myopathy due to RA), and Group 4 codes were for seronegative RA or other weak evidence for RA. Individuals were categorized according to the highest RA group in their GPRD record; for example, an individual with a mixture of Group 1 and Group 3 RA codes was classified in Group 1. In total, 31,830 individuals had a first RA code at age ≥16 years (Group 1 = 1,998, Group 2 = 25,740, Group 3 = 525, Group 4 = 3,567).
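To illustrate this categorization rule in code, the sketch below (Python, used throughout for illustration) assigns each individual to the strongest group present in their record. The code-to-group mapping shown is hypothetical; the full code lists are available from the authors rather than reproduced here.

```python
# Illustrative sketch only: assign an RA-coded individual to the strongest
# (lowest-numbered) diagnostic group present in their GPRD record.
# The code-to-group mapping below is hypothetical.
RA_CODE_GROUP = {
    "seropositive RA": 1,      # Group 1: seropositive or erosive RA
    "erosive RA": 1,
    "RA of knee": 2,           # Group 2: "rheumatoid arthritis" codes
    "myopathy due to RA": 3,   # Group 3: systemic manifestations of RA
    "seronegative RA": 4,      # Group 4: weakest evidence
}

def assign_ra_group(codes_in_followup):
    """Return the individual's RA diagnostic group (1 = strongest evidence)."""
    return min(RA_CODE_GROUP[code] for code in codes_in_followup)

# An individual with a mixture of Group 1 and Group 3 codes is classified in Group 1.
assert assign_ra_group(["myopathy due to RA", "seropositive RA"]) == 1
```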

For JIA cases, individuals age <16 years at the time of their first code were categorized into 2 groups. Those in Group 1 had codes that matched most closely the general definition or any of the 7 subtypes of the 2001 International League of Associations for Rheumatology (ILAR) revised JIA classification, and/or codes that GPs were likely to choose to represent these subtypes; for example, pauciarticular-onset juvenile chronic arthritis, RA, or psoriatic arthritis (10). However, other young GPRD patients had nonspecific arthritis codes such as juvenile polyarthritis or monarthritis, with no subsequent diagnostic code to explain this arthritis. Because these codes could also represent JIA, we categorized these patients into JIA Group 2. After categorization, 2,901 individuals were identified with a JIA code at age <16 years (Group 1 = 868, Group 2 = 2,033). Both RA and JIA code lists are available from the authors.

Patient sampling.

Our research funds allowed us to include in the study 300 RA-coded patients (80 each from Groups 1 and 4, 120 from Group 2, 20 from Group 3) and 100 JIA-coded patients (30 from Group 1, 70 from Group 2). We first excluded individuals with no available external data because they had died, left the practice, or were registered with a practice that does not take part in research studies. Only 253 (34%) GPRD practices were listed as willing to provide copies of patient correspondence at the time of our study. We then derived a random sample from each group with an approximate 20% excess to accommodate possible additional losses after the end of study followup in 2002.

Obtaining patient correspondence.

Additional Information Services (AIS), the company that acts as an intermediary between GPRD practices and researchers, telephoned practices to check that the selected patients were still registered and that the practice was willing to participate. A standardized letter was then sent to each practice, requesting all (anonymized) correspondence and test results in the patient notes relating to the patient's arthritis.

Applying diagnostic criteria.

We used the 1987 American College of Rheumatology (ACR; formerly the American Rheumatism Association) revised RA classification criteria (both the 4 of 7 and classification tree formats) with the MacGregor et al modification, which increased diagnostic sensitivity by incorporating lifetime evidence of disease (11, 12). For JIA, we used the 2001 ILAR criteria (10). We identified factors that were likely to be recorded accurately in the database and/or were relevant to diagnostic accuracy; these included patient characteristics, RA/JIA diagnostic group, number of RA/JIA codes and relevant symptom/investigation codes in their GPRD record, evidence of each element of the ACR/ILAR criteria, timing of arthritis drug prescriptions, and alternative diagnoses for RA/JIA suggested in patients' electronic records or written correspondence. Two authors (AJH and SLT) independently extracted these data from the correspondence/computerized records onto a prepiloted case validation form. We applied the ACR/ILAR criteria, categorizing each patient as either 1) fulfilling the criteria, 2) not fulfilling the criteria, 3) indeterminate (unclear whether the criteria were fulfilled), or 4) having insufficient data to apply the criteria. Results were compared, but in order to maximize accuracy we referred all records with disagreement or uncertainty about the diagnosis to a consultant rheumatologist (CJE) for a third opinion. We also referred records where the criteria were fulfilled but an alternative diagnosis was proposed in hospital correspondence, or where criteria were not fulfilled but a diagnosis of RA/JIA was suggested in a rheumatology clinic. Referred cases were categorized (CJE) as having definite, probable, possible, or unlikely RA/JIA, or having insufficient data to make a diagnosis.
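For orientation, a minimal sketch of the 4 of 7 list format is given below. The seven criteria are those of the published 1987 ACR classification; the record field names are hypothetical, and neither the classification-tree format nor the MacGregor lifetime modification used in the study is implemented here.

```python
# Hedged sketch: the "4 of 7" list format of the 1987 ACR classification
# criteria for RA. Field names are hypothetical; the tree format and the
# MacGregor lifetime modification are not implemented.
ACR_1987_LIST_CRITERIA = (
    "morning_stiffness_at_least_1_hour",
    "arthritis_of_3_or_more_joint_areas",
    "arthritis_of_hand_joints",
    "symmetric_arthritis",
    "rheumatoid_nodules",
    "positive_serum_rheumatoid_factor",
    "radiographic_changes_typical_of_ra",
)

def fulfils_acr_list_format(record):
    """True if at least 4 of the 7 criteria are documented for the patient."""
    return sum(bool(record.get(criterion)) for criterion in ACR_1987_LIST_CRITERIA) >= 4
```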

Statistical analysis.

Data were analyzed using Stata, version 9 (Stata Corporation, College Station, TX). Individuals with possible RA/JIA or with insufficient data were excluded from main analyses. Remaining patients were categorized as having RA/JIA if they fulfilled the ACR/ILAR criteria on initial assessment and this decision was not overruled (by CJE), or if they had definite or probable RA/JIA on expert opinion. Similarly, individuals were classified as not having RA/JIA if they did not fulfill the diagnostic criteria and this decision was not later overruled, if they were classified as having unlikely RA/JIA on expert opinion, or if their GP reported that the patient had either been wrongly coded as having RA/JIA or had no mention of RA/JIA anywhere in their full medical record.

We compared GPRD characteristics of individuals with confirmed RA/JIA with those of individuals without RA/JIA, using cross-tabulations to assess the accuracy of each characteristic in predicting a valid diagnosis in terms of both sensitivity (the proportion of individuals with a valid diagnosis who had the characteristic) and specificity (the proportion of individuals without a valid diagnosis who did not have the characteristic). Because each characteristic formed a categorical variable, we used logistic regression to identify discriminant functions (13). Univariable odds ratios (ORs) were obtained for each characteristic using the validated RA diagnosis as the outcome. A multivariable logistic model was set up to develop a data-derived diagnostic algorithm, i.e., the combination of GPRD characteristics that best predicted a valid diagnosis. Characteristics with an OR ≥2.0 and/or a P value less than 0.1 on univariable analysis were added sequentially to the model and kept if they retained an association with RA. The log ORs for GPRD characteristics in the final model were used to derive an RA score for each individual by assigning to the individual the value of the log OR for each characteristic present in their GPRD record, and then summing these values to produce the individual's overall score (13). We determined the cutoff score that best separated individuals without a valid diagnosis (scores below the cutoff) from those with a valid diagnosis (scores above the cutoff). All possible cutoffs were examined to identify the cutoff that maximized diagnostic sensitivity and specificity. The validity of the algorithm was first examined in the whole sample; the assessment was then repeated by randomly splitting the data set in half, recreating the algorithm in one half and testing it in the second half.
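The score-and-cutoff procedure can be sketched as follows (in Python rather than Stata, for illustration only). Column names are hypothetical, and the cutoff here is chosen to maximize the sum of sensitivity and specificity, which is one reasonable reading of "maximized diagnostic sensitivity and specificity."

```python
# Illustrative sketch (not the authors' Stata code) of the score-and-cutoff
# procedure: fit a logistic model on binary GPRD characteristics, score each
# individual as the sum of the log odds ratios for the characteristics present,
# and choose the cutoff that maximizes sensitivity + specificity.
import numpy as np
import statsmodels.api as sm

def build_score_and_cutoff(df, predictors, outcome):
    """df: one row per patient (pandas DataFrame); predictors and outcome are 0/1 columns."""
    X = sm.add_constant(df[predictors].astype(float))
    fit = sm.Logit(df[outcome].astype(float), X).fit(disp=False)
    log_or = fit.params.drop("const")            # log ORs for each characteristic

    # RA score = sum of log ORs over the characteristics present in the record.
    score = df[predictors].astype(float) @ log_or

    truth = df[outcome].astype(bool).to_numpy()
    best = None                                  # (cutoff, sensitivity, specificity)
    for cut in np.unique(score):
        pred = score.to_numpy() > cut
        sens = (pred & truth).sum() / truth.sum()
        spec = (~pred & ~truth).sum() / (~truth).sum()
        if best is None or sens + spec > best[1] + best[2]:
            best = (cut, sens, spec)
    return log_or, best
```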

Ethics approval.

Approval for the study was obtained from the Scientific and Ethical Advisory Group of the GPRD.


RESULTS

Figure 1 shows the sampling of eligible individuals with RA or JIA codes. In total, 18,771 (54%) patients were registered with practices that did not participate in external validations of this type. An additional 4,471 patients left their GPRD practice or died during followup. We randomly sampled 479 patients from the remaining individuals; AIS identified 11 of these as having transferred out or died since 2002. The final 468 patients (349 RA, 119 JIA) came from 179 practices with a median of 2 patients (range 1–22) per practice.


Figure 1. Sampling of eligible patients from the rheumatoid arthritis (RA) and juvenile idiopathic arthritis (JIA) data sets. 1RA: 1,998 in Group 1, 25,740 in Group 2, 525 in Group 3, 3,567 in Group 4, JIA: 868 in Group 1, 2,033 in Group 2; 2RA: 102 in Group 1, 138 in Group 2, 27 in Group 3, 92 in Group 4, JIA: 36 in Group 1, 84 in Group 2; 3RA: 74 in Group 1, 109 in Group 2, 10 in Group 3, 61 in Group 4, JIA: 30 in Group 1, 64 in Group 2; 4RA: 105 in Group 1, 124 in Group 2, 18 in Group 3, 72 in Group 4, JIA: 36 in Group 1, 65 in Group 2; GPRD = General Practice Research Database; AIS = Additional Information Services; GP = general practitioner.


On initial contact with AIS, 146 practices reported that they could help with 348 (74%) patients, either with all requests (98 practices) or with some requests (48 practices). Twenty-four practices could not help with any patients and 9 did not respond to repeated requests. AIS did not routinely report why practices could not help. To achieve our target sample size, we randomly sampled replacement patients from the practices that said they could help with at least 1 patient. An additional 72 patients were added to the original 348 patients, totaling 420 patients, including 319 RA patients (71% female) and 101 JIA patients (57% female) with a median age at diagnosis of 51 years (interquartile range [IQR] 42–61) and 7 years (IQR 4–10.5), respectively.

Figure 2 shows subsequent responses from practices for these 420 patients. Despite initial agreement, practices for 71 (17%) patients (61 RA, 10 JIA) did not participate: practices for 49 patients refused to provide data, 7 patients had left the practice or died, 7 patients could not be identified (see Figure 2 legend for detailed explanation), and practices for 8 patients did not respond.


Figure 2. Receipt of correspondence and results of validation of rheumatoid arthritis (RA) and juvenile idiopathic arthritis (JIA) diagnoses. ** = Patient's notes lost (n = 1 RA + 1 JIA), cannot identify patient (n = 2 RA), GP left practice (n = 3 RA); GP = general practitioner; ACR = American College of Rheumatology; ILAR = International League of Associations for Rheumatology.


Practices sent data for 323 patients and reported that for a further 26 patients (11 RA, 15 JIA), either the patient had been miscoded and did not have RA/JIA or there was no mention of RA/JIA anywhere in the patient's written records. Among the patients with data, 118 fulfilled and 102 did not fulfill the ACR/ILAR diagnostic criteria, classification was unclear for 47, and opinion was divided for 38 (Figure 2). The remaining 18 patients had insufficient data to apply the criteria; these had limited GP notes and no hospital referrals or had only recent hospital data because practices had not kept older correspondence.

We sent 97 (30%) sets of notes for further opinion. After combining results from applying the criteria, expert opinion, and GP feedback about wrongly coded patients, 125 (48%) of the 258 RA-coded patients and 26 (29%) of the 91 JIA-coded patients were classified as having RA/JIA, 99 and 56 were classified as not having RA/JIA, 13 and 2 were classified as having possible RA/JIA, and 21 and 7 were unclassifiable, respectively. The most common true diagnosis for those without RA was osteoarthritis; other diagnoses included psoriatic arthritis, polymyalgia rheumatica, cervical spondylitis, epicondylitis, rotator cuff syndrome, and (among those with seronegative RA codes) nonspecific arthralgia with a negative rheumatoid factor test. Among Group 2 JIA-coded patients, the most common alternative diagnosis was irritable hip.

RA analyses.

The validity of specific GPRD characteristics for an RA diagnosis is shown in Table 1 (code lists available from the authors). Six characteristics had ORs ≥2.0 on univariable analysis. Of these, having >1 RA code during followup had high sensitivity and specificity (≥80%). High specificity but low sensitivity was demonstrated for 2 other variables: 1) having a disease-modifying antirheumatic drug (DMARD) prescription after the first RA GPRD code with no alternative indication for that DMARD in the previous 5 years, and 2) having an oral steroid prescription after the first RA code. Three variables (being in RA diagnostic Group 1 or 2, having 2 NSAID prescriptions within a 6-month period, and having no alternative diagnosis for RA in the GPRD record after the last RA code) all had high sensitivity but poor specificity. Three variables had a weaker association with a valid RA diagnosis (Table 1); age and sex were not associated with a valid diagnosis (data not shown).

Table 1. Comparison of characteristics of individuals with a valid and invalid RA diagnosis (n = 224)*

| Characteristic | RA (n = 125) | Not RA (n = 99) | Sensitivity, % | Specificity, % | OR (95% CI) | P |
|---|---|---|---|---|---|---|
| >1 RA code (on different dates) | | | | | | |
| Yes | 100 | 19 | 80 | 81 | 16.84 (8.66–32.75) | < 0.001 |
| No | 25 | 80 | | | 1.00 | |
| RA diagnostic group‡ | | | | | | |
| 1 or 2 | 116 | 50 | 93 | 49 | 12.63 (5.77–27.67) | < 0.001 |
| 3 or 4 | 9 | 49 | | | 1.00 | |
| No. of joint symptom/investigation codes after first RA code | | | | | | |
| ≥2 | 87 | 55 | 70 | 44 | 1.83 (1.06–3.17) | 0.030 |
| <2 | 38 | 44 | | | 1.00 | |
| 2 NSAID prescriptions in GPRD within 6 months of each other | | | | | | |
| Yes | 116 | 72 | 93 | 27 | 4.65 (2.06–10.49) | < 0.001 |
| No | 9 | 26 | | | 1.00 | |
| ≥1 DMARD prescription in GPRD record with no prior alternative indication for the DMARD§ | | | | | | |
| Yes | 97 | 4 | 78 | 96 | 81.41 (27.50–241.02) | < 0.001 |
| No | 28 | 94 | | | 1.00 | |
| ≥1 oral steroid prescription in GPRD¶ | | | | | | |
| Yes | 46 | 18 | 37 | 82 | 2.59 (1.38–4.85) | 0.002 |
| No | 79 | 80 | | | 1.00 | |
| ≥1 steroid injection in GPRD¶ | | | | | | |
| Yes | 26 | 12 | 21 | 86 | 1.88 (0.90–3.95) | 0.092 |
| No | 99 | 86 | | | 1.00 | |
| Alternative diagnosis in GPRD record before first RA code? | | | | | | |
| No | 98 | 76 | 79 | 23 | 1.10 (0.58–2.07) | 0.771 |
| Yes | 26 | 23 | | | 1.00 | |
| Alternative diagnosis in GPRD record after last RA code? | | | | | | |
| No | 107 | 58 | 86 | 40 | 4.20 (2.22–7.97) | < 0.001 |
| Yes | 18 | 41 | | | 1.00 | |

* RA = rheumatoid arthritis; OR = odds ratio; 95% CI = 95% confidence interval; NSAID = nonsteroidal antiinflammatory drug; GPRD = General Practice Research Database; DMARD = disease-modifying antirheumatic drug.
† Prescription data missing for 1 individual with an invalid diagnosis; analyses for DMARDs, steroids, and NSAIDs based on 223 individuals.
‡ Four levels of strength of evidence based on GPRD codes, where 1 = strongest evidence and 4 = weakest evidence (see Materials and Methods for detailed explanation).
§ Prescriptions after first RA code with no medical code for an alternative indication for the specific DMARD in the 5 years before the first prescription date.
¶ Prescriptions after first RA code.
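As a worked check of the first row of Table 1 (>1 RA code), the sensitivity, specificity, and odds ratio follow directly from the tabulated 2 × 2 counts:

$$
\text{sensitivity} = \frac{100}{125} = 80\%, \qquad \text{specificity} = \frac{80}{99} \approx 81\%, \qquad \text{OR} = \frac{100 \times 80}{19 \times 25} \approx 16.8
$$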

Results of multivariable analyses of RA-coded data are shown in Table 2. Four characteristics remained associated with a valid RA diagnosis in the final model. The log OR for an appropriate DMARD prescription was more than twice that of any other characteristic, and 87% sensitivity (95% confidence interval [95% CI] 80–93%) and 88% specificity (95% CI 80–94%) were obtained with a cutoff score (4.01) just below this log OR. Using this cutoff and the log ORs for the 4 variables, we obtained a diagnostic algorithm, whereby an individual would be classified as having RA (score >4.01) if he or she had either an appropriate GPRD DMARD prescription or no appropriate DMARD prescription but all 3 other GPRD characteristics (being in RA Group 1 or 2, having no alternative GPRD diagnosis for RA after the last RA code, and having >1 RA code during followup).
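The final algorithm can be expressed compactly in code. The sketch below uses the whole-sample log ORs and the 4.01 cutoff from Table 2; the record field names are hypothetical, and this is an illustration rather than the software used in the study.

```python
# Minimal sketch of the final data-derived RA algorithm (whole-sample log ORs
# from Table 2; cutoff 4.01). Record field names are hypothetical.
LOG_OR = {
    "dmard_no_alt_indication": 4.02,   # appropriate DMARD prescription
    "ra_group_1_or_2": 1.85,           # RA diagnostic Group 1 or 2
    "multiple_ra_codes": 1.64,         # >1 RA code on different dates
    "no_later_alt_diagnosis": 0.95,    # no alternative diagnosis after last RA code
}
CUTOFF = 4.01

def classify_ra(record):
    """Score a GPRD record and classify as RA if the score exceeds 4.01,
    i.e., an appropriate DMARD prescription OR all 3 other characteristics."""
    score = sum(weight for key, weight in LOG_OR.items() if record.get(key))
    return score > CUTOFF

# Worked example matching the Table 2 footnote: Group 1/2, >1 RA code, and no
# later alternative diagnosis, but no DMARD -> 1.85 + 1.64 + 0.95 = 4.44 > 4.01.
assert classify_ra({"ra_group_1_or_2": True, "multiple_ra_codes": True,
                    "no_later_alt_diagnosis": True}) is True
```

With these weights, an appropriate DMARD prescription alone (4.02) exceeds the cutoff, whereas the 3 non-DMARD characteristics exceed it only in combination (1.85 + 1.64 + 0.95 = 4.44), reproducing the rule described above.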

Table 2. Multivariable analyses of the whole data set and test data set of individuals with valid and invalid rheumatoid arthritis (RA) diagnoses*

Columns 2–4 refer to the whole validation data set (n = 223)†; columns 5–7 refer to the test data set (n = 112)‡.

| Characteristic | Adjusted OR (95% CI)§ | Log OR¶ | P | Adjusted OR (95% CI)§ | Log OR¶ | P |
|---|---|---|---|---|---|---|
| >1 RA code (on different dates) | | | | | | |
| Yes | 5.18 (2.06–12.98) | 1.64 | < 0.001 | 4.15 (1.18–14.66) | 1.42 | 0.027 |
| No | 1.00 | | | 1.00 | | |
| RA diagnostic group# | | | | | | |
| 1 or 2 | 6.33 (1.84–21.72) | 1.85 | 0.003 | 4.21 (0.93–19.12) | 1.44 | 0.063 |
| 3 or 4 | 1.00 | | | 1.00 | | |
| ≥1 DMARD prescription in GPRD record with no prior alternative indication for the DMARD** | | | | | | |
| Yes | 55.53 (15.15–203.55) | 4.02 | < 0.001 | 53.91 (10.23–283.99) | 3.99 | < 0.001 |
| No | 1.00 | | | 1.00 | | |
| Alternative diagnosis on computer after last RA code? | | | | | | |
| No | 2.58 (0.94–7.11) | 0.95 | 0.066 | 2.23 (0.54–9.29) | 0.80 | 0.270 |
| Yes | 1.00 | | | 1.00 | | |

* OR = odds ratio; 95% CI = 95% confidence interval; DMARD = disease-modifying antirheumatic drug; GPRD = General Practice Research Database.
† One individual with missing therapy data excluded from analyses.
‡ Randomly selected 50% of individuals with confirmed and unlikely RA (see Materials and Methods for detailed explanation).
§ Adjusted for other variables in the table.
¶ The log ORs for each GPRD characteristic in the model were used to obtain an individual's score. For example, an individual in RA diagnostic Group 1 who had 3 RA codes during followup, no alternative diagnosis in their GPRD record after their last RA code, and no DMARD prescription would have a total score of 1.85 + 0.95 + 1.64 = 4.44. This is greater than the cutoff score of 4.01, so the individual would be classified as having RA.
# Four levels of strength of evidence based on GPRD codes, where 1 = strongest evidence and 4 = weakest evidence.
** Prescriptions after first RA code with no medical code for an alternative indication for a specific DMARD in the 5 years before the first prescription date.

After randomly splitting the data in half, recreating the model and cutoff in one half and testing it in the other half, the same 4 variables remained in the final model (Table 2). The same diagnostic algorithm had a sensitivity of 84% (95% CI 73–94%) and a specificity of 86% (95% CI 72–92%) for a valid RA diagnosis when tested on the other half of the data.

We investigated alternative algorithms. Dropping the least strongly associated variable (an alternative diagnosis to RA) from the model gave an algorithm with lower specificity (79%). Using a simpler DMARD characteristic (dropping the requirement for no prior alternative indication for the DMARD) gave an algorithm of similar validity to the original (87% sensitivity, 86% specificity). A higher cutoff score of 4.03 gave an algorithm with higher specificity (91%), classifying individuals as having RA if they had either a DMARD code and any 1 of the 3 other characteristics or all 3 other characteristics; however, this algorithm could not be recreated in the test data set because the combined score from the 3 non-DMARD characteristics was less than the score for the DMARD variable (Table 2).

JIA analyses.

The major predictor of a valid JIA diagnosis was the JIA diagnostic group. In total, 25 of 26 of those classified as having JIA were in JIA Group 1 (96% sensitivity) compared with 4 of 56 of those with unlikely JIA (93% specificity); only 1 of the 53 children in Group 2 was classified as having JIA. Remaining analyses therefore focused on the 29 children in diagnostic Group 1 (Table 3). Cross-tabulations produced many zero cells and therefore, ORs were not estimated. The strongest associations with a valid JIA diagnosis were having 2 NSAID prescriptions within 6 months and having >1 JIA or other arthritis code during followup.

Table 3. Comparison of characteristics of individuals with a valid and invalid JIA diagnosis among patients with a Group 1 JIA code*

All patients are JIA Group 1 cases (n = 29).

| Characteristic | JIA (n = 25) | Not JIA (n = 4) | P† | Sensitivity, % | Specificity, % |
|---|---|---|---|---|---|
| >1 specific JIA code (on different dates) | | | | | |
| Yes | 18 | 0 | | 72 | 100 |
| >1 specific JIA code and/or nonspecific arthritis code | | | | | |
| Yes | 21 | 0 | | 84 | 100 |
| 2 NSAID prescriptions in GPRD within 6 months of each other | | | | | |
| Yes | 24 | 0 | | 96 | 100 |
| No | 1 | 4 | < 0.001 | | |
| ≥1 DMARD prescription in GPRD record with no prior alternative indication for the DMARD‡ | | | | | |
| Yes | 10 | 0 | | 40 | 100 |
| ≥1 oral steroid prescription in GPRD§ | | | | | |
| Yes | 4 | 0 | | 16 | 100 |
| ≥1 steroid injection in GPRD§ | | | | | |
| Yes | 2 | 0 | | 8 | 100 |
| Alternative diagnosis in GPRD record before first JIA code? | | | | | |
| No | 22 | 3 | | 88 | 25 |
| Alternative diagnosis in GPRD record after last JIA code? | | | | | |
| No | 25 | 4 | | 100 | 0 |

* JIA = juvenile idiopathic arthritis; NSAID = nonsteroidal antiinflammatory drug; GPRD = General Practice Research Database; DMARD = disease-modifying antirheumatic drug. Patients had a GPRD code consistent with the 2001 International League of Associations for Rheumatology revised classification of JIA.
† Fisher's exact test.
‡ Prescriptions after first JIA code with no medical code for an alternative indication for the specific DMARD in the 5 years before the first prescription date.
§ Prescriptions after first JIA code.

Application to all RA/JIA coded individuals.

Among the 31,830 RA-coded patients age ≥16 years in the original GPRD data set, 15,746 (49%) had an appropriate DMARD prescription, 27,738 (87%) were in RA Groups 1 or 2, 16,300 (51%) had >1 RA code during followup, and 27,184 (85%) had no alternative diagnostic code after their last RA code. In total, 19,492 (61%) RA-coded patients fulfilled the diagnostic algorithm. For JIA, there were 868 individuals with a Group 1 JIA code.

Comparison of participants and nonparticipants.

We compared GPRD characteristics of the 224 RA patients categorized as having a valid or invalid RA diagnosis with the 34 possible and unclassifiable patients and the remaining 150 RA-coded patients from the original 414 selected (excluding the 6 known to have left or died). The groups were similar with respect to most characteristics (Table 4). Those who did not participate were slightly older at the time of their first RA code. The proportion of patients with a GPRD DMARD prescription was substantially lower in the possible/unclassifiable group compared with those included in algorithm analyses, and was slightly higher in the nonparticipant group.

Table 4. Characteristics of RA patients who participated and who did not participate in the validation study*

| Characteristic | Participated: included in analyses (n = 224) | Participated: possible or unclassifiable diagnosis (n = 34) | Invited but did not participate (n = 150)† | P‡ |
|---|---|---|---|---|
| Age on 01/01/2003, years | 62.5 (53.8–72.5) | 58.5 (49.0–68.0) | 65.0 (56.5–75.5) | 0.117 |
| Age at first RA code, years | 53.0 (43.0–62.4) | 47.6 (40.2–55.8) | 56.0 (46.4–66.6) | 0.080 |
| Women, % | 68 | 79 | 70 | 0.941 |
| Duration of RA on 01/01/2003, years | 9.1 (8.1–10.0) | 9.7 (8.1–11.0) | 9.0 (7.8–9.7) | 0.787 |
| RA diagnostic Group 1 or 2, % | 73 | 58 | 66 | 0.272 |
| DMARD prescription, %§ | 46 | 18 | 52 | 0.061 |

* Values are the median (interquartile range) unless otherwise specified. RA = rheumatoid arthritis; DMARD = disease-modifying antirheumatic drug.
† Excludes 6 patients reported to have died or left the practice.
‡ Comparing the 258 individuals in the validation study with the 150 who did not participate. Pearson's chi-square test was used for categorical variables; a nonparametric K-sample test on the equality of medians was used for continuous data.
§ Prescription after first RA/juvenile idiopathic arthritis code with no alternative indication for that DMARD in the previous 5 years.


DISCUSSION

Most validations of GPRD diagnoses assess the proportion of patients with a predefined set of medical and/or therapeutic codes who have a valid diagnosis, i.e., the positive predictive value (PPV) of these codes. Often, sampling is from patients selected for a specific study, further restricting validation to those who fulfill the study's inclusion criteria. Unlike sensitivity and specificity, the PPV varies with the prevalence of the condition; the prevalence of RA (and thus the PPV) among RA-coded individuals may be higher in subpopulations with specific comorbidities or prescriptions. Therefore, results may not be generalizable to the wider GPRD population.
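The dependence of the PPV on prevalence follows directly from Bayes' theorem: for a code-based case definition with fixed sensitivity and specificity,

$$
\mathrm{PPV} = \frac{\text{sensitivity} \times \text{prevalence}}{\text{sensitivity} \times \text{prevalence} + (1 - \text{specificity}) \times (1 - \text{prevalence})},
$$

so the same code list yields a higher PPV wherever the condition is more common.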

In contrast, our aim was to identify characteristics in GPRD electronic records that were associated with a valid RA/JIA diagnosis among individuals with any one of a wide range of RA or JIA codes. We used these characteristics to develop a diagnostic algorithm with optimum validity for an RA diagnosis. Applying established diagnostic gold standards for RA and JIA to patient medical data allowed us to assess the diagnosis independently from GP opinion.

To our knowledge, the only previous external validation of GPRD RA diagnoses was carried out in a study of the association between naproxen and thromboembolic events among RA patients (4). These patients were age 40–79 years with an RA code, a prescription for an NSAID, DMARD, or systemic steroid, and without specific comorbid conditions. A questionnaire was sent to GPs to verify the RA diagnosis for different subgroups of patients. Details of the questionnaire and practice response rates were not reported; the PPV of the RA code combination varied from 74–80%. This RA case definition could have limited specificity; our study (using stricter criteria for timing of prescriptions) found 25% specificity for RA-coded individuals with any of these prescriptions.

In JIA analyses, a Group 1 code was the major predictor of a valid diagnosis, having high sensitivity and specificity. Our search for possible JIA cases among those with other unexplained arthritis codes identified only one individual with probable JIA who had a single “arthritis” code. This suggests that JIA cases are uncommon among those without specific JIA codes, but may occur. Further characterization of a valid JIA diagnosis among those with specific codes was limited by small numbers.

It is difficult to gauge the generalizability of GPRD validation results without sufficient details of study methodology. An explicit aim of our study was to provide information on selection processes and practice response rates to assess limitations of this type of external validation. We could not randomly select patients because only a third of practices were listed as willing to participate in external validations. Among selected practices, 45% could not help with one or more patients, and those who initially agreed subsequently did not provide data for an additional 57 patients. Some nonparticipation may have occurred because patients died or left the practices, but we did not have data to quantify this. A few practices reported that they would not photocopy extensive hospital correspondence for the remuneration offered. Among participants, we could not apply diagnostic criteria for some patients with longstanding disease because practices retained only recent correspondence.

Incomplete participation may have affected our findings in various ways. Accuracy of diagnosis and coding may differ between practices that agree to participate in research studies and those that refuse. Patients excluded because they died or had large case files are likely to have included those with longstanding and/or severe RA disease, with probable good evidence of the diagnosis and GPRD characteristics similar to those included in the diagnostic algorithm (e.g., an appropriate DMARD prescription). This, together with the oversampling of individuals with Group 3 and 4 RA codes to assess their validity and the probable lower proportion of hospital-diagnosed RA cases compared with JIA cases, could explain the relatively low proportion of valid RA diagnoses in the validation sample. In contrast, those with insufficient data comprised both those with missing hospital correspondence and those with no hospital referrals; these are likely to include both valid and invalid diagnoses. It is unclear whether the misuse of particular RA codes observed in this study, such as seronegative RA codes for patients with arthralgia and a negative rheumatoid factor test, occurs throughout GPRD practices.

Our planned sample size of 400, although large compared with many previous GPRD validations, was restricted by available research funds. Acquiring a set of patient notes is costly: a nonrefundable £70. Therefore, we focused on detailing characteristics of a valid RA diagnosis among RA-coded patients and did not assess the negative predictive value of RA diagnoses; i.e., the proportion of those without an RA code who did not have RA. This presents a formidable challenge, requiring sampling from the very large number of GPRD patients with codes for other arthritic conditions or symptoms. Few validation studies have attempted this aspect of diagnostic validity.

Our findings can be used to interpret results of published RA GPRD studies, and the diagnostic algorithm can be used or adapted to select RA patients for future GPRD-based research. Our study period finished 2 years before biologic therapies were approved for widespread use in the UK. Later use of these agents may influence our algorithm, but the algorithm appears valid for data recorded before 2002. Reappraisal of the algorithm incorporating biologic therapies will need to wait for adequate followup and accumulation of sufficient subjects in the GPRD cohort. Our methods could also be applied to investigate the diagnostic validity of other arthritic disorders in the GPRD. However, our study has highlighted several limitations of this type of external validation, and further debate is needed on how best to validate chronic disease diagnoses in general practice data.


AUTHOR CONTRIBUTIONS

Dr. Thomas had full access to all of the anonymized data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.

Study design. Thomas, Smeeth, Cooper, Hall.

Acquisition of data. Thomas, Hall.

Analysis and interpretation of data. Thomas, Edwards, Smeeth, Cooper, Hall.

Manuscript preparation. Thomas, Edwards, Smeeth, Cooper, Hall.

Statistical analysis. Thomas.

Validated RA/JIA diagnoses in the patient sample. Thomas, Edwards, Hall.


REFERENCES
1. Office for National Statistics. Key health statistics from general practice 1996: series MB6 No1. London: Office for National Statistics; 1998.
2. Walley T, Mantgani A. The UK General Practice Research Database. Lancet 1997;350:1097–9.
3. Edwards CJ, Arden NK, Fisher D, Saperia JC, Reading I, Van Staa TP, et al. The changing use of disease-modifying anti-rheumatic drugs in individuals with rheumatoid arthritis from the United Kingdom General Practice Research Database. Rheumatology (Oxford) 2005;44:1394–8.
4. Watson DJ, Rhodes T, Cai B, Guess HA. Lower risk of thromboembolic cardiovascular events with naproxen among patients with rheumatoid arthritis [published erratum appears in Arch Intern Med 2002;162:1779]. Arch Intern Med 2002;162:1105–10.
5. Somers EC, Thomas SL, Smeeth L, Schoonen WM, Hall AJ. Incidence of systemic lupus erythematosus in the United Kingdom, 1990–1999. Arthritis Rheum 2007;57:612–8.
6. Mikuls TR, Farrar JT, Bilker WB, Fernandes S, Schumacher HR Jr, Saag KG. Gout epidemiology: results from the UK General Practice Research Database, 1990–1999. Ann Rheum Dis 2005;64:267–72.
7. Hubbard R, Venn A. The impact of coexisting connective tissue disease on survival in patients with fibrosing alveolitis. Rheumatology (Oxford) 2002;41:676–9.
8. De Vries F, Bracke M, Leufkens HG, Lammers JW, Cooper C, van Staa TP. Fracture risk with intermittent high-dose oral glucocorticoid therapy. Arthritis Rheum 2007;56:208–14.
9. Burnham JM, Shults J, Weinstein R, Lewis JD, Leonard MB. Childhood onset arthritis is associated with an increased risk of fracture: a population based study using the General Practice Research Database. Ann Rheum Dis 2006;65:1074–9.
10. Petty RE, Southwood TR, Manners P, Baum J, Glass DN, Goldenberg J, et al, for the International League of Associations for Rheumatology. International League of Associations for Rheumatology classification of juvenile idiopathic arthritis: second revision, Edmonton, 2001. J Rheumatol 2004;31:390–2.
11. Arnett FC, Edworthy SM, Bloch DA, McShane DJ, Fries JF, Cooper NS, et al. The American Rheumatism Association 1987 revised criteria for the classification of rheumatoid arthritis. Arthritis Rheum 1988;31:315–24.
12. MacGregor AJ, Bamber S, Silman AJ. A comparison of the performance of different methods of disease classification for rheumatoid arthritis: results of an analysis from a nationwide twin study. J Rheumatol 1994;21:1420–6.
13. Quigley MA, Chandramohan D, Rodrigues LC. Diagnostic accuracy of physician review, expert algorithms and data-derived algorithms in adult verbal autopsies. Int J Epidemiol 1999;28:1081–7.