Validation of a computer version of the American College of Rheumatology patient assessment questionnaire for the autonomous self-entry of self-report data in an urban rheumatology clinic


Serial collection of health surveys using standardized validated questionnaires is essential in the monitoring of treatment effectiveness and screening for impairments (1). The acquisition of patient self-report data is standard in clinical trials in rheumatology. Commonly used instruments (2) include the Health Assessment Questionnaire (HAQ) (3), the Multidimensional HAQ (MDHAQ) (4), the Clinical HAQ (5), the Arthritis Impact Measurement Scales, version 2 (6), and the Medical Outcomes Study Short Form 36 (SF-36) (7). The American College of Rheumatology (ACR) (8), the Institute of Medicine (1), and the Centers for Disease Control and Prevention (9) have recommended the use of health surveys. Survey data, however, are rarely collected during routine outpatient rheumatology clinic visits (1, 10). The instruments are often neither simple nor brief enough to administer, nor quickly scored and readily interpreted, so the data cannot be used for immediate clinical decision-making. The logistics of distribution and collection, the potential for disruption of work flow, and cost have also been cited as barriers to use (2).

Questionnaires have traditionally been distributed on paper and completed with pen or pencil. Some require a computer for scoring (the SF-36, for example); most do not (2). However, to analyze groups of responses from 1 or more patients or to correlate them with other health measurements, data entry into a computer is usually necessary. The direct unaided entry by patients of self-report data into a computer system provides both cost and labor savings in the pursuit of these goals.

A database was designed using OpenBase, a relational database management system from OpenBase International (Francestown, NH). Two database client applications were written in Detroit, MI, using the OpenStep developer system from NeXT Computer, later acquired by Apple Computer (Cupertino, CA): “Questionnaire,” which patients use to supply data, and “Questionnaire Viewer,” which the medical provider uses to recall, display, and print survey results. The ACR Patient Assessment (8), a derivative of the MDHAQ, was used because it is brief, validated, easy for patients to understand, sensitive to clinical change, and suitable across the full spectrum of rheumatic diseases. A hard-wired local area network was installed at the study site (the outpatient clinic of the rheumatology faculty of an urban medical school, with 10,000–11,000 annual visits). Two client computers were placed in kiosks in the public patient reception area. The human–computer interface consisted of a video monitor and mouse; all patient data entry was via mouse.
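The two-application design (patient-facing entry, provider-facing viewer) maps naturally onto a small relational schema. The sketch below is illustrative only — the study's actual OpenBase schema is not published — using SQLite and hypothetical table and column names (`patient`, `survey`, `response`):

```python
import sqlite3

def make_db():
    """Create an in-memory database with a hypothetical minimal schema:
    one row per patient, per completed survey, and per item response."""
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE patient (
            patient_id INTEGER PRIMARY KEY
        );
        CREATE TABLE survey (
            survey_id  INTEGER PRIMARY KEY,
            patient_id INTEGER NOT NULL REFERENCES patient(patient_id),
            taken_at   TEXT NOT NULL,      -- ISO-8601 timestamp
            mode       TEXT CHECK (mode IN ('computer', 'paper'))
        );
        CREATE TABLE response (
            survey_id INTEGER NOT NULL REFERENCES survey(survey_id),
            item_no   INTEGER NOT NULL,    -- 1-13 Likert items, 14-15 VAS
            score     REAL NOT NULL,       -- 0-3 for Likert, 0-10 for VAS
            PRIMARY KEY (survey_id, item_no)
        );
    """)
    return conn

def patient_scores(conn, patient_id):
    """The viewer role reduces to queries like this one: all of a
    patient's item scores in chronological order, for trend display."""
    return conn.execute(
        """SELECT s.taken_at, r.item_no, r.score
           FROM survey s JOIN response r USING (survey_id)
           WHERE s.patient_id = ?
           ORDER BY s.taken_at, r.item_no""",
        (patient_id,),
    ).fetchall()
```

A composite primary key on `(survey_id, item_no)` is what mechanically enforces the "no double answers" property that direct computer entry provides.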

Prior Institutional Review Board approval was obtained. For 5 months, all patients older than 18 years who presented for an appointment with the lead author were invited to participate. A trained research assistant answered questions, obtained informed consent, determined the order of survey acquisition, and timed the participants. A paper or computer questionnaire was completed prior to the patient–physician encounter, and its counterpart afterwards. The order varied. The time interval between completion of the first and commencement of the second ranged from 15 to 90 minutes.

One hundred thirty patients participated. Their ages ranged from 23 to 84 years (mean ± SD 53.89 ± 13.81). There were 118 African American patients (104 female), 11 white patients (8 female), and 1 Hispanic patient (female). The primary rheumatic diseases were osteoarthritis (n = 48), systemic lupus erythematosus (n = 28), rheumatoid arthritis (n = 26), fibromyalgia (n = 10), spondylarthropathy (n = 5), polyarthralgia (n = 4), juvenile arthritis (n = 3), mixed connective tissue disease (n = 2), and dermatomyositis, gout, scleroderma, and vasculitis (1 each). All patients met ACR criteria for diagnosis, for diseases that have such diagnostic criteria.

Mean scores and standard deviations were calculated for all questions. The intraclass correlation coefficients of the paired responses in computer and paper surveys for all items ranged from 0.75 to 0.91, indicating good correlation. The kappa scores for the 13 Likert-style items ranged from 0.50 to 0.74 (P < 0.01) (Table 1), indicating good to very good agreement.
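Agreement statistics of this kind can be computed directly from the paired item responses. The sketch below assumes an unweighted Cohen's kappa and a two-way random-effects, absolute-agreement, single-measure ICC(2,1); the report does not state which kappa weighting or ICC form was actually used:

```python
import numpy as np

def cohens_kappa(a, b):
    """Unweighted Cohen's kappa for two paired categorical ratings."""
    cats = sorted(set(a) | set(b))
    idx = {c: i for i, c in enumerate(cats)}
    m = np.zeros((len(cats), len(cats)))
    for x, y in zip(a, b):            # build the confusion matrix
        m[idx[x], idx[y]] += 1
    m /= m.sum()
    p_obs = np.trace(m)                               # observed agreement
    p_exp = (m.sum(axis=0) * m.sum(axis=1)).sum()     # chance agreement
    return (p_obs - p_exp) / (1 - p_exp)

def icc_2_1(a, b):
    """ICC(2,1): two-way random effects, absolute agreement, single measure,
    computed from the standard ANOVA mean squares."""
    data = np.column_stack([a, b]).astype(float)
    n, k = data.shape
    grand = data.mean()
    ss_rows = k * ((data.mean(axis=1) - grand) ** 2).sum()  # between subjects
    ss_cols = n * ((data.mean(axis=0) - grand) ** 2).sum()  # between raters
    ss_err = ((data - grand) ** 2).sum() - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
    )
```

Kappa treats the 0–3 Likert responses as categories and discounts chance agreement, while the ICC also credits near-misses on the continuous visual analog scales, which is why both are reported.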

Table 1. Scores on the computer versus paper version of the American College of Rheumatology Patient Assessment*

| Item | Computer, mean ± SD | Paper, mean ± SD | Cohen's kappa | ICC (95% CI) |
|---|---|---|---|---|
| 1. Dress yourself, including tying your shoelaces and doing buttons? | 0.92 ± 0.79 | 0.95 ± 0.84 | 0.66 | 0.91 (0.87–0.94) |
| 2. Get in and out of bed? | 0.85 ± 0.76 | 0.88 ± 0.76 | 0.64 | 0.86 (0.81–0.90) |
| 3. Lift a full cup or glass to your mouth? | 0.48 ± 0.79 | 0.45 ± 0.70 | 0.53 | 0.75 (0.64–0.82) |
| 4. Walk outdoors on flat ground? | 1.01 ± 0.89 | 0.97 ± 0.84 | 0.50 | 0.81 (0.73–0.86) |
| 5. Wash and dry your entire body? | 0.88 ± 0.79 | 0.88 ± 0.82 | 0.63 | 0.84 (0.77–0.88) |
| 6. Bend down to pick up clothes? | 1.14 ± 0.87 | 1.18 ± 0.87 | 0.59 | 0.85 (0.78–0.89) |
| 7. Turn regular faucets on and off? | 0.63 ± 0.79 | 0.55 ± 0.71 | 0.72 | 0.83 (0.76–0.88) |
| 8. Get in and out of a car, bus, train, or airplane? | 1.25 ± 0.87 | 1.10 ± 0.78 | 0.63 | 0.87 (0.81–0.91) |
| 9. Walk two miles? | 2.09 ± 1.10 | 2.15 ± 1.02 | 0.62 | 0.89 (0.85–0.92) |
| 10. Participate in sports and games as you like? | 2.16 ± 1.02 | 2.35 ± 0.94 | 0.58 | 0.82 (0.74–0.88) |
| 11. Get a good night's sleep? | 1.61 ± 1.06 | 1.81 ± 1.02 | 0.63 | 0.89 (0.84–0.93) |
| 12. Deal with feelings of anxiety or being nervous? | 1.13 ± 0.82 | 1.13 ± 0.88 | 0.63 | 0.88 (0.83–0.92) |
| 13. Deal with feelings of depression or feeling blue? | 1.08 ± 0.91 | 1.07 ± 0.89 | 0.74 | 0.88 (0.83–0.92) |
| Pain assessment (visual analog scale) | 6.46 ± 2.95 | 6.28 ± 2.57 | – | 0.83 (0.76–0.88) |
| Global assessment (visual analog scale) | 5.85 ± 3.25 | 5.59 ± 2.90 | – | 0.83 (0.76–0.88) |

\* Items 1–13 are scored 0–3; the visual analog scales are scored 0–10. ICC = intraclass correlation coefficient; 95% CI = 95% confidence interval.

This study showed the computer version of the ACR Patient Assessment to be comparable with its paper counterpart in this population, based on the close agreement between answers. The kappa scores compare favorably with those of the validation study of the MDHAQ (i.e., 0.65–0.81) (4). Direct computer entry by patients enhances data quality by disallowing missing, ambiguous, and double answers and by precluding the transcription errors that may occur when transferring information from paper to computer for analysis. The results can be immediately scored, displayed, and printed. Trend lines and profiles may be graphed to assist in interpretation and clinical decision-making at the point of care. The data may be incorporated into an electronic medical record and are available for secondary uses such as the compilation of outcomes data and the evaluation of the effectiveness of treatments. Our results demonstrate that it is feasible to attain reliable, autonomous computer self-entry of self-report data in an urban rheumatology clinic.