Quality by Any Other Name?: A Comparison of Three Profiling Systems for Assessing Health Care Quality

Authors

  • Eve A. Kerr,

    1. Center for Practice Management and Outcomes Research, Veterans Affairs Ann Arbor Healthcare System, Ann Arbor, MI,
    2. Department of Internal Medicine, University of Michigan Medical School, Ann Arbor, MI,
    • Address correspondence to Eve A. Kerr, M.D., M.P.H., Center for Practice Management and Outcomes Research, Veterans Affairs Ann Arbor Healthcare System, Ann Arbor, MI, and the Department of Internal Medicine, University of Michigan Medical School, Ann Arbor, MI.

  • Timothy P. Hofer,

    1. Center for Practice Management and Outcomes Research, Veterans Affairs Ann Arbor Healthcare System, Ann Arbor, MI,
    2. Department of Internal Medicine, University of Michigan Medical School, Ann Arbor, MI,
  • Rodney A. Hayward,

    1. Center for Practice Management and Outcomes Research, Veterans Affairs Ann Arbor Healthcare System, Ann Arbor, MI,
    2. Department of Internal Medicine, University of Michigan Medical School, Ann Arbor, MI,
  • John L. Adams,

    1. RAND, Santa Monica, CA,
  • Mary M. Hogan,

    1. Center for Practice Management and Outcomes Research, Veterans Affairs Ann Arbor Healthcare System, Ann Arbor, MI,
  • Elizabeth A. McGlynn,

    1. RAND, Santa Monica, CA,
  • Steven M. Asch

    1. Veterans Affairs Greater Los Angeles Health Care System, Los Angeles, CA, and
    2. Department of Medicine, Geffen School of Medicine at UCLA, Los Angeles, CA, and
    3. RAND, Santa Monica, CA

Abstract

Objective. Many performance measurement systems are designed to identify differences in the quality of care provided by health plans or facilities. However, we know little about whether different methods of performance measurement provide similar answers about the quality of care delivered by health care organizations. To examine this question, we used three different measurement approaches to assess quality of care delivered in Veterans Affairs (VA) facilities.

Data Sources/Study Setting. Medical records for 621 patients at 26 facilities in two VA regions.

Study Design. We examined agreement in quality conclusions across three systems: a focused explicit review (38 measures for six conditions/prevention), a global explicit review (372 measures for 26 conditions/prevention), and a structured implicit review in which physicians rated care (a single global rating for each of three chronic conditions and for overall acute, chronic, and preventive care). Trained nurse abstractors and physicians reviewed all medical records. Correlations between scores from the three systems were adjusted for measurement error in each system using multilevel regression models.
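
To make the measurement-error adjustment concrete, the classical disattenuation formula conveys the intuition (an illustrative sketch only; the multilevel regression models described above estimate the latent correlation directly, and the numbers below are assumed for illustration rather than taken from the study): r_adjusted = r_observed / sqrt(reliability_A × reliability_B). For example, an observed correlation of 0.60 between two systems whose scores each had a reliability of 0.75 would be adjusted upward to 0.60 / sqrt(0.75 × 0.75) = 0.80.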

Results. Intercorrelations of scores were generally moderate to high across all three systems and rose with adjustment for measurement error. Site-level correlations for prevention and diabetes care were particularly high. For example, adjusted for measurement error at the site level, prevention quality was correlated at 0.89 between the implicit and global systems, 0.67 between the implicit and focused systems, and 0.73 between the global and focused systems.

Conclusions. We found moderate to high agreement in quality scores across the three profiling systems for most clinical areas, indicating that all three were measuring a similar construct called “quality.” Adjusting for measurement error substantially enhanced our ability to identify this underlying construct.
