L. D. Buccini1,2 and J. D. Schold2,3,*
Letter to the Editor
Considerations of Reliability and Validity of Transplant Center Report Cards
Article first published online: 3 DEC 2013
Copyright © 2013 The American Society of Transplantation and the American Society of Transplant Surgeons
American Journal of Transplantation
Volume 14, Issue 1, pages 239–240, January 2014
How to Cite
Buccini, L. D. and Schold, J. D. (2014), Considerations of Reliability and Validity of Transplant Center Report Cards. American Journal of Transplantation, 14: 239–240. doi: 10.1111/ajt.12548
- Issue published online: 19 DEC 2013
To the Editor
We would like to express our appreciation to Dr. Kaplan for his interest and for his thoughtful Letter to the Editor, which points out the need for precision in rhetoric regarding quality of care and its relationship to outcome measures. Kaplan's letter draws attention to the importance of terminology as it relates to “quality” based on findings from our prior study. These points are particularly salient in the current regulatory environment, which includes heightened oversight of transplant center performance and increased ramifications of Scientific Registry of Transplant Recipients (SRTR) report cards.
To clarify, the term quality, as used in the original manuscript, refers to quality as it is currently measured by the SRTR Program-Specific Reports. Based on this definition, our study found that report cards are consistent indicators of measured quality from a given reporting period for the next cohort of patients (i.e. report cards consistently predict future measured performance). We did not, however, draw the conclusion that report cards accurately measure quality of care; indeed, we carefully pointed out that this cannot be assumed. This distinction, although nuanced, draws attention to the important issue of reliability versus validity. Although related, the two concepts address different aspects of measurement: reliability concerns the degree to which results are stable and consistent, whereas validity concerns the degree to which a metric or construct measures what it intends to measure. The aim of our study was to evaluate the reliability of measured quality in SRTR Program-Specific Reports, which may or may not be synonymous with actual quality of care (validity), as further depicted in Figure 1.
Quality of care, as Kaplan states, is a construct that is difficult to define, which in turn calls into question the validity of quality of care metrics. Indeed, existing studies demonstrate certain biases (specifically, selection bias) in how the quality of care of transplant centers is currently measured [3, 4]. Moreover, it is not clear that our current endpoints for measuring quality are the most appropriate metrics. Other indicators, such as quality of life, cost-effectiveness, complication rates and longer-term survival, may be equally important.
Thus, we fully agree with both Dr. Kaplan and Dr. Hippen (in his prior editorial) that there is a need to continue to understand and improve the validity of quality of care evaluations, and to be cautious in inferring that centers with low measured quality necessarily provide low quality of care [1, 5]. It should be noted, however, that understanding the reliability of report cards is itself particularly important in the current environment. Policy decisions that can have significant financial ramifications are based on established risk-adjustment models of quality. It is therefore of tremendous value for a center to be able to predict, consistently and with precision, whether it is at risk of being “flagged” for poor “measured quality.” Although quality measurement is not a perfect science, understanding the relative strengths and weaknesses of these measures is critically important, especially in the current environment.
The authors of this manuscript have no conflicts of interest to disclose as described by the American Journal of Transplantation.
1 Digestive Disease Institute, Cleveland Clinic, Cleveland, OH
2 Quantitative Health Sciences, Cleveland Clinic, Cleveland, OH
3 Cleveland Clinic Lerner College of Medicine, Case Western Reserve University, Cleveland, OH
* Corresponding author: Jesse D. Schold, firstname.lastname@example.org