To determine trends in the significance of HLA matching and other risk factors in kidney transplantation, we analyzed data on graft survival in a consecutive sample of 33 443 transplant recipients who received deceased donor kidneys from December 1994 to December 1998 with a mean follow-up time of 2.2 years. HLA matching and other risk factors (peak panel reactive antibody, donor age, sex and cause of death, cold ischemia time, donor and recipient body size) were examined. Mean likelihood ratios of models, fit with and without each variable of interest, were calculated by generating bootstrapped samples from each single year cohort. Pooled censored and uncensored graft survival rates were 90.6% and 89.9% at 1 year, 85.8% and 84.5% at 2 years, and 80.7% and 78.6% at 3 years. HLA matching declined in significance while other factors retained similar levels of statistical significance over the four yearly cohorts. With evolving clinical practice, including the provision of safer and more potent immunosuppressive therapy, the significance of HLA matching has diminished. Non-immunologic factors continue to impede more marked improvements in long-term graft survival. Recognizing these trends, organ allocation algorithms may need to be revised.
More than 300 000 persons in the United States (US) have end-stage renal disease (ESRD), with treated incidence estimates increasing at a rate of approximately 6% per year (1). The pace of ESRD program growth has been attenuated by a distressingly high mortality rate among individuals on dialysis (>20% annually). Survival appears to be significantly enhanced with kidney transplantation, although randomized clinical trials comparing dialysis with transplantation have not been conducted. Generally, healthier patients with ESRD are evaluated for, and receive, kidney transplantation. While early (1-year) graft survival exceeds 90% in most centers, the ultimate success of kidney transplantation remains limited by long-term graft loss. Reasons for attenuated graft survival include the effects of rejection, calcineurin inhibitor-related nephrotoxicity, and progressive noninflammatory graft failure associated with an unfavorable balance between nephron supply and recipient metabolic demand (2–8).
Organ allocation algorithms have been designed and implemented to balance equity and efficiency. Potential recipients are given points proportional to their time on the waiting list and additional priority based on results of human leukocyte antigen (HLA) matching and population sensitization as estimated by the peak percent panel reactive antibody. Non-immunologic factors have not been considered in allocation algorithms, although the respective roles of HLA matching and nonimmunologic factors (i.e. donor age, sex and cause of death, cold ischemia time, donor and recipient body surface area) have been the topic of numerous reports (7–11).
We hypothesized that the relative significance of HLA matching in determining graft survival had diminished over the past several years, while the significance of nonimmunologic risk factors had not. We aimed to determine whether trends in the explanatory power of HLA matching and nonimmunologic variables could be identified.
Materials and Methods
To test these hypotheses, we used data from the United Network of Organ Sharing (UNOS) and United States Renal Data System (USRDS). The USRDS collates and analyzes data on approximately 95% of all persons with ESRD (whether on dialysis or following transplantation) in the US. Data elements included: date of transplantation, recipient age, sex, race, dialysis modality (peritoneal vs. hemodialysis), vintage (time since initiation of dialysis), body size (expressed as body surface area or body mass index), blood type, HLA antigens, peak percent panel reactive antibody (PRA), immunosuppressive medication use (prednisone, cyclosporine, azathioprine and mycophenolate mofetil), pretransplant blood transfusions, employment status, donor age, sex, race, body size, cause of death, blood type, HLA antigens and cold ischemia time. Recipient comorbid conditions included hypertension, cardiovascular disease, peripheral vascular disease, unstable angina, malignancy, chronic obstructive pulmonary disease, pulmonary embolism, peptic ulcer disease and cachexia.
Our study was restricted to deceased donor kidney transplants performed between December 1994 and December 1998 (n = 33 443). We divided this sample into four cohorts according to transplant year (1995 and before, 1996, 1997, and 1998). The sizes of these cohorts were 9035, 8279, 8218, and 7911, respectively.
Continuous variables were expressed as mean ± SD or median (interquartile range). Categorical variables were described with proportions. There were few missing data elements. Continuous variables were categorized as follows (7): cold ischemia time as 0–8, 9–16, 17–24, 25–36, 37–48 and >48 h; vintage as <2, 2–5 and >5 years; and peak PRA as 0, 1–20, 21–40, 41–60, 61–80 and 81–100%. Missing data for cold ischemia time and vintage were placed into distinct categories using a ‘missing’ indicator variable in regression analyses. Survival rates were estimated using the Kaplan-Meier product-limit method (12). Graft failure hazards associated with explanatory variables were estimated by fitting multivariable proportional hazards (‘Cox’) regression models (13). The proportional hazards assumption was checked using log-log plots of survival curves. We pre-specified several multiplicative interaction terms, including: dialysis modality by recipient age, sex, race and diabetes; HLA matching by recipient age, sex, race, and cold ischemia time; and donor body size by recipient body size. Two-tailed p-values less than 0.05 were considered statistically significant.
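For readers unfamiliar with the product-limit method named above, a minimal pure-Python sketch follows. The analyses in this study were performed in S-Plus; the function below and its toy inputs are illustrative only, not the study code:

```python
# Illustrative sketch of the Kaplan-Meier product-limit estimator.
# events[i] is 1 for a graft loss at times[i], 0 if the observation
# was censored at that time.

def kaplan_meier(times, events):
    """Return a list of (time, survival probability) pairs at event times."""
    pairs = sorted(zip(times, events))
    n_at_risk = len(pairs)
    surv = 1.0
    curve = []
    idx = 0
    while idx < len(pairs):
        t = pairs[idx][0]
        # count events (d) and all observations (at) tied at this time
        d = at = 0
        while idx < len(pairs) and pairs[idx][0] == t:
            at += 1
            d += pairs[idx][1]
            idx += 1
        if d:
            surv *= 1.0 - d / n_at_risk   # product-limit update
            curve.append((t, surv))
        n_at_risk -= at                    # tied observations leave the risk set
    return curve
```

With toy inputs such as `kaplan_meier([1, 2, 2, 3, 4], [1, 1, 0, 1, 0])`, the estimate steps down only at event times, and censored observations reduce the risk set without moving the curve.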
We evaluated the significance of an explanatory variable using the likelihood ratio test. The variable of interest was dropped from the model, and the ratio of the likelihood of the reduced model to that of the full model was calculated. To determine trends in significance and to adjust for differences in population size among the single-year cohorts, 50 bootstrap samples of size 1000 (drawn without replacement) were generated from each yearly cohort and the likelihood ratio test was applied to each bootstrap replicate (14). We examined multiple samples of equivalent size to avoid any influence of yearly cohort sample size on statistical significance. To test the sensitivity of results to the bootstrap procedure, we considered an alternate method in which the bootstrap sample size was equal to the yearly cohort size and sampling was carried out with replacement.
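The resampling-plus-likelihood-ratio procedure can be sketched as follows. This is an illustrative stand-in, not the study code: a censored-exponential survival model with a group-specific versus pooled hazard replaces the full multivariable Cox model, and all data are toy values; only the resampling scheme (repeated subsamples drawn without replacement) mirrors the description above.

```python
# Hedged sketch of the bootstrap likelihood-ratio procedure, using a
# simple exponential survival model in place of the multivariable Cox
# regression actually fit by the authors (in S-Plus).

import math
import random

def exp_loglik(times, events, rate):
    """Log-likelihood of censored exponential survival data at hazard `rate`."""
    d = sum(events)   # number of observed events
    t = sum(times)    # total time at risk
    return d * math.log(rate) - rate * t

def lr_statistic(sample):
    """-2 log likelihood ratio: pooled-hazard (reduced) vs. per-group (full) model.

    `sample` is a list of (time, event, group) tuples with group in {0, 1}."""
    times = [s[0] for s in sample]
    events = [s[1] for s in sample]
    # reduced model: one pooled hazard; the MLE is events / time at risk
    pooled = sum(events) / sum(times)
    ll_reduced = exp_loglik(times, events, pooled)
    # full model: a separate hazard for each group
    ll_full = 0.0
    for g in (0, 1):
        ts = [s[0] for s in sample if s[2] == g]
        es = [s[1] for s in sample if s[2] == g]
        ll_full += exp_loglik(ts, es, sum(es) / sum(ts))
    return -2.0 * (ll_reduced - ll_full)   # always >= 0 at the MLEs

def bootstrap_lr(cohort, n_reps=50, size=1000, rng=random.Random(0)):
    """Apply the LR test to repeated subsamples drawn without replacement."""
    size = min(size, len(cohort))
    return [lr_statistic(rng.sample(cohort, size)) for _ in range(n_reps)]
```

In the study, each of the 50 replicates per yearly cohort yielded one test statistic for the variable of interest; the per-group hazard here merely stands in for "model with the variable" versus "model without it."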
Linear trends in variable significance were detected by regressing the likelihood ratio test statistics (i.e. – 2 log likelihood ratios of the reduced over the full model) against cohort year. This procedure was separately applied to assess the significance of each variable of interest. Analyses considered both censored and uncensored graft survival. In censored graft survival, death with a functioning graft is not considered a graft loss. In contrast, uncensored graft survival considers death with a functioning graft as a graft loss.
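The trend-detection step described above reduces to an ordinary least-squares slope. A minimal pure-Python sketch follows; the example numbers are invented placeholders, not the study's statistics:

```python
# Sketch of the trend test: regress -2 log likelihood ratio statistics on
# cohort year via ordinary least squares and inspect the sign of the slope.
# (The actual analyses were run in S-Plus; this helper is illustrative.)

def ls_slope(xs, ys):
    """Ordinary least-squares slope of ys on xs."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

# A negative slope indicates declining significance across the cohorts;
# e.g. invented statistics 40, 30, 20, 10 for 1995-1998 give slope -10.
```

In the study, each of the four cohort years contributed 50 bootstrap replicates, so the regression for each variable was fit to 200 points.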
All analyses were conducted using S-Plus software (Insightful Corp., Seattle, WA).
Trends in transplant population
Table 1 displays the baseline characteristics of recipients and donors in the study sample. Separate summary statistics are presented for each yearly cohort. To detect trends in the composition of our study sample, we regressed each covariate against time (linear regression for continuous variables and logistic regression for categorical variables) and identified covariates with significant trends (p < 0.05). Among recipient characteristics, age, weight and body surface area increased, as did the proportions of recipients with diabetes and employed or student status. The proportion of patients undergoing repeat transplantation decreased significantly, as did the proportion of Caucasian recipients. Dialysis vintage also increased significantly. The use of cyclosporine and azathioprine decreased significantly and the use of mycophenolate mofetil increased significantly. Among donor characteristics, there was a significant increase in the proportion of donors aged 40 years and older, with a corresponding decrease in the proportion aged less than 40 years. The proportion of donors whose cause of death was coded as drug ingestion, trauma, or violence decreased, with a corresponding increase in other or unknown causes of death. The proportion of donors with five and six HLA mismatches increased significantly.
Table 1. Recipient and donor characteristics

Characteristic                                                All            ≤1995          1996           1997           1998
Age, mean ± SD, year                                          44.3 ± 13.8    43.2 ± 13.8    44.3 ± 13.8    44.8 ± 13.6    45.2 ± 13.7
Height, mean ± SD, m                                          1.69 ± 0.15    1.67 ± 0.17    1.69 ± 0.13    1.69 ± 0.15    1.69 ± 0.15
Weight, mean ± SD, kg                                         73.7 ± 18.8    73.4 ± 18.4    73.7 ± 19.1    74.4 ± 19.5    75.3 ± 19.8
Body surface area, mean ± SD, m2                              1.85 ± 0.31    1.84 ± 0.31    1.84 ± 0.30    1.85 ± 0.31    1.86 ± 0.30
Body mass index, mean ± SD, kg/m2                             26.2 ± 6.5     26.3 ± 6.2     26.0 ± 6.5     26.2 ± 6.6     26.3 ± 6.8
Previous kidney transplantation, %
Employed or school, full/part-time, %
Pretransplant transfusions, median (interquartile range)
Peak panel reactive antibody, median (interquartile range), %
Vintage, median (interquartile range), year
Cause of death, %
  Drug ingestion, trauma, or violence
  Cerebrovascular or cardiovascular
  Drowning or asphyxiation
  Other or unknown
Cold ischemia time, %
Pooled censored and uncensored graft survival rates were 90.6% and 89.9% at 1 year, 85.8% and 84.5% at 2 years, and 80.7% and 78.6% at 3 years.
Trends in the importance of selected explanatory variables
Table 2 shows the relative risk of graft failure associated with the total number of HLA antigen mismatches at the A, B, and DR loci, arranged according to transplant year. In 1995, three- to six-antigen mismatches were significantly associated with a higher risk of graft failure. In 1996, four- to six-antigen mismatches were significant. In 1997, five- and six-antigen mismatches were significant. In 1998, the risk of graft failure was significantly increased only with six-antigen mismatches. In companion analyses, we considered the independent effect of the number of mismatches at each locus; results using this approach were virtually identical.
Table 2. HLA mismatch and relative risk of graft failure
Data in bold are statistically significant; referent group 0 HLA mismatches.
As expected, there was a strong association between donor age and graft failure. Table 3 demonstrates that recipients of kidneys from donors aged 55 years and older experienced accelerated graft loss in virtually all yearly cohorts. This association remained relatively stable and statistically significant during the entire study period. A significant increase in risk was also observed among recipients of kidneys from donors less than 5 years old.
Table 3. Donor age and relative risk of graft failure
Donor age (years)
Data in bold are statistically significant; referent group donor age 25–29 years.
Cold ischemia time was associated with graft failure. Table 4 shows that the relative risk of graft failure was significantly increased with longer cold ischemia time, here dichotomized at 36 h. This association remained relatively stable throughout the entire study period.
Table 4. Cold ischemia time and relative risk of graft failure
Cold ischemia time (h)
Data in bold are statistically significant; referent cold ischemia time 0–8 h.
We plotted the Kaplan-Meier survival curves for each yearly cohort, stratified by HLA matching (Figure 1A), donor age (Figure 1B), and cold ischemia time (Figure 1C). In Figure 1 (A), survival curves for all seven strata of HLA mismatch (zero to six) are clearly distinguishable in 1995, but they are much closer together, and some overlap, in later years. In contrast, the survival curves for the two prototype nonimmunologic factors, donor age and cold ischemia time, are similar over time.
Significance of immunologic and non-immunologic risk factors in aggregate
The results described above and shown in Figure 1 suggest that while the significance of HLA matching has diminished over time, the estimated effects of other risk factors (such as donor age and cold ischemia time) have remained relatively stable. Figure 2 validates and extends these findings, incorporating the effects of other variables (using multivariable regression) and extending the range of confidence limits with the bootstrapping procedure. In the graphs, each point represents a bootstrap sample drawn from the cohort year corresponding to its x-value, and the y-value shows the likelihood ratio test statistic. In other words, the test statistic describes the marginal variation explained by the variable of interest (HLA matching in Figure 2A, donor age in Figure 2B and cold ischemia time in Figure 2C) by comparing models that fit with and without the variable of interest. Thus, the magnitude of the – 2 log likelihood ratio on the y-axis for each variable during each cohort year indicates its relative importance – compared with itself by year, as well as with other variables more generally.
In Figure 2(A), the least-squares regression line has a significant downward slope (p < 0.0001), indicating that the effect of HLA matching diminished in significance from 1995 to 1998. In contrast, in Figures 2(B) and 2(C), the slopes of the regression lines were not significant, indicating that the estimated effects of donor age and cold ischemia time remained relatively stable over time. The same approach was repeated for other nonimmunologic variables, including donor sex, race and cause of death, and donor and recipient body size (body surface area and body mass index); the significance levels of these covariates remained stable over the study period (data not shown). None of the pre-specified interaction terms were statistically significant when added individually to the multivariable regression models.
For sensitivity analyses, we fit companion models excluding patients who experienced graft loss within the first 90 days. We also considered uncensored (instead of censored) graft failure as the principal outcome measure. To test robustness to the bootstrap procedure, we alternatively sampled with replacement, using the yearly cohort size for each bootstrap replicate. Results were virtually identical for all sensitivity analyses conducted (data not shown).
Discussion

Herein we show diminishing significance of HLA matching in deceased donor kidney transplantation over the period 1995 to 1998. We do not dismiss the reported associations among HLA matching, panel reactivity, and graft survival, which have been documented by several groups (15–18). However, prior reports have generally pooled data over extended periods, ranging from 4 to 12 years, and several have used projected, rather than actual, graft survival rates. For example, Takemoto et al. (18) evaluated data on recipients transplanted in the US between 1987 and 1999 and a subcohort transplanted between 1994 and 1999. The authors reported a significant increase in the estimated half-life of HLA-matched (6-antigen match) vs. HLA-mismatched (1 or more antigen mismatch) deceased donor kidneys, under the assumption of a constant hazard after the first year. Opelz (19) reported a graded effect of HLA antigen mismatch in a European cohort of recipients transplanted between 1996 and 1999. While the pooled effect of HLA matching was significant in these studies, no results on trends in significance over time were presented. These trends are especially important in view of changes in immunosuppressive therapy over the same time frame.
Organ allocation algorithms that base priority on HLA matching have important societal implications. As the donor pool is disproportionately White, priority based on HLA matching adversely affects African American recipients and persons of other racial and ethnic minority groups (20,21). African Americans comprise a disproportionate fraction of the ESRD population and experience longer waiting times for deceased donor kidneys. Maintaining an emphasis on HLA matching while its association with graft survival weakens would therefore further disadvantage African Americans in future years. Tissue matching-driven algorithms further complicate the allocation process because organ sharing entails a requisite increase in cold ischemia time. Using estimates of the effect of HLA mismatch in 1995, national sharing of zero-antigen-mismatched kidneys is advisable only when the increment in cold ischemia time is less than 17 h; if HLA matching is less important now, the acceptable increment in cold ischemia time may be even lower. The dominant effect of cold ischemia time also helps explain the observed improvement in transplant outcomes with living unrelated donor transplantation relative to deceased donor kidney transplantation (22).
The US kidney allocation algorithm has undergone several recent modifications. An Expanded Criteria Donors (ECD) program was implemented in October 2002 to expedite the placement of kidneys from older donors (i.e. ≥60 years of age, or donors 50–59 years of age with hypertension, stroke or preexisting kidney disease). The allocation algorithm was changed further in May 2003 to reduce the influence of HLA matching in an effort to improve transplantation access for minority candidates. Future modifications that incorporate nonimmunologic factors in allocation algorithms would be expected to increase graft survival, reduce waiting times and increase the proportion of ESRD patients who eventually undergo transplantation (23).
We were unable to determine the biological reason(s) for the findings described. It is tempting to attribute the changes to enhanced immunosuppressive therapy (e.g. increasing use of mycophenolate mofetil, tacrolimus, and monoclonal antibody therapy). Indeed, Meier-Kriesche et al. recently reported decreasing trends in acute rejection rates over the same time period (24). However, despite fewer episodes of acute rejection, long-term allograft survival was not significantly improved, suggesting that other factors are operative. The distribution of certain recipient factors changed significantly over the 4-year period during which patients were transplanted, and these factors may have influenced long-term graft survival. For example, older recipients with longer dialysis vintage tend to exhibit an attenuated immune response; these individuals may have been less aggressively treated because of concerns regarding infection and malignancy. The proportion of patients on hemodialysis vs. peritoneal dialysis, dialysis dose, and hemoglobin concentrations increased over the same period (1,25). Other dialysis-related effects may also be responsible, as few recipients undergo preemptive deceased donor kidney transplantation.
There are several additional limitations to this study. Data were obtained from the USRDS. The analyses depend wholly on the accuracy of the data inputs. However, errors in data entry or data coded as missing would likely bias study results toward the null (rather than a trend toward lesser significance of HLA matching). There is no reason to believe that data were less accurate in more recent years, particularly as electronic submission of data has become increasingly common (thereby reducing, rather than increasing error rates). Later cohorts had shorter follow-up times. While relevant to the direct comparison of cohort years, the shorter follow-up times in the later cohorts might have biased results toward a more prominent effect of HLA matching over time, rather than an attenuated effect. Finally, while the estimated effects of HLA matching were diminished over time, broader classes of HLA antigens that are not routinely measured, such as cross-reactive groups (CREG) (26), may have retained significance.
In summary, when examining deceased donor kidney transplants in the US, time trends suggest that HLA matching is of diminishing significance, while nonimmunologic factors remain equivalently important. Whether more potent immunosuppressive therapy or other recipient-, donor-, dialysis or transplantation-specific factors are responsible for these changes is unknown. Regardless, organ allocation algorithms that place emphasis on immunologic factors and largely ignore nonimmunologic factors may need to be revised.
Drs Zenios and Chertow were supported by NIH-NIDDK RO1 DK58411.