Diminishing Significance of HLA Matching
in Kidney Transplantation

Authors


*Corresponding author: Glenn M. Chertow, chertowg@medicine.ucsf.edu

Abstract

To determine trends in the significance of HLA matching and other risk factors in kidney transplantation, we analyzed data on graft survival in a consecutive sample of 33 443 transplant recipients who received deceased donor kidneys from December 1994 to December 1998 with a mean follow-up time of 2.2 years. HLA matching and other risk factors (peak panel reactive antibody, donor age, sex and cause of death, cold ischemia time, donor and recipient body size) were examined. Mean likelihood ratios of models, fit with and without each variable of interest, were calculated by generating bootstrapped samples from each single year cohort. Pooled censored and uncensored graft survival rates were 90.6% and 89.9% at 1 year, 85.8% and 84.5% at 2 years, and 80.7% and 78.6% at 3 years. HLA matching declined in significance while other factors retained similar levels of statistical significance over the four yearly cohorts. With evolving clinical practice, including the provision of safer and more potent immunosuppressive therapy, the significance of HLA matching has diminished. Non-immunologic factors continue to impede more marked improvements in long-term graft survival. Recognizing these trends, organ allocation algorithms may need to be revised.

Introduction

More than 300 000 persons in the United States (US) have end-stage renal disease (ESRD) with treated incidence estimates increasing at a rate of approximately 6% per year (1). The pace of ESRD program growth has been attenuated owing to a distressingly high mortality rate among individuals on dialysis (>20% annually). Survival appears to be significantly enhanced with kidney transplantation, although randomized clinical trials comparing dialysis with transplantation have not been conducted. Generally, healthier patients with ESRD are evaluated for, and receive, kidney transplantation. While early (1-year) graft survival exceeds 90% in most centers, the ultimate success of kidney transplantation remains muted because of long-term graft loss. Reasons for attenuated graft survival include the effects of rejection, calcineurin inhibitor-related nephrotoxicity, and progressive noninflammatory graft failure associated with an unfavorable balance between nephron supply and recipient metabolic demand (2–8).

Organ allocation algorithms have been designed and implemented to balance equity and efficiency. Potential recipients are given points proportional to their time on the waiting list and additional priority based on results of human leukocyte antigen (HLA) matching and population sensitization as estimated by the peak percent panel reactive antibody. Non-immunologic factors have not been considered in allocation algorithms, although the respective roles of HLA matching and nonimmunologic factors (i.e. donor age, sex and cause of death, cold ischemia time, donor and recipient body surface area) have been the topic of numerous reports (7–11).

We hypothesized that the relative significance of HLA matching in determining graft survival had diminished over the past several years, while the significance of nonimmunologic risk factors had not. We aimed to determine whether trends in the explanatory power of HLA matching and nonimmunologic variables could be identified.

Materials and Methods

To test these hypotheses, we used data from the United Network for Organ Sharing (UNOS) and the United States Renal Data System (USRDS). The USRDS collates and analyzes data on approximately 95% of all persons with ESRD (whether on dialysis or following transplantation) in the US. Data elements included: date of transplantation, recipient age, sex, race, dialysis modality (peritoneal vs. hemodialysis), vintage (time since initiation of dialysis), body size (expressed as body surface area or body mass index), blood type, HLA antigens, peak percent panel reactive antibody (PRA), immunosuppressive medication use (prednisone, cyclosporine, azathioprine and mycophenolate mofetil), pretransplant blood transfusions, employment status, donor age, sex, race, body size, cause of death, blood type, HLA antigens and cold ischemia time. Recipient comorbid conditions included hypertension, cardiovascular disease, peripheral vascular disease, unstable angina, malignancy, chronic obstructive pulmonary disease, pulmonary embolism, peptic ulcer disease and cachexia.

Our study was restricted to deceased donor kidney transplants performed between December 1994 and December 1998 (n = 33 443). We divided this sample into four cohorts according to transplant year (1995 and before, 1996, 1997, and 1998). The sizes of these cohorts were 9035, 8279, 8218, and 7911, respectively.

Statistical analysis

Continuous variables were expressed as mean ± SD or median (interquartile range). Categorical variables were described with proportions. There were few missing data elements. Continuous variables were categorized as follows (7): cold ischemia time as 0–8, 9–16, 17–24, 25–36, 37–48 and >48 h; vintage as <2, 2–5 and >5 years; and peak PRA as 0, 1–20, 21–40, 41–60, 61–80 and 81–100%. Missing data for cold ischemia time and vintage were placed into distinct categories using a ‘missing’ indicator variable in regression analyses. Survival rates were estimated using the Kaplan-Meier product limit method (12). Graft failure hazards associated with explanatory variables were estimated by fitting multivariable proportional hazards (‘Cox’) regression models (13). The proportional hazards assumption was assessed using log-log plots of survival curves. We pre-specified several multiplicative interaction terms, including: dialysis modality by recipient age, sex, race and diabetes; HLA matching by recipient age, sex, race, and cold ischemia time; and donor body size by recipient body size. Two-tailed p-values less than 0.05 were considered statistically significant.
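To make the modeling steps concrete, the following is a minimal sketch of the Kaplan-Meier and Cox analyses in Python (using the lifelines package rather than S-Plus). The file and column names (graft_years, graft_loss, cold_ischemia_h, and so on) are hypothetical placeholders, not actual USRDS field names.

```python
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.read_csv("unos_usrds_cohort.csv")            # hypothetical analysis file

# Kaplan-Meier estimate of (censored) graft survival
km = KaplanMeierFitter()
km.fit(durations=df["graft_years"], event_observed=df["graft_loss"])
print(km.survival_function_at_times([1, 2, 3]))       # 1-, 2- and 3-year survival

# Categorize a continuous covariate before modeling, as in the paper
df["cit_cat"] = pd.cut(df["cold_ischemia_h"],
                       bins=[0, 8, 16, 24, 36, 48, float("inf")],
                       labels=["0-8", "9-16", "17-24", "25-36", "37-48", ">48"])

# Multivariable Cox proportional hazards model; categorical covariates are
# assumed to be string/category coded and are dummy-encoded here
covariates = ["hla_mm", "donor_age_cat", "cit_cat", "peak_pra_cat", "recipient_bsa"]
X = pd.get_dummies(df[["graft_years", "graft_loss"] + covariates],
                   drop_first=True, dtype=float)
cph = CoxPHFitter()
cph.fit(X, duration_col="graft_years", event_col="graft_loss")
cph.print_summary()                                    # hazard ratios and p-values
```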

We evaluated the significance of an explanatory variable using the likelihood ratio test: the variable of interest was dropped from the model, and the ratio of the likelihood of the reduced model to the likelihood of the full model was calculated. To determine trends in significance and to adjust for differences in population size between the single-year cohorts, 50 bootstrap samples of size 1000 (randomly chosen without replacement) were generated from each yearly cohort and the likelihood ratio test was applied to each bootstrap replicate (14). We examined multiple samples of equivalent size to avoid any influence of yearly cohort sample size on statistical significance. To assess sensitivity to the bootstrap procedure, we considered an alternate method in which the bootstrap sample size equaled the yearly cohort size and sampling was carried out with replacement.
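As a rough illustration of this resampling scheme, the sketch below (again in Python rather than S-Plus) draws 50 subsamples of 1000 recipients from each yearly cohort, fits the Cox model with and without the variable of interest, and records the likelihood ratio test statistic. The DataFrame df and the column lists full_cols and hla_cols are hypothetical and follow on from the previous sketch.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

def lr_statistic(sample, full_cols, dropped_cols):
    """Likelihood ratio statistic: -2 log(L_reduced / L_full)."""
    base = ["graft_years", "graft_loss"]                     # hypothetical columns
    full = CoxPHFitter().fit(sample[base + full_cols],
                             duration_col="graft_years", event_col="graft_loss")
    kept = [c for c in full_cols if c not in dropped_cols]
    reduced = CoxPHFitter().fit(sample[base + kept],
                                duration_col="graft_years", event_col="graft_loss")
    return 2.0 * (full.log_likelihood_ - reduced.log_likelihood_)

rng = np.random.default_rng(0)
records = []
for year, cohort in df.groupby("transplant_year"):           # df as in prior sketch
    for _ in range(50):                                       # 50 replicates per cohort
        idx = rng.choice(cohort.index, size=1000, replace=False)
        stat = lr_statistic(cohort.loc[idx], full_cols, dropped_cols=hla_cols)
        records.append({"year": year, "lr_stat": stat})
lr_stats = pd.DataFrame(records)
```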

Linear trends in variable significance were detected by regressing the likelihood ratio test statistics (i.e. −2 log likelihood ratios of the reduced relative to the full model) against cohort year. This procedure was applied separately to assess the significance of each variable of interest. Analyses considered both censored and uncensored graft survival: censored graft survival does not count death with a functioning graft as a graft loss, whereas uncensored graft survival does.
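Continuing the sketch above, the trend test amounts to an ordinary least-squares regression of the bootstrapped test statistics on cohort year and a test of whether the slope differs from zero (lr_stats is the hypothetical DataFrame assembled in the previous sketch).

```python
from scipy import stats

# Regress the likelihood ratio statistics on cohort year; a significantly
# negative slope indicates declining explanatory power for that variable.
slope, intercept, r_value, p_value, stderr = stats.linregress(
    lr_stats["year"], lr_stats["lr_stat"])
print(f"slope = {slope:.2f} per year, p = {p_value:.4g}")
```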

All analyses were conducted using S-Plus software (Insightful Corp., Seattle, WA).

Results

Trends in transplant population

Table 1 displays the baseline characteristics of recipients and donors in the study sample, with separate summary statistics for each yearly cohort. To detect trends in the composition of the study sample, we regressed each covariate against time (linear regression for continuous variables and logistic regression for categorical variables) and identified covariates with significant trends (p < 0.05); a brief sketch of this screen appears after Table 1. Among recipient characteristics, age, weight and body surface area increased, as did the proportions of recipients with diabetes and with employed or student status. The proportion of patients undergoing repeat transplantation decreased significantly, as did the proportion of Caucasian recipients. Dialysis vintage increased significantly. The use of cyclosporine and azathioprine decreased significantly, while the use of mycophenolate mofetil increased significantly. Among donor characteristics, there was a significant increase in the proportion of donors aged 40 years and older, and a significant decrease in the proportion of donors aged less than 40 years. The proportion of donors whose cause of death was coded as drug ingestion, trauma, or violence decreased, with a corresponding increase in other or unknown causes of death. The proportion of transplants with five or six HLA mismatches increased significantly.

Table 1.  Recipient and donor characteristics

| Characteristic | All | 1995 | 1996 | 1997 | 1998 | p |
| --- | --- | --- | --- | --- | --- | --- |
| Recipient factors |  |  |  |  |  |  |
| Age, mean ± SD, years | 44.3 ± 13.8 | 43.2 ± 13.8 | 44.3 ± 13.8 | 44.8 ± 13.6 | 45.2 ± 13.7 | <0.05 |
| Female, % | 39.2 | 37.9 | 40 | 39.5 | 39.7 |  |
| Race/ethnicity, % |  |  |  |  |  |  |
|  Caucasian | 69.5 | 70.8 | 69.7 | 68.7 | 68.7 | <0.05 |
|  African American | 25.4 | 24.9 | 25.2 | 26.2 | 25.2 |  |
|  Asian American | 3.5 | 3.1 | 3.4 | 3.9 | 3.4 |  |
|  Hispanic | 0.7 | 0.7 | 0.9 | 0.6 | 0.8 |  |
|  Other | 0.9 | 0.5 | 0.8 | 0.6 | 1.9 | <0.05 |
| Height, mean ± SD, m | 1.69 ± 0.15 | 1.67 ± 0.17 | 1.69 ± 0.13 | 1.69 ± 0.15 | 1.69 ± 0.15 |  |
| Weight, mean ± SD, kg | 73.7 ± 18.8 | 73.4 ± 18.4 | 73.7 ± 19.1 | 74.4 ± 19.5 | 75.3 ± 19.8 | <0.05 |
| Body surface area, mean ± SD, m² | 1.85 ± 0.31 | 1.84 ± 0.31 | 1.84 ± 0.30 | 1.85 ± 0.31 | 1.86 ± 0.30 | <0.05 |
| Body mass index, mean ± SD, kg/m² | 26.2 ± 6.5 | 26.3 ± 6.2 | 26.0 ± 6.5 | 26.2 ± 6.6 | 26.3 ± 6.8 |  |
| Diabetes, % | 24.4 | 19.2 | 24.5 | 25.9 | 27.6 | <0.05 |
| Previous kidney transplantation, % | 12.4 | 12.7 | 12.3 | 12.4 | 11.9 | <0.05 |
| Employed or school, full/part-time, % | 61.1 | 58.2 | 61.5 | 62.1 | 61.3 | <0.05 |
| Pretransplant transfusions, median (interquartile range) | 1 (0, 2) | 1 (0, 2) | 1 (0, 2) | 1 (0, 2) | 1 (0, 2) |  |
| Peak panel reactive antibody, median (interquartile range), % | 2 (0, 10) | 2 (0, 10) | 3 (0, 11) | 2 (0, 10) | 2 (0, 10) |  |
| Vintage, median (interquartile range), years | 2.19 (1.17, 3.73) | 2.01 (1.08, 3.35) | 2.15 (1.15, 3.65) | 2.25 (1.19, 3.86) | 2.29 (1.25, 4.01) | <0.05 |
| Immunosuppression use, % | 96.8 | 96.2 | 96.8 | 97.6 | 96.5 |  |
|  Prednisone, % | 93.6 | 93.4 | 93.9 | 94.2 | 93 |  |
|  Cyclosporine, % | 23.1 | 48.2 | 19.5 | 12.5 | 9.3 | <0.05 |
|  Azathioprine, % | 37 | 64.6 | 35.2 | 25.6 | 18.9 | <0.05 |
|  Mycophenolate mofetil, % | 64.2 | 41.7 | 67.2 | 74.1 | 76.4 | <0.05 |
| Donor factors |  |  |  |  |  |  |
| Age, years, % |  |  |  |  |  |  |
|  0–9 | 5.5 | 5.6 | 5.2 | 5.7 | 5.9 |  |
|  10–19 | 21.4 | 23.2 | 20.9 | 19.5 | 18.4 | <0.05 |
|  20–39 | 34.1 | 34.6 | 33.8 | 34.1 | 33.6 | <0.05 |
|  40–49 | 17.8 | 17 | 18.3 | 18.1 | 18.9 | <0.05 |
|  50–59 | 14 | 13 | 14.4 | 14.9 | 14.1 | <0.05 |
|  >60 | 7.2 | 6.6 | 7.4 | 7.7 | 9.1 | <0.05 |
| Female, % | 39.6 | 39.2 | 39.8 | 39.9 | 41.2 | <0.05 |
| Race/ethnicity, % |  |  |  |  |  |  |
|  Caucasian | 85.8 | 86.4 | 86 | 84.5 | 86.2 | <0.05 |
|  African American | 11.1 | 11.1 | 11.1 | 11.6 | 10.6 |  |
|  Asian American | 1 | 1.4 | 1.4 | 1.2 | 0 | <0.05 |
|  Hispanic | 0.3 | 0.5 | 0.3 | 0.3 | 0 | <0.05 |
|  Other | 1.8 | 0.7 | 1.2 | 2.2 | 3.1 | <0.05 |
| Cause of death, % |  |  |  |  |  |  |
|  Drug ingestion, trauma, or violence | 50.7 | 53.4 | 50.6 | 50.1 | 48.3 | <0.05 |
|  Cerebrovascular or cardiovascular | 37.1 | 35.1 | 37.8 | 36.8 | 38.8 |  |
|  Drowning or asphyxiation | 9 | 8.8 | 8.7 | 9.2 | 9.4 |  |
|  Other or unknown | 3.2 | 2.7 | 3 | 3.9 | 3.4 | <0.05 |
| Cold ischemia time, h, % |  |  |  |  |  |  |
|  0–8 | 6 | 5.7 | 5.8 | 6.2 | 6.3 | <0.05 |
|  9–16 | 27.4 | 27.1 | 27.3 | 28 | 27.3 |  |
|  17–24 | 34.8 | 35.4 | 35.1 | 35.6 | 33 |  |
|  25–36 | 23.2 | 25.8 | 24.2 | 21.1 | 21.3 | <0.05 |
|  37–48 | 3.1 | 3.6 | 3.6 | 2.4 | 2.6 | <0.05 |
|  >48 | 0.3 | 0.4 | 0.3 | 0.2 | 0.2 | <0.05 |
|  Unknown | 5.2 | 2.1 | 3.4 | 6.5 | 9.1 | <0.05 |
| HLA mismatch, % |  |  |  |  |  |  |
|  0 MM | 13.4 | 12.4 | 13.3 | 14.1 | 14.0 |  |
|  1 MM | 6.4 | 6.2 | 6.7 | 6.1 | 6.4 |  |
|  2 MM | 13.0 | 14.4 | 13.3 | 12.1 | 12.2 |  |
|  3 MM | 20.8 | 22.0 | 21.0 | 19.7 | 20.1 |  |
|  4 MM | 22.3 | 23.4 | 22.1 | 22.4 | 21.0 |  |
|  5 MM | 15.9 | 15.2 | 15.5 | 16.2 | 16.9 | <0.05 |
|  6 MM | 7.4 | 5.8 | 7.2 | 8.3 | 8.3 | <0.05 |
|  Unknown | 0.8 | 0.6 | 0.7 | 1.0 | 0.9 |  |
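The covariate trend screen described above can be sketched as follows (in Python; transplant_year, recipient_age and diabetes are hypothetical column names): linear regression on transplant year for continuous covariates and logistic regression on transplant year for binary ones.

```python
import statsmodels.api as sm

def trend_p(df, covariate, binary=False):
    """p-value for a linear (or logistic) trend of a covariate over transplant year."""
    X = sm.add_constant(df["transplant_year"])
    if binary:
        fit = sm.Logit(df[covariate].astype(float), X).fit(disp=0)
    else:
        fit = sm.OLS(df[covariate].astype(float), X).fit()
    return fit.pvalues["transplant_year"]

print(trend_p(df, "recipient_age"))            # continuous covariate: linear trend
print(trend_p(df, "diabetes", binary=True))    # binary covariate: logistic trend
```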

Pooled censored and uncensored graft survival rates were 90.6% and 89.9% at 1 year, 85.8% and 84.5% at 2 years, and 80.7% and 78.6% at 3 years.

Trends in the importance of selected explanatory variables

Table 2 shows the relative risk of graft failure associated with the total number of HLA antigen mismatches at the A, B, and DR loci, arranged according to transplant year. In 1995, three- to six-antigen mismatches were significantly associated with a higher risk of graft failure. In 1996, four- to six-antigen mismatches were significant. In 1997, five- and six-antigen mismatches were significant. In 1998, the risk of graft failure was significantly increased only with mismatches at all six loci. In companion analyses, we considered the independent effect of the number of mismatches at each locus; results were virtually identical to those obtained with the total-mismatch approach.

Table 2.  HLA mismatch and relative risk of graft failure

| HLA mismatch | 1995 RR | 1996 RR | 1997 RR | 1998 RR |
| --- | --- | --- | --- | --- |
| 1 | 1.033 | 0.998 | 1.020 | 0.991 |
| 2 | 1.101 | 1.000 | 1.100 | 1.050 |
| 3 | 1.088 | 1.065 | 1.223 | 1.026 |
| 4 | 1.118 | 1.173 | 1.146 | 1.063 |
| 5 | 1.118 | 1.154 | 1.191 | 1.107 |
| 6 | 1.265 | 1.227 | 1.286 | 1.152 |

Data in bold are statistically significant; referent group 0 HLA mismatches.

As expected, there was a strong association between donor age and graft failure. Table 3 demonstrates that recipients of kidneys from donors age 55 years and older experienced accelerated graft loss in virtually all yearly cohorts. This association remained relatively stable and was statistically significant during the entire study period. A significant increase in risk among recipients of kidneys from donors less than 5 years old was also observed.

Table 3.  Donor age and relative risk of graft failure

| Donor age (years) | 1995 RR | 1996 RR | 1997 RR | 1998 RR |
| --- | --- | --- | --- | --- |
| 30–34 | 0.952 | 0.972 | 0.993 | 0.935 |
| 35–39 | 0.958 | 1.076 | 1.122 | 0.918 |
| 40–44 | 1.053 | 1.108 | 1.153 | 0.999 |
| 45–49 | 1.073 | 1.215 | 1.182 | 1.101 |
| 50–54 | 1.150 | 1.079 | 1.164 | 1.273 |
| 55–59 | 1.266 | 1.374 | 1.304 | 1.311 |
| 60–64 | 1.275 | 1.394 | 1.501 | 1.473 |
| 65+ | 1.384 | 1.588 | 1.412 | 1.415 |

Data in bold are statistically significant; referent group donor age 25–29 years.

Cold ischemia time was associated with graft failure. Table 4 shows that the relative risk of graft failure was significantly increased with longer cold ischemia time, here dichotomized at 36 h. This association remained relatively stable throughout the entire study period.

Table 4.  Cold ischemia time and relative risk of graft failure

| Cold ischemia time (h) | 1995 RR | 1996 RR | 1997 RR | 1998 RR |
| --- | --- | --- | --- | --- |
| 9–16 | 1.010 | 1.042 | 0.973 | 1.013 |
| 17–24 | 1.041 | 1.050 | 1.004 | 1.073 |
| 25–36 | 1.074 | 1.019 | 1.032 | 1.153 |
| 37–48 | 1.184 | 1.144 | 1.056 | 1.227 |
| 48+ | 1.088 | 1.249 | 1.225 | 1.386 |

Data in bold are statistically significant; referent cold ischemia time 0–8 h.

We plotted the Kaplan-Meier survival curves for each yearly cohort, stratified by HLA matching (Figure 1A), donor age (Figure 1B), and cold ischemia time (Figure 1C). In Figure 1 (A), survival curves for all seven strata of HLA mismatch (zero to six) are clearly distinguishable in 1995, but they are much closer together, and some overlap, in later years. In contrast, the survival curves for the two prototype nonimmunologic factors, donor age and cold ischemia time, are similar over time.

Figure 1.  Kaplan-Meier survival curves for each yearly cohort. Survival stratified by HLA mismatches (A), donor age (B) and cold ischemia time (C).

Significance of immunologic and non-immunologic risk factors in aggregate

The results described above and shown in Figure 1 suggest that while the significance of HLA matching has diminished over time, the estimated effects of other risk factors (such as donor age and cold ischemia time) have remained relatively stable. Figure 2 validates and extends these findings by incorporating the effects of other variables (using multivariable regression) and characterizing sampling variability with the bootstrapping procedure. In the graphs, each point represents a bootstrap sample drawn from the cohort year corresponding to its x-value, and the y-value shows the likelihood ratio test statistic. In other words, the test statistic describes the marginal variation explained by the variable of interest (HLA matching in Figure 2A, donor age in Figure 2B and cold ischemia time in Figure 2C) by comparing models fit with and without that variable. Thus, the magnitude of the −2 log likelihood ratio on the y-axis for each variable during each cohort year indicates its relative importance, both compared with itself across years and compared with other variables more generally.
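Written out explicitly (notation ours), the statistic plotted on the y-axis is

```latex
\Lambda \;=\; -2 \log \frac{L_{\text{reduced}}}{L_{\text{full}}}
        \;=\; 2\left(\ell_{\text{full}} - \ell_{\text{reduced}}\right),
```

where L denotes the maximized partial likelihood of the Cox model (and ℓ its logarithm), and the reduced model omits the variable of interest. Under the null hypothesis that the omitted variable has no effect, Λ approximately follows a chi-square distribution with degrees of freedom equal to the number of omitted parameters.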

Figure 2.  Significance of risk factors in terms of likelihood ratio test statistic. Significance of HLA matching (A), donor age (B) and cold ischemia time (C).

In Figure 2(A), the least-squares regression line has a significant downward slope (p < 0.0001), indicating that the effect of HLA matching diminished in significance from 1995 to 1998. In contrast, in Figure 2(B) and (C), the slopes of the regression lines were not significant, indicating that the estimated effects of donor age and cold ischemia time remained relatively stable over time. The same approach was repeated for other nonimmunologic variables, including donor sex, race and cause of death, and donor and recipient body size (body surface area and body mass index); the significance levels of these covariates remained stable over the study period (data not shown). None of the pre-specified interaction terms was statistically significant when added individually to the multivariable regression models.

For sensitivity analyses, we fit companion models excluding patients who experienced graft loss within the first 90 days. We also considered uncensored (instead of censored) graft failure as the principal outcome measure. To test robustness to the bootstrap procedure, we alternatively sampled with replacement and used the yearly cohort size for each bootstrap replicate. Results were virtually identical across all sensitivity analyses conducted (data not shown).

Discussion

Herein we show the diminishing significance of HLA matching in deceased donor kidney transplantation over the period 1995–1998. We do not dismiss the reported associations among HLA matching, panel reactivity, and graft survival, which have been documented by several groups (15–18). However, prior reports have generally pooled data over extended periods, ranging from 4 to 12 years, and several have used projected, rather than actual, graft survival rates. For example, Takemoto et al. (18) evaluated data on recipients transplanted in the US between 1987 and 1999 and a subcohort transplanted between 1994 and 1999. The authors reported a significant increase in the estimated half-life of HLA-matched (six-antigen match) vs. HLA-mismatched (one or more antigen mismatches) deceased donor kidneys, under the assumption of a constant hazard after the first year. Opelz (19) reported a graded effect of HLA antigen mismatch in a European cohort of recipients transplanted between 1996 and 1999. While the pooled effect of HLA matching was significant in these studies, no results on trends in significance over time were presented. These trends are especially important in view of changes in immunosuppressive therapy over the same time frame.

Organ allocation algorithms that base priority on HLA matching have important societal implications. Because the donor pool is disproportionately White, priority based on HLA matching adversely affects African American recipients and persons of other racial and ethnic minority groups (20,21). African Americans comprise a disproportionate fraction of the ESRD population and experience longer waiting times for deceased donor kidneys. Maintaining an emphasis on HLA matching when its association with graft survival is weakening would therefore further disadvantage African Americans in future years. Tissue matching-driven algorithms further complicate the allocation process because organ sharing requires an increase in cold ischemia time. Using estimates of the effect of HLA mismatch in 1995, national sharing of zero-antigen mismatched kidneys is advisable only when the increment in cold ischemia time is less than 17 h; if HLA matching is now less important, the acceptable increment in cold ischemia time may be even lower. The dominant effect of cold ischemia time also helps to explain the observed improvement in transplant outcomes with living unrelated, relative to deceased, donor kidney transplantation (22).

The US kidney allocation algorithm has undergone several recent modifications. An Expanded Criteria Donors (ECD) program was implemented in October 2002 to expedite the placement of kidneys from older donors (i.e. ≥60 years of age, or donors 50–59 years of age with hypertension, stroke or preexisting kidney disease). The allocation algorithm was changed further in May 2003 to reduce the influence of HLA matching in an effort to improve transplantation access for minority candidates. Future modifications that incorporate nonimmunologic factors in allocation algorithms would be expected to increase graft survival, reduce waiting times and increase the proportion of ESRD patients who eventually undergo transplantation (23).

We were unable to determine the biological reason(s) for these findings. It is tempting to attribute the changes to enhanced immunosuppressive therapy (e.g. increasing use of mycophenolate mofetil, tacrolimus, and monoclonal antibody therapy). Indeed, Meier-Kriesche et al. recently reported decreasing trends in acute rejection rates over the same time period (24). However, despite fewer episodes of acute rejection, long-term allograft survival was not significantly improved, suggesting that other factors are operative. The distribution of certain recipient factors changed significantly over the 4-year period during which patients were transplanted, and these factors may have influenced long-term graft survival. For example, older recipients with longer dialysis vintage tend to exhibit an attenuated immune response; these individuals may have been treated less aggressively because of concerns regarding infection and malignancy. The proportion of patients treated with hemodialysis (vs. peritoneal dialysis), the delivered dialysis dose, and hemoglobin concentrations also increased over the same period (1,25). Other dialysis-related effects may also be responsible, as few recipients undergo preemptive deceased donor kidney transplantation.

There are several additional limitations to this study. Data were obtained from the USRDS. The analyses depend wholly on the accuracy of the data inputs. However, errors in data entry or data coded as missing would likely bias study results toward the null (rather than a trend toward lesser significance of HLA matching). There is no reason to believe that data were less accurate in more recent years, particularly as electronic submission of data has become increasingly common (thereby reducing, rather than increasing error rates). Later cohorts had shorter follow-up times. While relevant to the direct comparison of cohort years, the shorter follow-up times in the later cohorts might have biased results toward a more prominent effect of HLA matching over time, rather than an attenuated effect. Finally, while the estimated effects of HLA matching were diminished over time, broader classes of HLA antigens that are not routinely measured, such as cross-reactive groups (CREG) (26), may have retained significance.

In summary, when examining deceased donor kidney transplants in the US, time trends suggest that HLA matching is of diminishing significance, while nonimmunologic factors remain equivalently important. Whether more potent immunosuppressive therapy or other recipient-, donor-, dialysis or transplantation-specific factors are responsible for these changes is unknown. Regardless, organ allocation algorithms that place emphasis on immunologic factors and largely ignore nonimmunologic factors may need to be revised.

Acknowledgments

Drs Zenios and Chertow were supported by NIH-NIDDK R01 DK58411.
