Lack of Improvement in Renal Allograft Survival Despite a Marked Decrease in Acute Rejection Rates Over the Most Recent Era



Acute rejection is known to have a strong impact on graft survival. Many studies suggest that very low acute rejection rates can be achieved with current immunosuppressive protocols. We wanted to investigate how acute rejection rates have evolved on a national level in the U.S. and how this has impacted graft survival in the most recent era of kidney transplantation. For this purpose, we analyzed data provided by the Scientific Registry of Transplant Recipients regarding all adult first renal transplants between 1995 and 2000.

We noted a significant decrease in overall acute rejection rates during the first 6 months, during the first year, and also in late rejections during the second year after transplantation. Despite this decrease in the rate of acute rejection, there was no significant improvement in overall graft survival; furthermore, we noted a statistically significant trend towards worse death-censored graft survival. There was also a trend for a greater proportion of rejection episodes to fail to recover to previous baseline function after treatment.

Our data suggest that decreasing acute rejection rates between 1995 and 2000 have not led to an increase in long-term graft survival. Part of this discordance might be related to a higher proportion of acute rejections that have not resolved with full functional recovery in more recent years. However, the etiology of this concerning trend towards worse death-censored graft survival in recent years warrants further investigation.


Acute rejection has been shown to be one of the strongest negative prognostic factors for long-term graft survival after kidney transplantation (1–5). These studies implied that a reduction in early acute rejection rates would lead to improvements in long-term graft survival (5). This led investigators to postulate that early acute rejection could predict long-term graft survival (1,4). From the era of 1988 until 1996, both acute rejection rates and graft survival rates were improving (6), lending support for this postulate.

On the other hand, new therapeutic regimens, while reducing the incidence of acute rejection, have often failed to show a significant beneficial effect on long-term graft survival. In some cases this may be because trials were powered to show a statistically significant difference in acute rejection rates but not adequately powered for the rarer and later end point of graft survival.

Numerous groups have attempted to develop clinical and histopathological criteria to further characterize the relationship between acute rejection and graft survival (7–9). In most clinical trials, acute rejection rates, histological grades, and steroid responsiveness are reported, while response to therapy in terms of functional recovery usually is not. In 1972, Silcott et al. reported that of those renal allografts ‘which were unable to return the serum creatinine value to within 20% of the pre rejection levels, 93% ultimately failed and 47% of the patients died’, compared with a failure rate of 27% in patients whose grafts did recover to within 20% of pre-rejection function after acute rejection. More recent studies reemphasize the concept that acute rejection might not have any deleterious effect on long-term graft survival provided complete functional recovery is achieved (10,11).

With the present study, we first determined trends in acute rejection rates and long-term graft survival and subsequently proceeded to investigate whether continued improvements in acute rejection rates have translated into improved graft survival.


We examined all 62 103 first solitary adult (age > 17 years) transplant recipients included in the Scientific Registry of Transplant Recipients (SRTR) database who were transplanted between 1995 and 2000. The year of transplant was considered the main variable of interest to ascertain the change in graft survival, death-censored graft survival, and patient survival over the modern era. We constructed univariate models for these outcomes by year of transplantation. We then generated models to quantify the impact of the year of transplant for those recipients with a minimum of 1 year of follow-up. Subsequently, we analyzed the effect of transplantation year separately by the presence or absence of acute rejection within the first 6 months after transplant. To examine the impact of year of transplantation in conjunction with donor type (deceased or living), we utilized an interaction model in which groupings were created for each combination of year and donor type.

We measured the impact of renal function in the presence and absence of acute rejection, analyzing 38 426 adult first transplant recipients from 1995 to 2001 in the SRTR database. We excluded from the analysis recipients with less than 1 year of follow-up or missing creatinine values in the 6- and 12-month post-transplantation intervals. We then took the subset of patients who had no acute rejection indicated within the first 6 months of follow-up and used their creatinine levels at 6 and 12 months as an indication of renal function. We recorded acute rejection incidents for patients who had indications of treatment for acute rejection on their follow-up forms. Patients were followed until graft loss, death, or their last patient follow-up date as indicated in the registry.

We verified results by repeating analyses using (serum creatinine)−1 and glomerular filtration rate (GFR), calculated by both the Cockcroft-Gault (12) (adjusted for body surface area) and the Modification of Diet in Renal Disease (MDRD) (13) formulae, as measures of renal function. Calculations incorporating weight and height utilized imputed values, generated from nonmissing values of these measurements along with age as predictors, stratified by race and gender. Baseline renal function measures were calculated at 6 months and compared with levels at 12 months post-transplant follow-up. We considered patients to have returned to baseline if they had at least 95% of their 6-month renal function at 12 months; this threshold was chosen to account for some degree of measurement error and intrapatient variability. We compared outcomes of patient survival, death-censored graft survival, and overall graft survival for three groups: those without acute rejection in the first 12 months; those without acute rejection in the first 6 months who had acute rejection between 6 and 12 months and returned to baseline renal function; and those without acute rejection in the first 6 months who had acute rejection between 6 and 12 months but failed to return to baseline renal function. In subsequent analyses, patients with acute rejection who failed to return to baseline were further delineated into 85–95% baseline function, 75–85% baseline function, and <75% baseline function to measure the relative impact on the hazard of outcomes with more serious gradients of renal function reduction. Subsets of the patient population were also separately analyzed to assess whether the trends for outcomes remained consistent.
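The renal function measures described above can be sketched as follows. This is an illustrative implementation, not the registry's actual code: the standard Cockcroft-Gault formula (normalized to 1.73 m² using the DuBois body surface area equation), the four-variable MDRD equation, and the 95% return-to-baseline rule; all function names are our own.

```python
def bsa_dubois(height_cm: float, weight_kg: float) -> float:
    """DuBois body surface area estimate (m^2)."""
    return 0.007184 * height_cm ** 0.725 * weight_kg ** 0.425

def gfr_cockcroft_gault(scr_mg_dl: float, age: float, weight_kg: float,
                        height_cm: float, female: bool) -> float:
    """Cockcroft-Gault creatinine clearance (mL/min), normalized to 1.73 m^2 BSA."""
    crcl = (140 - age) * weight_kg / (72.0 * scr_mg_dl)
    if female:
        crcl *= 0.85
    return crcl * 1.73 / bsa_dubois(height_cm, weight_kg)

def gfr_mdrd(scr_mg_dl: float, age: float, female: bool, black: bool) -> float:
    """Four-variable MDRD estimate (mL/min/1.73 m^2)."""
    gfr = 186.0 * scr_mg_dl ** -1.154 * age ** -0.203
    if female:
        gfr *= 0.742
    if black:
        gfr *= 1.212
    return gfr

def returned_to_baseline(func_6mo: float, func_12mo: float,
                         threshold: float = 0.95) -> bool:
    """True if 12-month renal function is at least 95% of the 6-month baseline."""
    return func_12mo >= threshold * func_6mo
```

For example, using (serum creatinine)−1 as the measure of function, a rise in creatinine from 1.5 to 1.7 mg/dL leaves the patient at about 88% of baseline function, so under the 95% rule that episode would be classified as a failure to return to baseline.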

Outcomes were measured with univariate Kaplan-Meier models, and overall strata comparisons were assessed by log-rank tests. Multivariate Cox proportional hazards models were used to assess risk for groupings and adjust for potential confounding factors. Cox models were corrected for induction regimen, antiproliferative regimen, calcineurin inhibitor, cold ischemia time, PRA level, HLA mismatches (separated into A, B, and DR), donor and recipient age, gender, and race, presence of delayed graft function, donation type, and primary diagnosis of the recipient. Proportional hazards assumptions were tested by visually assessing log-log survival curves. Medication regimens were considered solely on an intent-to-treat basis, without regard to changes after the initial transplant period. We tested for linear trends in demographic characteristics over the years of transplant with the Cochran-Armitage trend test. The Efron method was used to handle tied outcome occurrences. All analyses were conducted with SAS software (v. 8.02; SAS Institute, Cary, NC), and a type I error probability of 0.05 was taken to indicate statistical significance.
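The Cochran-Armitage trend test used throughout (e.g. for Tables 1 and 3) has a closed form for a 2×k table of event counts against ordered scores such as year of transplant. A minimal plain-Python sketch of the standard (uncorrected) statistic might look like this; the function name and default integer scores are our own choices:

```python
import math

def cochran_armitage(events, totals, scores=None):
    """Two-sided Cochran-Armitage test for linear trend in proportions.

    events[i] successes out of totals[i] trials at ordered score scores[i]
    (default scores 0, 1, 2, ...). Returns (z, two_sided_p).
    """
    k = len(events)
    if scores is None:
        scores = list(range(k))
    n = sum(totals)
    p_bar = sum(events) / n  # pooled proportion under the null
    t = sum(s * e for s, e in zip(scores, events))            # observed trend statistic
    sm = sum(s * m for s, m in zip(scores, totals))
    e_t = p_bar * sm                                          # null expectation of t
    var_t = p_bar * (1 - p_bar) * (
        sum(m * s * s for s, m in zip(scores, totals)) - sm ** 2 / n
    )
    z = (t - e_t) / math.sqrt(var_t) if var_t > 0 else 0.0
    p = math.erfc(abs(z) / math.sqrt(2))  # two-sided p from the normal approximation
    return z, p
```

Feeding it the per-year counts of an event (e.g. return to baseline after rejection) yields a negative z for a declining trend; constant proportions give z near 0 and p near 1.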


Demographic information by year of transplantation and donation type is displayed in Table 1. For the living donor transplant population, we demonstrated over the study period a significant positive linear trend for 2-HLA-A, -B, and -DR mismatches, waiting time on dialysis greater than 24 months, diabetes as primary diagnosis, recipients older than 65 years, unrelated donations, and medications at baseline including MMF, Neoral, Prograf, generic cyclosporine, IL-2 induction, and Thymoglobulin. Significant negative trends in the living donor population existed for patients with PRA > 30 and for medications at baseline including AZA, cyclosporine, OKT3, and ATG.

Table 1. Demographic information by year of transplant
| | 1995 | 1996 | 1997 | 1998 | 1999 | 2000 | Trend test¹ (direction) |

Living transplants
| 2 HLA-A MM (%) | 10.5 | 12.8 | 12.8 | 15.5 | 16.1 | 18.3 | Positive |
| 2 HLA-B MM (%) | 15.7 | 18.8 | 19.3 | 22.0 | 23.1 | 26.0 | Positive |
| 2 HLA-DR MM (%) | 10.6 | 12.9 | 13.5 | 15.0 | 15.0 | 18.5 | Positive |
| PRA ≥ 30 (%) | … | … | … | … | … | … | … |
| Donor age > 55 (%) | … | … | … | … | … | … | … |
| Recipient age > 65 (%) | … | … | … | … | … | … | … |
| Waiting time on dialysis ≥ 24 months (%) | 15.8 | 16.2 | 15.8 | 15.5 | 15.7 | 16.3 | None |
| AA donor (%) | 13.6 | 12.8 | 13.5 | 12.8 | 14.9 | 12.9 | None |
| AA recipient (%) | 14.4 | 14.4 | … | … | … | … | … |
| Diabetes as primary disease (%) | 16.4 | 14.7 | 12.4 | 13.4 | 16.6 | 17.4 | Positive |
| Hypertension as primary disease (%) | 10.2 | 11.5 | 12.9 | 11.7 | 11.5 | 11.7 | None |
| MMF (%) | 10.1 | 44.4 | 61.9 | 70.9 | 77.0 | 71.7 | Positive |
| AZA (%) | 71.7 | 32.8 | … | … | … | … | … |
| Neoral (%) | 4.0 | 55.8 | 69.0 | 61.2 | 56.4 | 46.1 | Positive |
| Prograf (%) | 3.3 | 7.4 | 12.6 | 23.4 | 28.4 | 37.7 | Positive |
| Cyclosporine (%) | 79.8 | … | … | … | … | … | … |
| Generic cyclosporine (%) | … | … | … | … | … | … | … |
| Thymoglobulin (%) | … | … | … | … | … | … | … |
| ATG (%) | 1.7 | … | … | … | … | … | … |
| Unrelated donation (%) | 13.5 | 15.9 | … | … | … | … | … |

Deceased donor transplants
| 2 HLA-A MM (%) | 40.8 | 41.5 | 42.9 | 40.6 | 41.1 | 42.1 | None |
| 2 HLA-B MM (%) | 36.0 | 36.8 | 38.1 | 37.3 | 39.2 | 41.4 | Positive |
| 2 HLA-DR MM (%) | 18.5 | 19.4 | 21.0 | 20.8 | 22.9 | 22.8 | Positive |
| PRA ≥ 30 (%) | 14.5 | 15.1 | 14.9 | 13.8 | 15.2 | 15.7 | None |
| Donor age > 55 (%) | 14.3 | 16.1 | 16.7 | 16.6 | 17.0 | 16.8 | Positive |
| Recipient age > 65 (%) | … | … | … | … | … | … | … |
| Waiting time on dialysis ≥ 24 months (%) | 48.6 | 51.0 | 53.8 | 54.0 | 53.9 | 54.9 | Positive |
| AA donor (%) | 11.3 | 11.3 | 11.1 | 11.3 | 10.2 | 10.2 | Negative |
| AA recipient (%) | 27.3 | 27.5 | 28.4 | 27.8 | 28.1 | 29.7 | Positive |
| Diabetes as primary disease (%) | 10.8 | 10.4 | 10.3 | 10.4 | 13.0 | 15.7 | Positive |
| Hypertension as primary disease (%) | 15.5 | 16.6 | 18.2 | 17.8 | 17.7 | 18.4 | Positive |
| Cold ischemia time > 24 h (%) | 37.7 | 36.2 | 31.6 | 31.4 | 28.3 | 25.3 | Negative |
| MMF (%) | 10.2 | 42.8 | 62.5 | 68.4 | 74.8 | 72.4 | Positive |
| AZA (%) | 71.3 | 30.9 | 18.8 | 10.9 | 6.9 | 4.5 | Negative |
| Neoral (%) | 2.1 | 50.9 | 64.5 | 59.8 | 49.8 | 43.5 | Positive |
| Prograf (%) | 5.2 | 11.4 | 17.4 | 22.8 | 30.1 | 39.8 | Positive |
| Cyclosporine (%) | 79.9 | … | … | … | … | … | … |
| Generic cyclosporine (%) | … | … | … | … | … | … | … |
| Thymoglobulin (%) | … | … | … | … | … | … | … |
| ATG (%) | 1.4 | 16.7 | … | … | … | … | … |

1. Cochran-Armitage trend test: indications for significant trends (α < 0.05, two-sided test) over year of transplant.

The 2-year survival estimates from univariate analyses for overall graft survival by donation type for recipients with and without indications of acute rejection within 6 months of transplant are displayed in Table 2. The rates of early and late acute rejection episodes are displayed in Figure 1; rates showed a strong decline after 1996 in the follow-up periods that we examined.

Table 2. Two-tier univariate overall graft survival rates for recipients with and without indications of acute rejection within 6 months post-transplant by donation type
| | No indication of acute rejection | | Acute rejection | |
| Year of transplant | Deceased donor | Living | Deceased donor | Living |
  1. Recipients utilized in calculations had a minimum 6-month follow-up period.

Figure 1.

Incidence of early and late acute rejection episodes by era. Indications of rejection are not independent; patients may have contributed repeated episodes of rejection in different follow-up periods.

Cox model hazard estimates and 95% confidence intervals, adjusted for covariates, for the outcome of overall graft loss in deceased donor transplants 1995–2000 demonstrated a slight elevation in risk, while living donor transplants were rather flat in terms of risk estimates by year of transplant (Figure 2). As displayed in Figure 3, death-censored graft loss hazard estimates for the 1995–2000 period suggest an increased risk in more recent years for both deceased donor and living transplants. We also examined the outcome of patient death; hazard estimates for this outcome in the deceased donor transplant subset for 1996–2000 (with 1995 as reference) were 0.994 (0.902, 1.095), 1.134 (1.019, 1.262), 0.984 (0.876, 1.105), 1.077 (0.952, 1.219), and 1.103 (0.963, 1.264). Living transplants' relative risks for 1995–2000 with the same reference group (deceased donor transplants in 1995) were 0.752 (0.653, 0.865), 0.710 (0.606, 0.832), 0.740 (0.625, 0.876), 0.770 (0.646, 0.917), 0.660 (0.542, 0.803), and 0.660 (0.537, 0.810), respectively.

Figure 2.

Relative risk for overall graft loss by donor type. Model corrected for induction, antiproliferative, and calcineurin inhibitor medication regimens at baseline, cold ischemia time, PRA level, HLA-A, -B, and -DR mismatches, recipient and donor gender, ethnicity, and age, presence of delayed graft function, primary diagnosis, and waiting time on dialysis.

Figure 3.

Relative risk for death-censored graft loss by donor type. Model corrected for induction, antiproliferative, and calcineurin inhibitor medication regimens at baseline, cold ischemia time, PRA level, HLA-A, -B, and -DR mismatches, recipient and donor gender, ethnicity, and age, presence of delayed graft function, primary diagnosis, and waiting time on dialysis.

To examine the impact of acute rejection, we analyzed outcomes of those who experienced no acute rejection in the 6–12-month period post transplant, those who experienced acute rejection and returned to baseline renal function, and those who experienced acute rejection and failed to return to baseline. Utilizing (serum creatinine)−1 as the measure of renal function, 53.9% of patients (1237/2296) returned to their 6-month baseline renal function level at 12 months. This return rate showed a negative trend by year of transplant, as displayed in Table 3, peaking in 1996 at 68.4% and dipping to 40.6% in 2000; this linear trend towards a reduced rate of return to baseline was statistically significant by the Cochran-Armitage trend test (p < 0.001). Over the entire follow-up periods that we investigated, later acute rejection rates were consistently higher in the groups that had displayed acute rejection in the second 6-month post-transplant follow-up period. The rate of rejection in the second year post-transplant was 5.7% for the no acute rejection group, 16.3% for the return to baseline group, and 27.7% for the fail to return to baseline group. In a similar fashion, during the third year post transplant the rejection rates were 3.5% for the no acute rejection group, 6.4% for the return to baseline group, and 11.3% for the fail to return to baseline group. Rates in the fourth year post-transplant followed the same pattern: 3.0% for the no acute rejection group, 5.0% for the return to baseline group, and 8.1% for the failure to return to baseline group.

Table 3. Rate of return to baseline function after acute rejection by era
| Year of transplant | Return to baseline | No return to baseline | Rate of return |
  1. Return to baseline function estimated by 1/Scr.

  2. Significant linear trend (p < 0.001) towards no return to baseline as tested by the Cochran-Armitage trend test.


We analyzed the impact of the acute rejection/renal function groupings on overall graft survival. Examining the outcomes for the three group designations (no acute rejection, rejection with return to baseline, and rejection without return to baseline) utilizing Cockcroft-Gault as an estimate of GFR yielded 3-year unadjusted survival rates of 91.5%, 91.1%, and 72.1%, respectively (see Table 4). The 6-year unadjusted rates were 74.4%, 72.7%, and 50.4%, with the overall test for equality of strata again being highly significant (p < 0.0001). When testing the model with the first two strata only, there was no significant difference between the group with no acute rejection and the group with acute rejection and return to baseline (p = 0.5289). Subsequently, we examined the results using the MDRD estimate of GFR and (serum creatinine)−1 and found very similar results. We also analyzed overall graft survival by use of five acute rejection/renal function levels (no acute rejection, acute rejection and return to baseline, acute rejection and return to 85–95% renal function, acute rejection and return to 75–85% renal function, and acute rejection and return to less than 75% renal function). The 6-year overall unadjusted graft survival rates by functional status after rejection, with Cockcroft-Gault as the estimate of GFR, were 74.4%, 72.7%, 67.0%, 50.2%, and 38.0%, respectively (see Figure 4).

Table 4. Unadjusted overall graft survival by acute rejection/renal function group
| Group | 3-year overall graft survival (%) | 6-year overall graft survival (%) |
| No acute rejection | 91.5 | 74.4 |
| Acute rejection and return to baseline | 91.1 | 72.7 |
| Acute rejection and fail to return to baseline | 72.1 | 50.4 |
Figure 4.

Kaplan-Meier plot of overall graft survival by acute rejection/glomerular filtration rate grouping levels.
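The unadjusted survival rates reported here come from univariate Kaplan-Meier models. For readers unfamiliar with the product-limit estimator, a minimal self-contained sketch (our own illustrative code, not the registry analysis) is:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit survival estimate.

    times: follow-up time for each subject.
    events: 1 = graft loss observed, 0 = censored (death-censored or lost to follow-up).
    Returns a list of (time, S(t)) steps, one per distinct event time.
    """
    at_risk = len(times)
    s = 1.0
    curve = []
    # Walk through distinct times in order; events drop the survival curve,
    # censorings only shrink the risk set for later times.
    for t in sorted(set(times)):
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        c = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 0)
        if d > 0:
            s *= 1 - d / at_risk
            curve.append((t, s))
        at_risk -= d + c
    return curve
```

With hypothetical follow-up data such as times (1, 2, 2, 3, 4) years and event indicators (1, 0, 1, 1, 0), the estimate steps down only at the observed graft losses, which is why the group curves in Figure 4 are step functions.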

We were also interested in measuring death-censored graft survival for the same grouping levels. Utilizing the five grouping levels as stated previously, and Cockcroft-Gault estimated GFR as the measure of renal function, the 6-year death-censored survival rates were 84.7%, 82.3%, 77.6%, 60.1%, and 45.1%, respectively. After adjusting for the relevant covariates in a Cox model, patients with acute rejection but functional recovery to within 5% of baseline function had a relative risk of 1.046 (0.859, 1.273) for death-censored graft failure, and those who failed to reach baseline following acute rejection had a relative risk of 3.077 (2.691, 3.519) for death-censored graft loss, when compared with patients without acute rejection. For the five renal function groupings following acute rejection, with the no acute rejection group as the reference, the risk estimates were 1.067 (0.882, 1.291), 1.223 (0.874, 1.713), 2.739 (2.024, 3.705), and 5.130 (4.332, 6.076), respectively (Table 5). To confirm that our findings were applicable in particular substrata, we repeated this portion of the analysis by donor type, racial subsets, and for patients classified by initial GFR function; the results were similar for each of these.

Table 5. Multivariate risk estimates for death-censored graft survival by acute rejection status and functional return to baseline
Reference group: no acute rejection.
| Acute rejection and return to functional baseline group¹ | Hazard | Confidence interval |
| Acute rejection and return to ≥95% of baseline | 1.067 | (0.882, 1.291) |
| Acute rejection and return to 85–95% of baseline | 1.223 | (0.874, 1.713) |
| Acute rejection and return to 75–85% of baseline | 2.739 | (2.024, 3.705) |
| Acute rejection and return to <75% of baseline | 5.130 | (4.332, 6.076) |

1. Return to baseline function estimated by calculated creatinine clearance (Cockcroft-Gault).


As immunosuppressive regimens have evolved, acute rejection rates after renal transplantation have progressively decreased. In addition, both short- and long-term graft survival had been improving from 1988 until 1996 (6). The data presented in our study suggest that this improvement in graft survival has not continued over the period of 1995–2000. Overall graft survival and patient survival have remained unchanged since 1995, while death-censored graft survival shows a significant decrease over the same time period. This lack of improvement occurred in the setting of an almost halving of early and late acute rejection rates during the same period. This potential discordance between trends in acute rejection rates and trends in long-term graft survival has been observed in several recent clinical trials.

Acute rejection has been used in many studies as the primary endpoint, under the assumption that reduced acute rejection rates would ultimately lead to better graft survival. The data analyzed here call this assumption into question and furthermore imply that the achievement of ever lower rejection rates does not necessarily lead to improved graft survival.

A partial explanation of this apparent paradox is that not all acute rejections are the same. In fact, our data reemphasize the idea that the functional response of the acute rejection episode to therapy is important in distinguishing rejection episodes that impact graft survival from those that do not. Rejection episodes that did not affect renal function did not seem to have any impact on graft survival. The profound impact on graft survival of acute rejection episodes that do not return to a functional baseline could in part result from the immediate structural damage the rejection episode inflicts on the graft. Additionally, acute rejection episodes that do not respond well to treatment are possible markers of an increased risk for subsequent late rejection. In fact, in our data the repeat acute rejection rate was higher in patients with primary rejection who did not return to baseline, and the poor graft survival rate was probably accentuated by this.

Our study indicated that since 1995 there has been a trend towards fewer rejections returning to baseline function after treatment. That might in fact account for part of the observed discordance between rejection rates and graft survival. It is possible that many of the acute rejection episodes that were less severe in terms of effects on functional status are those that were eliminated in the more recent era, while rejections with stronger functional impact persisted. A previous registry analysis had suggested that in the recent era acute rejection episodes had a stronger impact on long-term graft survival, potentially secondary to a disproportionate reduction in milder rejection episodes (14). However, it is unlikely that this finding can completely account for the lack of correlation between the reduction in acute rejection rates and the trend in graft survival. Other potential reasons might include transplantation of higher-risk donors and recipients in more recent years (note that an attempt was made to correct for this in the multivariate analysis).

Additionally, other factors affecting graft survival might be changing over time. It is conceivable that while acute rejection is becoming less frequent with more efficacious immunosuppression, the effects of overimmunosuppression, such as polyomavirus nephropathy, are becoming more frequent. Maintenance immunosuppression has likely also changed during this time period, particularly given the recent emphasis on immunosuppression minimization and withdrawal trials.

In summary, despite impressive reductions in acute rejection rates since 1995, the favorable trend in graft survival observed in previous years has not continued. Changes in acute rejection rates do not seem to reliably correlate with the later end point of graft loss. However, as the risk for graft loss differs between acute rejections that lead to functional deterioration and those that do not, this distinction should be reported when novel therapeutic regimens are studied. The reason for the lack of graft survival improvement in the more recent era needs to be investigated in more detail. In this era of novel therapies, some caution may be needed in extrapolating from improvements in intermediate end points to improvements in long-term outcomes.


The data reported here were supplied by the U.S. Scientific Registry of Transplant Recipients (SRTR). The interpretation and reporting of these data are the responsibility of the authors and in no way represent an official policy or interpretation of the U.S. Government.

Part of these data were presented at the annual meeting of the American Society of Transplantation in Washington in 2003.

We would like to express our appreciation to Suzanne C. Johnson, who helped with the editing and reviewing of the paper.