Liver transplantation (LTx) is an effective therapy of last resort for patients with end-stage liver disease, of whom approximately 40% to 50% have cirrhosis due to infection with hepatitis C virus (HCV). Advances in surgical techniques and perioperative management have greatly improved patient and graft survival post-LTx over the past 2 decades.1
Even though the number of liver transplants performed in the United States has increased in recent years (from approximately 5000 in 2000 to 6400 in 2007) while the number of new registrations on the waiting list has remained relatively constant over the same period (10,751 and 11,080 patients, respectively),2 the gap between organ availability and listed patients continues to cause considerable waitlist mortality, keeping organ allocation at the forefront of discussion within the transplant community. Techniques such as living donor liver transplantation (LDLTx), split or partial LTx, donation after cardiac death, and the use of marginal or extended criteria donor grafts have evolved over the past decade to help meet this need. LDLTx currently accounts for close to 5% of liver transplants in the United States; despite the rapid increase in the number of centers performing LDLTx since the late 1990s, the number of procedures has declined in the last 3 years, mainly because of donor safety concerns.2
Another accepted practice for increasing the supply of grafts is the use of older donors, despite a lack of long-term data supporting this practice. Indeed, approximately 35% of liver grafts used in 2005 came from donors older than 50 years,2 and this trend is expected to continue. Although many transplant centers consider older donor age an important risk factor,3 there is no consensus on the effect of advanced age alone or in combination with other risk factors such as obesity, cause of death, and long cold ischemia time. Recently, 2 reports4, 5 have attempted to quantify the relative risk (RR) of specific donor characteristics on posttransplant graft outcomes, with the main goal of unifying and quantifying the definition of the extended criteria or marginal donor.
Unfortunately, long-term graft and patient survival in HCV-infected recipients is lower than that in recipients transplanted for other underlying liver diseases.6, 7 Several mechanisms likely contribute to this observation, including rapid disease recurrence and fibrosis progression,8, 9 an increased risk of infection,10, 11 and an increased risk of nonhepatic diseases such as diabetes mellitus12 and renal insufficiency,13 all of which occur more frequently in HCV-infected patients than in patients with other primary liver diseases. We therefore hypothesized that HCV-infected transplant recipients may be particularly vulnerable to the added adverse effects of receiving a graft from a nonoptimal donor and may suffer a reduction in graft and patient survival post-LTx beyond that seen in recipients of standard grafts.14–16 The current work extends our recent observation17 that the use of grafts with a high donor risk index (DRI) adversely affects patient and graft survival for LTx recipients.
CI, confidence interval; CNS, central nervous system; COPD, chronic obstructive pulmonary disease; DRI, donor risk index; HCC, hepatocellular carcinoma; HCV, hepatitis C virus; ICU, intensive care unit; LDLTx, living donor liver transplantation; LTx, liver transplantation; MELD, Model for End-Stage Liver Disease; NS, not significant; OPTN, Organ Procurement and Transplantation Network; RR, relative risk; SRTR, Scientific Registry of Transplant Recipients; TIPS, transjugular intrahepatic portosystemic shunt.
PATIENTS AND METHODS
Donor and recipient data from the Organ Procurement and Transplantation Network (OPTN) database as of March 7, 2008 were used for the analysis. The cohort consisted of adult liver allograft recipients transplanted between January 1, 2000 and June 30, 2006. Patients with repeat transplants, multiorgan transplants, and pediatric transplants (<18 years old) were excluded from the analysis. In addition, status 1 recipients and recipients with the diagnosis of hepatocellular carcinoma or any approved Model for End-Stage Liver Disease (MELD) exception at the time of transplant were excluded; this left a total of 20,317 transplants available for analysis. Recipients were considered to have HCV-induced liver disease on the basis of serological results because the results of HCV antibody or recombinant immunoblot assay tests, but not HCV RNA results, were submitted to the OPTN at the time of transplant. After the exclusion of those cases in which the HCV serology results were not available or equivocal, the final cohort consisted of 16,678 adult LTx recipients. To ensure that the exclusion of these patients did not bias our results, we examined the characteristics of the recipients with and without HCV data. The 2 groups (HCV serology available/HCV serology unavailable) were similar in terms of median age (52 years/52 years), gender (66.7%/65.2% male), African American ethnicity (7.8%/7.2%), and median DRI (1.33/1.41). Patient survival was also similar between the 2 groups (87.6%/87.0%, 79.9%/78.2%, and 73.4%/73.0%, respectively, at 1, 3, and 5 years).
The DRI was calculated as described by Feng et al.4 The DRI was derived from a retrospective analysis of donor characteristics predictive of liver allograft failure from the Scientific Registry of Transplant Recipients (SRTR). Donor characteristics associated with significantly higher rates of graft failure included 3 demographic characteristics (age over 40 years and particularly over 60 years, African American race, and shorter stature), 3 characteristics of the cause and type of donor death (donation after cardiac death, death by cerebrovascular accident, and death other than trauma, stroke, or anoxia), and LTx with a split or partial graft.
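The structure of the DRI, an exponentiated sum of weighted donor characteristics, can be sketched as follows. The coefficients, category cutoffs, and the `donor_risk_index` helper below are illustrative placeholders only, not the published values; the exact formula appears in Feng et al.4:

```python
import math

# Minimal sketch of the DRI's functional form: exp(sum of weighted donor
# characteristics). All coefficients here are ILLUSTRATIVE placeholders,
# NOT the published values; see Feng et al.4 for the exact formula.
COEFS = {
    "age_40_49": 0.15,
    "age_50_59": 0.27,
    "age_60_plus": 0.42,
    "cod_cva": 0.15,             # death by cerebrovascular accident
    "cod_other": 0.18,           # death other than trauma, stroke, or anoxia
    "african_american": 0.18,
    "dcd": 0.41,                 # donation after cardiac death
    "split_graft": 0.42,         # split or partial graft
    "per_10cm_below_170": 0.07,  # shorter donor stature
}

def donor_risk_index(age, cod, race, dcd, split_graft, height_cm):
    """Return an illustrative DRI; 1.0 corresponds to the reference donor."""
    score = 0.0
    if 40 <= age < 50:
        score += COEFS["age_40_49"]
    elif 50 <= age < 60:
        score += COEFS["age_50_59"]
    elif age >= 60:
        score += COEFS["age_60_plus"]
    if cod == "cva":
        score += COEFS["cod_cva"]
    elif cod == "other":
        score += COEFS["cod_other"]
    if race == "african_american":
        score += COEFS["african_american"]
    if dcd:
        score += COEFS["dcd"]
    if split_graft:
        score += COEFS["split_graft"]
    score += COEFS["per_10cm_below_170"] * max(0.0, (170 - height_cm) / 10)
    return math.exp(score)

# A young trauma donor maps to the reference DRI of 1.0:
print(donor_risk_index(25, "trauma", "white", False, False, 175))  # 1.0
```

Because the index is multiplicative, each additional risk factor scales the DRI by a fixed factor rather than adding a fixed amount.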
Our model allowed the DRI to vary across its entire observed range within our donor sample and tested its impact on HCV(+) and HCV(−) recipients at every level. Moreover, the Wald test was used to assess the effect of the recipient MELD score at transplant on the DRI–HCV interaction.
RR estimates were obtained from Cox regression models of time to graft failure or time to death after transplant. The models contained several factors, including the DRI, recipient HCV status and other diagnoses, recipient MELD score at transplant, age, gender, ethnicity, comorbidities, and year of transplant. Additionally, the models included interaction effects between recipient hepatitis C status and the DRI. Comorbid factors, including ascites, bacterial peritonitis, symptomatic cerebrovascular disease, diabetes, dialysis at transplant, drug-treated chronic obstructive pulmonary disease, drug-treated hypertension, peptic ulcer disease, peripheral vascular disease, portal vein thrombosis, gastrointestinal bleeding, previous abdominal surgery, previous malignancies, variceal bleeding, the presence of a transjugular intrahepatic portosystemic shunt, and the discovery of an incidental tumor at transplant, were analyzed to determine whether they affected outcome after adjustment for the DRI and the recipient factors listed previously. In the survival analysis, deaths were counted as graft failures. For living recipients, survival was censored at the time of the latest follow-up available to the OPTN (March 7, 2008). Adjusted survival rates from the Cox models were calculated under the assumption of a hospitalized patient, a transplant year of 2006, and all other factors set to their reference values. For continuous variables, observations with missing data were set to the median value. For categorical variables, observations with missing data were set to the modal category. All statistical calculations were performed with SAS for Windows, version 9.1, and R for Windows, version 2.4.1.
RESULTS
Data analysis was performed for 16,678 adult LTx recipients and their donors transplanted between January 1, 2000 and June 30, 2006. Donor and recipient demographics are presented in Table 1 and were compared with Fisher's exact test. Moreover, all factors were controlled for by inclusion in the analysis.
Recipient demographics included Caucasian race (76.1%), male sex (66.7%), and a median age of 52 years (range, 18–80 years). Forty-six percent of the recipients (n = 7675) were HCV(+) by serology, and 9003 were HCV(−). The reported etiologies were nonalcoholic cirrhosis in 41.6% of recipients, alcoholic cirrhosis in 23.3%, primary biliary cirrhosis in 5.4%, primary sclerosing cholangitis in 5.9%, acute hepatic necrosis in 5.1%, and other etiologies in 17.7%. The median DRI in our donor cohort was 1.3 (range, 0.77–4.27), and the median follow-up time was 1081 days (range, 0–2959 days), with 4889 total graft failures and 3745 deaths reported.
Preliminary models containing all covariates were fit without the HCV-DRI interaction term. A positive HCV serology had a statistically significant impact on the risk of graft failure (RR = 1.36, P < 0.0001) and death (RR = 1.45, P < 0.0001) in these models. We then fit models using the same covariates to examine the interaction between HCV and DRI.
Figure 1A shows the effect of DRI and HCV status in the Cox regression model of the RR of graft failure after transplantation. An increasing DRI was associated with a statistically significant increase in the RR of failure for both HCV(+) and HCV(−) recipients. However, the curves begin to diverge at a relatively low DRI (approximately 1.5), and the RR of graft failure increases to a far greater extent in HCV(+) recipients than in HCV(−) recipients, with a statistically significant interaction between DRI and HCV (P = 0.004). At a DRI of 3.0 versus a DRI of 1.0, for example, the estimated RR of graft failure was 4.12 [95% confidence interval (CI), 3.41–4.97] for HCV(+) recipients versus 2.80 (95% CI, 2.32–3.38) for HCV(−) recipients (Table 2A). Figure 1B demonstrates a similar effect of HCV status on the RR of patient death after transplant. A DRI of 3.0 versus a DRI of 1.0 was associated with a 3.3-fold RR of death in HCV(+) recipients but only a 1.7-fold RR of death in HCV(−) recipients (Table 2B). Again, Cox regression modeling indicated a statistically significant interaction between DRI and HCV (P = 0.0002).
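These interaction estimates follow directly from the Cox model's functional form: the RR over a DRI change ΔD is exp[(β_DRI + β_int × HCV) × ΔD]. The sketch below back-solves illustrative coefficients from the relative risks reported above; the `rr_graft_failure` helper and its coefficients are derived for illustration, not taken from the fitted model output:

```python
import math

# Back-solve illustrative Cox coefficients from the reported relative risks
# for graft failure at DRI 3.0 vs 1.0: RR = 2.8 for HCV(-), 4.12 for HCV(+).
delta_dri = 3.0 - 1.0
beta_dri = math.log(2.8) / delta_dri               # main DRI effect (HCV(-) slope)
beta_int = math.log(4.12) / delta_dri - beta_dri   # HCV x DRI interaction

def rr_graft_failure(dri, hcv_positive):
    """RR of graft failure at a given DRI relative to DRI = 1.0."""
    slope = beta_dri + (beta_int if hcv_positive else 0.0)
    return math.exp(slope * (dri - 1.0))

print(round(rr_graft_failure(3.0, False), 2))  # 2.8
print(round(rr_graft_failure(3.0, True), 2))   # 4.12
```

Because the interaction coefficient multiplies the DRI itself, the excess risk carried by HCV(+) status grows exponentially with the DRI rather than staying constant, which is why the curves in Fig. 1 diverge.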
Table 2. Relative Risks. A: Relative Risk of Graft Failure by DRI and HCV Status; B: Relative Risk of Death by DRI and HCV Status [values given as Relative Risk (95% CI)].
NOTE: Data are adjusted for the recipient Model for End-Stage Liver Disease score at transplant, age, gender, ethnicity, year of transplant, diabetes, dialysis, history of portal vein thrombosis, previous abdominal surgery, previous malignancy, transjugular intrahepatic portosystemic shunt, and incidental tumor found at transplant.
Adjusted graft and patient survival rates from Cox models are depicted in Figs. 2 and 3, respectively. After adjustment for the recipient MELD score at transplant, age, gender, ethnicity, year of transplant, diabetes, dialysis, portal vein thrombosis, previous abdominal surgery, previous malignancy, the presence of a transjugular intrahepatic portosystemic shunt, and the discovery of an incidental tumor at transplant, a significant interaction between HCV and DRI remained, indicating that the negative impact of HCV on outcomes is enhanced with increasing DRI. As shown in Fig. 2, for example, when the DRI was 1.4, the expected 3-year graft survival rate was 78.5% for HCV(−) recipients versus 72.3% for HCV(+) recipients (absolute difference, 6.2%). Similarly, at a DRI of 2.8, the expected 3-year graft survival rate was 60.8% for HCV(−) recipients versus 41.7% for HCV(+) recipients (absolute difference, 19.1%). The recipient MELD score at transplant was not found to significantly affect the DRI-HCV interaction for graft survival (P = 0.517) or patient survival (P = 0.778; Wald test).
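Under the proportional hazards assumption behind these adjusted curves, two survival probabilities at the same time point imply a hazard ratio of ln(S_b)/ln(S_a). Applying this to the 3-year graft survival rates reported above gives a rough check on how the widening survival gap corresponds to a growing HCV hazard ratio at higher DRI (the `implied_hazard_ratio` helper is ours, for illustration only):

```python
import math

# Under proportional hazards, S_b(t) = S_a(t) ** HR, so two adjusted
# survival probabilities at the same time point imply HR = ln(S_b)/ln(S_a).
def implied_hazard_ratio(surv_ref, surv_cmp):
    """Hazard ratio implied by two survival probabilities at one time point."""
    return math.log(surv_cmp) / math.log(surv_ref)

# Reported 3-year graft survival, HCV(-) vs HCV(+):
hr_low_dri = implied_hazard_ratio(0.785, 0.723)   # DRI 1.4
hr_high_dri = implied_hazard_ratio(0.608, 0.417)  # DRI 2.8
print(round(hr_low_dri, 2), round(hr_high_dri, 2))  # 1.34 1.76
```

The implied HCV(+) versus HCV(−) hazard ratio roughly 1.3-fold at a DRI of 1.4 versus roughly 1.8-fold at a DRI of 2.8 is consistent with the interaction effect described in the Cox models.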
An important component of the DRI, donor age, has previously been shown to adversely affect outcomes of HCV(+) recipients. Therefore, Cox modeling was also used to compare outcomes according to HCV status with donor age alone versus the full DRI. Consistent with previous reports, donor age was the predominant component of the DRI predicting adverse outcome in LTx recipients and accounted for 70% of the relationship, whereas other components accounted for the remaining 30%.
To investigate this further, we first fit a base graft survival model containing all covariates except donor age and the DRI and then added either donor age or the DRI to it. Adding donor age to the base model for graft survival increased the model log likelihood by 98.34, whereas adding the DRI increased it by 147.9, indicating that the DRI fits the data better than donor age alone.
Similarly, for patient survival, adding donor age to the base model increased the model log likelihood by 46.26, whereas adding the DRI increased it by 52.65. Again, this indicates that the DRI provides a better fit to our data than donor age alone.
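These log likelihood gains can be read as likelihood-ratio statistics (twice the gain) for adding each donor factor to the base model. A sketch, assuming for simplicity that each factor contributes a single degree of freedom (the actual degrees of freedom depend on how donor age and the DRI components were coded):

```python
import math

def lr_stat(delta_loglik):
    """Likelihood-ratio statistic: twice the gain in log likelihood."""
    return 2.0 * delta_loglik

def chi2_sf_1df(x):
    """Survival function of a chi-square with 1 df: P(X > x) = erfc(sqrt(x/2))."""
    return math.erfc(math.sqrt(x / 2.0))

# Reported log likelihood gains over the base Cox models:
gains = {
    "donor age, graft survival": 98.34,
    "DRI, graft survival": 147.9,
    "donor age, patient survival": 46.26,
    "DRI, patient survival": 52.65,
}
for factor, gain in gains.items():
    stat = lr_stat(gain)
    print(f"{factor}: LR = {stat:.2f}, P ~ {chi2_sf_1df(stat):.1e}")
```

All four additions are overwhelmingly significant on their own; the comparison of interest is the consistently larger gain for the DRI than for donor age within each outcome.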
Of the comorbidity factors evaluated, the presence of diabetes, dialysis, a history of portal vein thrombosis, previous abdominal surgery, previous malignancy, and an incidental tumor discovered at transplant also had a statistically significant impact on outcome (either graft or patient survival; Table 3). In contrast, female gender had no significant effect on the RR of graft failure (RR = 1.01, P = 0.269) or death (RR = 0.97, P = 0.820). Significantly different causes of death reported to the OPTN database for HCV(+) and HCV(−) recipients included hepatitis (12.75% versus 0.73%, P < 0.0001), graft failure (6.15% versus 8.39%, P = 0.008), cardiovascular events (10.82% versus 14.58%, P = 0.0005), infection (14.58% versus 17.45%, P = 0.016), and malignancy (8.02% versus 10.70%, P = 0.005).
Table 3. Cox Model Results for Factors Other than the Donor Risk Index and Hepatitis C Virus
We also tested the HCV-DRI interaction after excluding patients who died in the first 3 months posttransplant and found no differences in comparison with the group used in our formal analyses.
DISCUSSION
HCV infection is the single most common cause of end-stage liver disease and need for LTx in the United States.2 Unfortunately, posttransplant reinfection of the allograft, which occurs universally, leads to aggressive hepatitis and fibrosis in many patients, and 5- and 10-year posttransplant survival for HCV(+) liver recipients is significantly lower than that for HCV(−) recipients.7–9 In data from the OPTN database, Forman et al.7 demonstrated that LTx in HCV-positive recipients was associated with an increased rate of death (hazard ratio, 1.23; 95% CI, 1.12–1.35) and allograft failure (hazard ratio, 1.30; 95% CI, 1.21–1.39) in comparison with transplantation in HCV-negative recipients. Our analysis supports this finding and also identifies a negative interaction between recipient HCV serology and increasing DRI, such that, for a given DRI, the risk of graft failure (P < 0.0001) and death (P < 0.0001) is increased in HCV(+) patients in comparison with HCV(−) patients. Cirrhosis has been reported in up to 30% of HCV(+) LTx recipients within 5 years,8 and once cirrhosis is established, 40% of patients develop hepatic decompensation within 1 year.18, 19 Most series suggest that antiviral therapy for HCV recurrence is substantially less effective and less well tolerated than it is in the nontransplant population,20 and survival after retransplantation for graft cirrhosis has been shown to be poor, particularly in patients with renal failure.21 These facts mandate that all measures to optimize outcomes in HCV-infected recipients be considered at the time of transplantation. Unfortunately, many risk factors for HCV(+) LTx recipients, including pretransplant recipient factors, donor factors, viral factors, and posttransplant factors, are difficult to modify.20, 22
The choice of a liver allograft with attributes predicting an adverse long-term outcome offers a unique opportunity to change the natural history of the HCV-infected recipient. However, waiting candidates and their doctors are offered grafts one by one. They do not have the opportunity to choose from a group of grafts at any one time. Offers must be accepted or rejected on the basis of not only the likelihood of success with that graft but also the likelihood that another, better offer will come in the future and that the refusing candidate will still be alive and able to undergo transplantation if and when that subsequent offer comes.
There is no consensus on the definition of a marginal or extended criteria donor. The DRI, derived from a retrospective SRTR analysis of donor characteristics predictive of liver allograft failure (see Patients and Methods), captures donor age, African American race, shorter stature, donation after cardiac death, death by cerebrovascular accident, death other than trauma, stroke, or anoxia, and transplantation with a split or partial graft. However, other risk factors such as graft steatosis, high vasopressor requirements, and other clinical attributes of the donor must also be carefully considered by the transplant physician and balanced against the risk of recipient mortality on the waiting list should the graft offer be declined.
Advanced donor age has been recognized as a predictor of adverse outcome in HCV(+) recipients compared to those without HCV.3, 23, 24 However, other donor characteristics identified by Feng et al.4 and included in the DRI contribute to a poorer outcome for LTx recipients, regardless of etiology. Our observations are the first to suggest that a higher DRI incrementally increases the risk of graft loss and patient death to a greater extent in HCV(+) recipients than in HCV(−) recipients. Although the effect is largely attributable to advanced donor age, other components of the DRI also contribute, suggesting that the DRI may be a more accurate predictor of adverse outcome than age alone. As noted previously, adding donor age alone to our base model for graft survival increased the model log likelihood by 98.34, whereas adding the DRI increased it by 147.9, and similar results were observed for patient survival. This indicates that the DRI characterizes donor quality better than donor age alone.
Several studies have suggested that the rate of fibrosis progression after LTx in HCV(+) recipients is proportional to donor age,24–26 and this is a likely contributory explanation for the observations noted in the present study. One analysis of the SRTR reported that the risk of premature graft loss begins to increase with donors > 40 years of age, with hazard ratios for graft loss of 1.67 (95% CI, 1.34–2.09) for donors 41 to 50 years of age, 1.86 (95% CI, 1.48–2.34) for donors 51 to 60 years of age, and 2.21 (95% CI, 1.73–2.81) for donors > 60 years of age in comparison with recipients without viral hepatitis.23 Indeed, the recognition of the adverse effect of older donor age on the post-LTx course of HCV(+) recipients may have already changed clinical practice, as the proportion of HCV(+) recipients receiving a high-DRI graft in our analysis was lower than that of HCV(−) recipients [8% of HCV(+) recipients received a graft with DRI ≥ 2, whereas 10.4% of HCV(−) recipients received a graft with DRI ≥ 2; data not shown]. Interestingly, older donor age has not been shown to predict adverse outcome in patients transplanted for chronic hepatitis B,23 presumably because graft reinfection with hepatitis B virus is now rare since the introduction of hepatitis B immune globulin and nucleoside analogue prophylaxis.27
Recipient factors also play an important role in the posttransplant course of recurrent hepatitis C. Indeed, the rate of fibrosis progression in an individual transplant recipient with recurrent HCV is highly variable, with some patients enjoying a relatively indolent course.11 Rapid fibrosis progression after LTx has been correlated with recipient characteristics such as treatment of acute rejection (with steroids or anti-lymphocyte therapies), cytomegalovirus infection, and pretransplant HCV viral load.28–30
Our analysis identified other recipient cofactors such as diabetes, dialysis, portal vein thrombosis, previous abdominal surgery, previous malignancy, and incidental tumors with statistically significant adverse impact on recipient outcome. These findings are consistent with previous studies that have identified HCV-induced, nonhepatic complications as significant sources of morbidity after LTx.10–13
The effect of donor gender and donor-recipient gender mismatch on recipient outcome remains controversial.31 In contrast to other analyses, we found no significant effect of donor gender on recipient outcomes, regardless of the HCV serology.
There are limitations to this study. First, studies using large electronic databases are prone to misclassification bias. Second, we defined the HCV(+) group only by HCV serology, which was available from the OPTN/United Network for Organ Sharing database, rather than by HCV RNA, which confirms active infection but is not captured. Although we performed an analysis to ensure that the exclusion of patients without available HCV serology did not change our conclusions, undoubtedly some patients with HCV antibodies were not actively infected with HCV as a result of spontaneous clearance of the virus, false-positive serologies, or successful antiviral therapy. However, the number of HCV antibody–positive, HCV RNA–negative patients undergoing LTx is likely to be small, and their inclusion in our HCV(+) group would, if anything, decrease the magnitude of our results.
In summary, we identified a synergistic, adverse interaction between the donor DRI and recipient HCV status that increased the risk of graft loss and patient death in HCV(+) LTx recipients with each unit increase in the DRI. This observation suggests that donor age primarily, but also the other donor characteristics captured by the DRI, affects long-term transplant outcome. High-DRI grafts should be used cautiously in HCV(+) patients. However, even though the MELD score does not affect this interaction, the high risk of mortality in recipients with high MELD scores (>20) may still justify the use of a high-DRI graft.
We dedicate this article to the memory of our senior author and colleague, Dr. H. Myron Kauffman, whose constant enthusiasm and support enlightened the field of organ transplantation.