Abstract
Due to the increasing use of allografts from donation after cardiac death (DCD) donors, we evaluated DCD liver transplants and the impact of recipient and donor factors on graft survival. Liver transplants from DCD donors reported to UNOS were analyzed against donation after brain death (DBD) donor liver transplants performed between 1996 and 2003. We defined a recipient cumulative relative risk (RCRR) using significant risk factors identified from a Cox regression analysis: age, medical condition at transplantation, regraft status, receipt of dialysis and serum creatinine. Graft survival rates from DCD donors (71% at 1 year and 60% at 3 years) were significantly inferior to those from DBD donors (80% at 1 year and 72% at 3 years, p < 0.001). Low-risk recipients (RCRR ≤ 1.5) with low-risk DCD livers (DWIT ≤ 30 min and CIT ≤ 10 h, n = 226) achieved graft survival rates (81% and 67% at 1 and 3 years, respectively) not significantly different from those of recipients with DBD allografts (80% and 72% at 1 and 3 years, respectively, log-rank p = 0.23). Liver allografts from DCD donors may be used to increase the cadaveric donor pool, with favorable graft survival rates achieved when low-risk grafts are transplanted in a low-risk setting. Whether transplantation of these organs in low-risk recipients provides a survival benefit compared with remaining on the waiting list is unknown.
Introduction
Worldwide, limitations on the availability of suitable donor organs continue to adversely affect mortality rates among candidates on the waiting list for organ transplantation. Methods for expanding the organ donor pool for liver transplantation include the use of split and living donor allografts, extended criteria or marginal donors and non-heart-beating/donation after cardiac death (DCD) donors, while multiple incentives to improve living and cadaveric donation rates are under study (1). Grafts from DCD donors are procured after cessation of cardiopulmonary function in the donor, which can occur in a controlled setting, after a planned withdrawal of life support, or in an uncontrolled situation with the onset of sudden cardiac arrest. Early reports of single-center experience with liver allografts from DCD donors provided evidence for acceptable outcomes. Although liver, kidney, lung (2) and pancreas (3) grafts have been procured from DCD donors and successfully transplanted, long-term graft and patient survival rates from these procedures have been reviewed only recently. We examined the UNOS database with the objective of determining risk factors affecting graft survival rates after liver transplantation with DCD allografts.
Materials and Methods
From January 1996 to December 2003, 367 hepatic allografts from DCD donors were reported to the United Network for Organ Sharing (UNOS) database; these comprised the study group. The control group comprised 33 111 heart-beating cadaveric/donation after brain death (DBD) donor liver transplants performed during the same period. Partial or split-liver transplants from cadaveric donors were excluded from this study.
Donor warm ischemia time (DWIT) was measured from the cessation of cardiopulmonary support, with loss of hemodynamic and respiratory function, to the in situ initiation of cold preservation solution flush through the aorta. Cold ischemia time (CIT) was defined from the time of cold perfusion of the organ in the donor to the time of warm reperfusion in the recipient.
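Under these definitions, both intervals are simple differences of procurement timestamps. The short Python sketch below (the timestamps and function name are hypothetical, for illustration only) shows the arithmetic:

```python
from datetime import datetime

def ischemia_times(support_ceased, cold_flush_started, warm_reperfusion):
    """Compute DWIT (minutes) and CIT (hours) as defined above:
    DWIT runs from cessation of cardiopulmonary support to the in situ
    aortic cold flush; CIT from cold flush to warm reperfusion."""
    dwit_min = (cold_flush_started - support_ceased).total_seconds() / 60
    cit_h = (warm_reperfusion - cold_flush_started).total_seconds() / 3600
    return dwit_min, cit_h

# Hypothetical case: support withdrawn at 08:00, aortic cold flush at
# 08:20, warm reperfusion in the recipient at 17:20 the same day.
dwit, cit = ischemia_times(
    datetime(2003, 5, 1, 8, 0),
    datetime(2003, 5, 1, 8, 20),
    datetime(2003, 5, 1, 17, 20),
)
# dwit = 20.0 min, cit = 9.0 h -- a "low-risk" graft by the study criteria
```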
The recipient's medical condition at the time of transplant was assigned to one of three groups: nonhospitalized (at home); hospitalized (due to complications of cirrhosis or portal hypertension) or in the intensive care unit (ICU) but not on life support; or on life support. Maintenance of life support included the use of a ventilator, extracorporeal membrane oxygenation, an intra-aortic balloon pump, or intravenous inotropes to maintain the recipient's vital signs.
Primary graft failure was defined as allograft loss from nonextrinsic causes, including vascular etiologies (hepatic artery or portal/hepatic vein thrombosis) and primary nonfunction (failure to achieve, or loss of, synthetic function). Characteristics examined between DBD and DCD donors were age, sex, cause of death, terminal biochemistries (creatinine, bilirubin, AST, ALT, alkaline phosphatase), DWIT and graft CIT. Recipients of grafts from DBD and DCD donors were evaluated on age, sex, hospitalization status at the time of transplant, regraft status, hepatic chemistries, albumin, serum creatinine, requirement for dialysis and prothrombin time (PT).
Graft survival rates were determined using the Kaplan-Meier product-limit method, and the log-rank test was used for comparisons between survival curves. The Wilcoxon rank-sum test was used to compare continuous variables, and the chi-square test was used to compare categorical variables. Variables that significantly influenced graft loss under univariate analysis were included in a multivariate Cox regression analysis. Missing data were imputed with modal values for categorical variables and mean values for continuous variables; less than 5% of values were missing for any covariate. Continuous variables such as age, serum creatinine level and warm and cold ischemia times were categorized, since their effects on the hazard function were nonlinear (data not shown). Because DWIT had three categories (0 min for DBD grafts; 1–30 min and >30 min for DCD grafts), it was represented by two dichotomous variables in the Cox regression analysis; graft CIT was similarly dichotomized as greater than or less than 10 h. Allografts were classified as low risk when both the DWIT was ≤30 min and the CIT was ≤10 h, while high-risk grafts were defined as having a DWIT >30 min and/or a CIT >10 h. A p-value less than 0.05 was considered statistically significant, and all reported p-values were two-tailed.
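As a concrete illustration of the product-limit method described above, the following short Python sketch (an illustrative implementation, not the software used in the study) computes a Kaplan-Meier survival curve from follow-up times and graft-loss indicators:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimate.
    times:  follow-up time for each graft
    events: 1 if graft loss was observed, 0 if censored
    Returns (time, survival probability) pairs at each event time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        losses = at_t = 0
        while i < len(data) and data[i][0] == t:  # group tied times
            losses += data[i][1]
            at_t += 1
            i += 1
        if losses:  # the curve steps down only at observed event times
            survival *= 1 - losses / n_at_risk
            curve.append((t, survival))
        n_at_risk -= at_t  # events and censorings both leave the risk set
    return curve

# Toy data: five grafts followed 1, 2, 2, 3 and 4 years; 0 marks censoring.
curve = kaplan_meier([1, 2, 2, 3, 4], [1, 1, 0, 1, 0])
# survival ≈ 0.80, 0.60, 0.30 at t = 1, 2, 3
```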
In the Cox proportional hazards model, the hazard at post-transplant time t is represented as h(t) = h0(t) exp(a1x1 + … + akxk), where h0(t) is the baseline hazard, ai is the ith regression coefficient and xi is the ith dichotomous covariate (0 or 1), with i = 1, 2, …, k. We defined a recipient cumulative relative risk (RCRR) based on the results of the multivariate Cox regression analysis (Table 2) as RCRR = exp(a1x1 + … + a6x6), where: x1 = 1 if age >60 years and 0 otherwise; x2 = 1 if the recipient was hospitalized or in an ICU but not on life support and 0 otherwise; x3 = 1 if the recipient was on life support and 0 otherwise; x4 = 1 if the recipient had received a previous transplant and 0 otherwise; x5 = 1 if dialysis was received and 0 otherwise; and x6 = 1 if the serum creatinine was >2.0 mg/dL and 0 otherwise. As an example, if a recipient is older than 60 years, is on life support, received a previous transplant and is on dialysis, with a serum creatinine >2.0 mg/dL, his RCRR = exp(0.155 + 0.434 + 0.612 + 0.233 + 0.208) = 5.17, the maximum value of RCRR in our study; the minimum value is 1.0 (baseline, i.e. no risk factors). According to the RCRR score, a recipient was classified as low risk if the RCRR was ≤1.5 or high risk if the RCRR was >1.5.
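The RCRR can be computed mechanically from the Table 2 coefficients. The Python sketch below (the factor names are our own labels for the tabulated values) reproduces the worked example:

```python
from math import exp

# Recipient-side regression coefficients from Table 2.
COEF = {
    "age_over_60": 0.155,
    "hospital_or_icu": 0.177,    # hospitalized or in ICU, not on life support
    "life_support": 0.434,
    "previous_transplant": 0.612,
    "dialysis": 0.233,
    "creatinine_over_2": 0.208,  # serum creatinine > 2.0 mg/dL
}

def rcrr(**factors):
    """RCRR = exp(sum of coefficients for the risk factors present)."""
    return exp(sum(COEF[name] for name, present in factors.items() if present))

def is_low_risk(score):
    """Classification threshold used throughout the study."""
    return score <= 1.5

# Worked example from the text: age > 60, on life support, regraft,
# on dialysis, creatinine > 2.0 mg/dL (life support excludes hospital/ICU).
worst = rcrr(age_over_60=True, life_support=True, previous_transplant=True,
             dialysis=True, creatinine_over_2=True)
# round(worst, 2) -> 5.17, the maximum RCRR in the study
```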
Table 2. Risk factors for graft loss using the Cox regression model
| Variable | Regression coefficient | Relative risk (95% CI) | p-Value |
| --- | --- | --- | --- |
| Age (year) |  |  |  |
| <35 | 0 | 1.0 |  |
| 36–50 | 0.157 | 1.17 (1.11–1.24) | <0.001 |
| 51–60 | 0.324 | 1.38 (1.30–1.48) | <0.001 |
| >60 | 0.482 | 1.62 (1.51–1.73) | <0.001 |
| Cause of death |  |  |  |
| CVA/stroke | 0.137 | 1.15 (1.09–1.20) | <0.001 |
| Others | 0 | 1.0 |  |
| Warm ischemia time (min) |  |  |  |
| 0 (HBD) | 0 | 1.0 |  |
| 1–30 | 0.595 | 1.81 (1.51–2.18) | <0.001 |
| >30 | 0.850 | 2.34 (1.26–4.35) | 0.007 |
| Cold ischemia time (h) |  |  |  |
| 0–10 | 0 | 1.0 |  |
| >10 | 0.167 | 1.18 (1.13–1.24) | <0.001 |
| Recipient risk factors |  |  |  |
| Age <60 | 0 | 1.0 |  |
| Age >60 | 0.155 | 1.17 (1.15–1.32) | <0.001 |
| Life-support | 0.434 | 1.54 (1.44–1.65) | <0.001 |
| Hospital/ICU | 0.177 | 1.19 (1.14–1.25) | <0.001 |
| Not in hospital | 0 | 1.0 |  |
| Previous transplant: no | 0 | 1.0 |  |
| Previous transplant: yes | 0.612 | 1.84 (1.74–1.95) | <0.001 |
| Dialysis @TX: no | 0 | 1.0 |  |
| Dialysis @TX: yes | 0.233 | 1.26 (1.16–1.37) | <0.001 |
| Creatinine @TX > 2.0 mg/dL: no | 0 | 1.0 |  |
| Creatinine @TX > 2.0 mg/dL: yes | 0.208 | 1.23 (1.16–1.31) | <0.001 |
Results
Overall graft survival rates from DCD donors (71% at 1 year and 60% at 3 years) were significantly inferior to those from DBD donors (80% at 1 year and 72% at 3 years, p < 0.001) (Figure 1).
Figure 1. Overall graft survival from DCD and DBD donors. Numbers within parentheses indicate the numbers of patients at risk at each follow-up time (3-, 6-month, 1-, 2-, 3-, 4- and 5-year post-transplant).
As shown in Table 1, recipients of DCD grafts were significantly older than those in the DBD group (50.6 ± 11.7 vs. 47.1 ± 15.3 years, p < 0.001). Total bilirubin was lower in DCD graft recipients (6.5 ± 9.8 vs. 7.6 ± 10.3 mg/dL, p = 0.003), while both recipient groups had comparable AST, ALT, alkaline phosphatase, serum albumin, PT and serum creatinine levels. The fraction of female recipients was slightly lower in the DCD group than in the DBD group. The majority of recipients in both groups were nonhospitalized at the time of transplantation, although the proportion was greater in the DCD group (70% vs. 62%, p = 0.04). No differences were noted between groups in donor age, donor sex, previous transplantation or CIT. Significantly lower fractions of cerebrovascular/stroke deaths and African American donors were noted in the DCD group compared with the DBD group.
Table 1. Characteristics of donors, recipients and grafts
| Characteristic | DCD | DBD | p-Value |
| --- | --- | --- | --- |
| Donors |  |  |  |
| Age (year) | 35.3 ± 16.7 | 36.8 ± 18.8 | 0.14 |
| Female (%) | 37 | 41 | 0.16 |
| CVA/stroke death (%) | 20 | 42 | <0.001 |
| Warm ischemia time (min) | 15.6 ± 10.4 | 0 | <0.001 |
| Recipients |  |  |  |
| Age (year) | 50.6 ± 11.7 | 47.1 ± 15.3 | <0.001 |
| Female (%) | 32 | 37 | 0.05 |
| Previous transplant (%) | 9 | 10 | 0.49 |
| Medical condition @TX (%) |  |  | 0.004 |
| – ICU | 15 | 21 |  |
| – Hospitalized | 15 | 17 |  |
| – Not hospitalized | 70 | 62 |  |
| On life-support @TX (%) | 9 | 10 | 0.71 |
| Total bilirubin (mg/dL) | 6.5 ± 9.8 | 7.6 ± 10.3 | 0.003 |
| SGOT/AST (U/L) | 397 ± 1568 | 285 ± 1038 | 0.77 |
| SGPT/ALT (U/L) | 268 ± 974 | 208 ± 717 |  |
| Serum albumin (g/dL) | 2.9 ± 0.7 | 2.9 ± 0.8 | 0.80 |
| Alkaline phosphatase (U/L) | 248 ± 346 | 228 ± 259 | 0.66 |
| Serum creatinine (mg/dL) | 1.3 ± 1.1 | 1.3 ± 1.1 | 0.14 |
| Prothrombin time (s) | 16.4 ± 4.6 | 17.1 ± 6.3 | 0.63 |
| Prothrombin control (INR) | 1.8 ± 1.3 | 1.9 ± 2.1 | 0.30 |
| Grafts |  |  |  |
| Cold ischemia time (h) | 8.3 ± 3.2 | 8.4 ± 4.1 | 0.78 |
Table 2 shows significant risk factors for graft loss based on the multivariate Cox regression analysis. Among the recipient risk factors, a history of a previous liver transplant (RR = 1.84 for regraft vs. primary, p < 0.001), being on life support (RR = 1.54 vs. not on life support, p < 0.001), being hospitalized or in an ICU (RR = 1.19 vs. others, p < 0.001), having received dialysis (RR = 1.26 vs. no dialysis, p < 0.001), having a serum creatinine >2.0 mg/dL (RR = 1.23 vs. ≤2.0, p < 0.001) and age greater than 60 years (RR = 1.17 vs. age <60 years, p < 0.001) had deleterious effects on graft survival after adjusting for all other factors. The regression coefficients associated with these risk factors were subsequently used to classify a recipient as low risk (RCRR ≤ 1.5) or high risk (RCRR > 1.5). When liver transplants were divided into three groups according to the DWIT (DBD group, 1–30 min, >30 min), a statistically significant stepwise increase in the relative risk of graft loss was noted (RR = 1.81 for 1–30 min vs. DBD and RR = 2.34 for >30 min vs. DBD). A negative effect was also demonstrated for grafts exceeding a 10-h CIT (RR = 1.18 for CIT >10 h vs. ≤10 h, p < 0.001).
A correspondingly detrimental effect on graft survival was observed with increasing RCRR scores among recipients of DBD liver grafts. In the DCD group, a detrimental effect on graft survival was noted when the recipient RCRR score was greater than 1.5 (data not shown). We divided DCD donor livers into low-risk grafts, with DWIT ≤30 min and CIT ≤10 h (n = 273), and high-risk grafts (n = 94). As shown in Figure 2, when low-risk recipients received low-risk DCD liver grafts (Group 1, n = 226), their graft survival rates (81% and 67% at 1 and 3 years, respectively) were comparable to those of the DBD group (80% and 72% at 1 and 3 years, respectively, log-rank p = 0.23). When high-risk DCD grafts were transplanted into low-risk recipients (Group 2, n = 76), graft survival was significantly lower than with DBD grafts (64% and 57% at 1 and 3 years, p = 0.004). High-risk recipients who received either low-risk (Group 3, n = 47) or high-risk (Group 4, n = 18) DCD livers had statistically inferior graft survival rates (46% and 35% at 1 and 3 years for Group 3, and 47% and 35% at 1 and 3 years for Group 4) compared with recipients of DBD livers (p < 0.001 for both).
Figure 2. Graft survival of low- and high-risk recipients receiving low- or high-risk liver allografts. Group 1 indicates low-risk recipients who received low-risk grafts (DWIT ≤ 30 min and CIT ≤ 10 h); Group 2, low-risk recipients who received high-risk grafts (DWIT > 30 min or CIT > 10 h); Group 3, high-risk recipients who received low-risk grafts; and Group 4, high-risk recipients who received high-risk grafts. Numbers within parentheses indicate the numbers of patients at risk at each follow-up time (3-, 6-month, 1-, 2-, 3-, 4- and 5-year post-transplant). Open and closed symbols represent low- and high-risk recipients, respectively. Squares and circles depict low- and high-risk donors, respectively. DBD donors are represented by triangles.
Discussion
Successful use of hepatic allografts from DCD donors was reported by many groups prior to the institution of brain death criteria (4). In contrast to the Netherlands, where a concurrent increase in the annual rate of DCD donation has been observed with a decrease in DBD donation rates since 1995 (5), annual donation rates for both DBD and DCD donors have improved in the United States, with the number of liver allografts obtained from DCD donors increasing progressively from 11 in 1996 to 111 in 2003 (Table 3). Preliminary analysis of DCD liver allografts obtained prior to 1996 (n = 18) revealed graft and patient survival rates that were significantly inferior and uncharacteristic of the subsequent study patients (data not shown), possibly due to initial unfamiliarity with, and reluctance toward, DCD procurement, as well as the learning curve involved in the use of these grafts. The analysis therefore used data from transplants performed after 1995. Early experiences with livers procured from DCD donors revealed high rates of primary nonfunction as well as hepatic artery thrombosis (6,7). Despite increasing clinical experience with these donors, a more recent analysis of graft survival rates for livers obtained from DCD donors between 2000 and 2003 revealed an adjusted hazard ratio of 1.85 for graft loss compared with DBD allografts used during the same period (8), and high primary nonfunction rates (9) as well as biliary tract complications (10) have been associated with the use of DCD liver allografts. Both complications have been attributed to prolonged ischemia times, which were more common in DCD grafts procured early in the overall experience, mirroring observations made under similar conditions with DBD donors (11,12).
Table 3. Annual trend of DBD and DCD donor liver transplants in the United States
Combined thermal and ischemic effects on allograft function are well described (6), and our conclusions on the effects of prolonged ischemia time on graft survival agree with previous results (13). As defined, the RCRR would be >1.5 in any candidate with a previous transplant or requiring life support; the presence of either of these risk factors alone would therefore predict inferior graft survival and eliminate a prospective recipient from further consideration for a DCD graft. A combination of three or more of the remaining risk factors, a serum creatinine >2.0 mg/dL despite hemodialysis (RCRR = 1.55), or being hospitalized or in an ICU while on hemodialysis (RCRR = 1.51) would also disfavor a candidate from receiving a DCD graft on the basis of the RCRR score. Clinical application of the RCRR thus reduces to a simplified algorithm that selects favorable candidates for a DCD graft based on an RCRR score ≤1.5 (Figure 3) and defers candidates with any other configuration of these risk factors. Recipients with fewer risk factors receiving DCD livers with minimal ischemia times achieve graft survival rates not significantly different from those obtained with DBD donors, while transplantation of DCD livers into high-risk recipients and/or with high-risk grafts results in significantly inferior long-term outcomes.
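The simplified selection algorithm described here can be sketched in a few lines; the function names and structure below are our own illustration of the logic, using the Table 2 coefficients and the 1.5 / 30-min / 10-h thresholds from the text:

```python
from math import exp

# Recipient-side regression coefficients from Table 2.
COEF = {
    "age_over_60": 0.155,
    "hospital_or_icu": 0.177,
    "life_support": 0.434,
    "previous_transplant": 0.612,
    "dialysis": 0.233,
    "creatinine_over_2": 0.208,
}

def rcrr(risk_factors):
    """RCRR = exp(sum of coefficients for the risk factors present)."""
    return exp(sum(COEF[f] for f in risk_factors))

def offer_dcd_graft(risk_factors, dwit_min, cit_h):
    """True only when both sides are low risk: recipient RCRR <= 1.5 and
    graft DWIT <= 30 min with CIT <= 10 h (the Group 1 combination that
    achieved DBD-comparable survival)."""
    return rcrr(risk_factors) <= 1.5 and dwit_min <= 30 and cit_h <= 10

# A previous transplant (RR = 1.84) or life support (RR = 1.54) alone
# already pushes the RCRR above 1.5, deferring the candidate regardless
# of graft quality.
```

Consistent with the text, the pairs dialysis plus creatinine >2.0 mg/dL (RCRR = 1.55) and hospital/ICU plus dialysis (RCRR = 1.51) also exceed the threshold, while any single remaining factor does not.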
Our findings parallel recent reports that minimal DWIT or a CIT under 10 h (14,15) abrogates the effects of ischemic damage in grafts obtained from older DCD donors in a controlled setting. Because outcomes for DCD graft survival appear to be influenced primarily by recipient risk factors, however, livers from DCD donors transplanted into high-risk recipients fared poorly independent of allograft quality. These results suggest that DCD grafts are optimally used in recipients with RCRRs ≤1.5, and that a separate and/or directed allocation system by UNOS for placement of DCD grafts into lower risk candidates may be warranted. DCD livers could be offered within the general donor pool and acknowledged as potentially resulting in a higher rate of primary nonfunction or biliary strictures, allowing each program to individually determine the suitability of a particular DCD donor for an intended recipient based on the RCRR score. Candidates would be informed of and counseled on expected results and potential complications from the use of DCD grafts. If informed consent were granted, these candidates would be listed separately, similar to the list for kidney transplant candidates willing to accept extended criteria donor grafts.
Although the model for end-stage liver disease (MELD) score was originally conceived as a predictor of 3-month survival in patients with chronic liver disease, pre-transplant MELD scores have only recently been correlated with post-transplant survival (16,17). If the MELD score adequately reflects the physiological status of the candidate, correlation between the RCRR and currently available MELD scores would predict congruence between high-risk recipients as defined by the RCRR and candidates with MELD scores above 30; this is reflected in the most significant differences in graft survival occurring between recipients with MELD scores at or below 30 and those with scores above 30. It is further suggested by similar 1-year graft survival rates between patients with MELD scores ≤30 and those in the low-risk category. Graft survival curves between study groups beyond this time point are essentially parallel and presumably independent of donor graft influence (data not shown). However, while optimal results were demonstrated with the use of low-risk DCD liver allografts in lower risk recipients, the long-term survival benefit must be weighed against the risk of increased mortality compared with 'low-risk' candidates who remain on the waiting list. Merion et al. analyzed candidates stratified by MELD score and found an increased mortality risk in recipients transplanted with MELD scores <17 (HR = 1.21 for MELD scores 15–17, p = 0.41, with statistical significance achieved for MELD scores below 15) compared with similar low-MELD candidates who remained on the waiting list (18).
Information on the distribution of MELD scores among low-risk candidates is therefore necessary and requires further study; conclusions from current observations would be premature, as too few recipients of DCD allografts have pre-transplant MELD scores recorded since their implementation in 2002 for meaningful analysis.
Over half the livers used from the DCD pool fulfilled the criteria for being low risk, and the impact on the overall number of additional donors per year would be more than modest. Given the relatively constant annual rate of (cadaveric) organ donation, the greatest potential for increased utilization of grafts obtained from DCD would be present from cardiac deaths resulting from an uncontrolled setting, particularly those resulting after a failed resuscitation (Maastricht category 2) (19). In a small trial, under appropriate conditions for physiological maintenance, an 83% graft survival was achieved in livers transplanted from these particular donors (20). With data accumulated between 1997 and 2002 from less than a third of all trauma centers in the United States, the National Trauma Data Bank reports an average of over 5000 annual emergency room fatalities attributable to trauma, of which over 70% are in patients under the age of 65 years (21). The opportunity thus exists for greater physician involvement and incentives for inquiring on the suitability for donation in these patients.
A distinction between controlled and uncontrolled cardiac death donors was de-emphasized in this report so as not to eliminate or diminish contributions of the latter from the overall analysis, as short ischemia times can be achieved in this setting also. The dichotomy between a planned or expected outcome versus a sudden event becomes less functional once analysis of the critical variable (control and minimization of allograft ischemia) is applied to both populations and examined as a phenomenon independent of the terminal process.
In need of further clarification and standardization, however, is the definition of donor 'warm ischemia time,' as center-dependent variations exist in the time of its initiation (subsequent to declaration of death, after withdrawal of life support (22), after a specified period of sustained hypotension or arrhythmia (23), or with loss of vital signs). Caution must therefore be exercised in interpreting any predictive factors based on less than well-defined variables. A definition spanning from the loss of tissue perfusion to the initiation of cold (intra-aortic) perfusion appears appropriate, as even low perfusion pressures exert a sustaining effect on the graft (20). Defined in this manner, a warm ischemia time of 30 min is not uncommon, especially in the setting of an uncontrolled DCD donor.
The availability of a valid predictor of graft function supports and justifies the use of DCD donors in liver transplantation by increasing the donor pool without compromising graft (and presumably patient) survival. Unavoidable DWIT and subsequent cold ischemia-reperfusion injury may result in significantly lower graft survival for DCD liver transplants compared with the DBD group, and further studies into how DCD allograft physiology and anatomy affect graft outcomes are warranted. However, outcomes of DCD liver transplantation can be improved by avoiding high-risk recipients while minimizing the DWIT and graft CIT. Liver allografts from DCD donors may be used to increase the cadaveric donor pool, especially when low-risk grafts are transplanted in a low-risk setting. Further correlation and study of MELD scores in recipients of DCD allografts may determine whether transplantation of these organs provides a survival benefit to low-risk candidates compared with remaining on the waiting list.