Over the last 20 years, pediatric liver transplantation (PLT) has continued to improve with better pretransplant and posttransplant management of liver disease and with longer graft and patient survival.1 Liver transplantation (LT) remains challenging because of the size of the patients, anatomical variants, previous surgeries, and the use of reduced and split cadaveric grafts and living donor grafts.
Graft dysfunction has become less common after LT in children; however, it continues to be associated with decreased graft and patient survival. Previous series reported small numbers of patients with wide variations in outcomes, and they were unable to clearly identify risk factors or early posttransplant clinical indicators that could highlight the risk of graft failure and the potential need for retransplantation.2 Decisions about retransplantation are often delayed in the hope that the liver will recover, and the deteriorating condition of recipients is associated with poor outcomes. Peritransplant predictors of early graft survival or nonsurvival may help us in making decisions about the threshold for early retransplantation. The main aims of this study were to examine a single-center experience with PLT and to identify peritransplant predictors of early graft survival and posttransplant parameters that could be used to predict early graft outcomes.
PATIENTS AND METHODS
This retrospective, single-center cohort study was conducted at the Institute of Liver Studies (King's College Hospital, London, United Kingdom) with approval from the institutional review board. The patient population included patients undergoing PLT at King's College Hospital over the course of 10 years (July 2000 to June 2010). Because this was an intention-to-treat analysis, no exclusion criteria were applied beyond the restriction to isolated LT. The data sources consisted of written records from the medical center and a prospectively maintained, electronic PLT database. Permission for data collection was obtained individually by written informed consent before transplantation.
The PLT procedure was performed with the piggyback technique with caval and portal vein clamping. The preservation solutions were University of Wisconsin solution [for donation after brain death (DBD) and donation after cardiac death (DCD)] and histidine tryptophan ketoglutarate (for living donation). Liver splitting was performed ex situ under cold preservation with the Kelly clamp technique. The triangulation technique for hepatic vein anastomoses was performed routinely for left lateral segment (LLS) grafts.3, 4 Graft reperfusion was performed after the portal vein anastomosis. Initial arterial reperfusion was performed in a small number of cases. Bile duct reconstruction was performed via Roux-en-Y hepaticojejunostomy in all but a small number of cases. Immunosuppression was achieved with a combination of tacrolimus (0.05-0.1 mg/kg twice daily, targeting stable trough levels of 10-15 ng/mL) and prednisone (1 mg/kg once daily). The treatment of acute rejection consisted of high-dose methylprednisolone (10 mg/kg/day) for 3 consecutive days. Mycophenolate mofetil (10 mg/kg/day in divided doses) was used for recurrent rejection, retransplantation, and posttransplant de novo autoimmune hepatitis.5 After transplantation, patients were regularly evaluated at the clinic, and routine laboratory studies were conducted every 1 to 3 months. Prophylactic anticoagulation (enoxaparin at 0.5-0.75 mg/kg twice daily) was used routinely in the initial posttransplant period, from the time at which the international normalized ratio (INR) fell below 1.5 until discharge. Only high-risk cases (patients undergoing complex arterial reconstruction, patients undergoing retransplantation, and children weighing < 5 kg) were kept on aspirin (1 mg/kg/day) for 3 months. Doppler ultrasonography was performed on posttransplant days 1 and 5, and percutaneous liver biopsy was performed when it was clinically indicated.
Variables and Outcomes
Several peritransplant and posttransplant variables were collected from the database and stratified: age; weight; sex; etiology [acute liver failure (ALF) versus chronic liver disease (CLD)]; type of graft (whole versus partial); type of partial graft (right lobe, left lobe, LLS, or monosegmental); type of donor (DBD, DCD, or living related); graft weight; graft-to-recipient weight ratio; cold ischemia time; warm ischemia time; type of graft reperfusion (arterial versus portal); and immediate pretransplant values for serum bilirubin, aspartate aminotransferase (AST), albumin, and INR. Daily posttransplant blood tests, including serum AST, bilirubin, and INR tests, were performed until posttransplant day 7 and then until discharge. Early graft survival was divided into 3 arbitrarily chosen strata: 30, 60, and 90 days.
Peritransplant and posttransplant factors were analyzed at the baseline (the t test and the χ2 test were used for comparisons). Univariate Kaplan-Meier log-rank testing was performed for all collected variables. Multivariate Cox step-forward conditional regression models were fitted to include all the variables that fulfilled at least 1 of the following conditions: a univariate P value < 0.1 or clinical relevance with or without statistical significance. Hazard ratios and their 95% confidence intervals (CIs) were calculated; a hazard ratio whose 95% CI lay entirely above 1 was considered a significant risk factor, and one whose CI lay entirely below 1 was considered a significant protective factor. To determine the predictive power of the posttransplant variables, receiver operating characteristic (ROC) curves were developed. The area under the receiver operating characteristic curve (AUROC) was considered a useful predictor at values greater than 0.7. The sensitivity and the specificity were defined with the cutoff value that showed the highest sensitivity with the lowest “1 − specificity” values. The calculation of the number of patients who would need to be treated to prevent a given adverse outcome in 1 patient [ie, the number needed to treat (NNT)] was performed according to the following formula: NNT = 1/ARR, where ARR (the absolute risk reduction) is the event rate without the intervention minus the event rate with the intervention.
NNT was then recalculated with unadjusted proportions and with proportions adjusted for the survival outcomes of the therapeutic approach (early retransplantation). P < 0.05 was generally used for statistical significance (unless otherwise specified). All analyses were performed with SPSS 11.0 for Windows (SPSS, Inc., Chicago, IL).
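The NNT calculation described above can be sketched as follows. This is a minimal illustration assuming the standard definition NNT = 1/ARR, applied to the early-graft-loss proportions reported later in this article (47.6% versus 1.7%); the function name is ours, not from the study's analysis code.

```python
def number_needed_to_treat(event_rate_control: float, event_rate_treated: float) -> float:
    """NNT = 1 / absolute risk reduction (ARR)."""
    arr = event_rate_control - event_rate_treated
    if arr <= 0:
        raise ValueError("no absolute risk reduction; NNT is undefined")
    return 1.0 / arr

# Early graft loss: 47.6% with a day 7 bilirubin > 200 umol/L vs 1.7% otherwise
nnt = number_needed_to_treat(0.476, 0.017)
print(round(nnt, 2))  # -> 2.18 (the article reports 2.17, presumably from unrounded proportions)
```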
ALF, acute liver failure; ALT, alanine aminotransferase; AST, aspartate aminotransferase; AUROC, area under the receiver operating characteristic curve; CI, confidence interval; CLD, chronic liver disease; DBD, donation after brain death; DCD, donation after cardiac death; INR, international normalized ratio; LLS, left lateral segment; LT, liver transplantation; MELD, Model for End-Stage Liver Disease; NNT, number needed to treat; OR, odds ratio; PLT, pediatric liver transplantation; ROC, receiver operating characteristic.
RESULTS
From 2000 to 2010, PLT was performed 422 times in our unit. Cadaveric donation and living donation accounted for 85.5% and 14.5% of the grafts used for transplantation, respectively. Table 1 summarizes a descriptive analysis of the population at the baseline.
Table 1. Descriptive Analysis of Peritransplant Variables for 422 PLT Procedures Performed During a 10-Year Period (July 2000-June 2010)
The data are expressed as means and standard deviations.
During the immediate posttransplant follow-up, a trend toward normalization of serum AST and INR values was observed regardless of graft outcome. Serum AST values did not differ significantly (Student t test) on any day between the group with surviving grafts and the group with nonsurviving grafts. The INR values were significantly higher for patients suffering 30-day graft loss on days 1 (P = 0.007), 2 (P = 0.01), and 3 (P = 0.03). The serum bilirubin values differed significantly from posttransplant day 1 onward between grafts that subsequently survived and grafts that failed (Fig. 1).
Univariate and Multivariate Analyses
The overall 30-, 60-, and 90-day graft survival rates were 93.6%, 92.6%, and 90.7%, respectively. All the collected variables were included in a Kaplan-Meier log-rank univariate analysis. The Kaplan-Meier 30-, 60- and 90-day univariate graft survival rates were 98.2%, 97.1%, and 95%, respectively, when the day 7 bilirubin level was ≤200 μmol/L (<11.7 mg/dL) and 52.4%, 52.4%, and 52.4% when the day 7 bilirubin level was >200 μmol/L (P = 0.001; Fig. 2A); they were 95.4%, 94.5%, and 93.6%, respectively, when CLD was the cause of the transplant and 85.3%, 84%, and 80%, respectively, when ALF was the cause (P = 0.001; Fig. 2B). The survival rates for all 3 periods were 66.7%, 80%, and >90% for monosegments, left lobes, and the rest (whole grafts, right lobe grafts, and LLSs), respectively (P = 0.004; Fig. 2C). The recipient age showed no significance in the univariate analysis (Fig. 2D). In the multivariate Cox regression model, ALF and a posttransplant day 7 serum bilirubin level > 200 μmol/L achieved statistical significance in the 30- [odds ratio (OR) = 3.08 (95% CI = 1.1-8.1) and OR = 36.8 (95% CI = 13-104)], 60- [OR = 2.82 (95% CI = 1.1-7.21) and OR = 24.43 (95% CI = 9.81-60.84)], and 90-day models [OR = 4.43 (95% CI = 1.84-10.64) and OR = 23.12 (95% CI = 9.53-56.61)]. A recipient age of 2 to 6 years (versus a recipient age of 0-2 or 6-16 years) achieved Cox statistical significance for the 90-day period [OR = 0.325 (95% CI = 0.12-0.93)]. ALF and a day 7 serum bilirubin level > 200 μmol/L were independent risk factors for graft loss, whereas an age of 2 to 6 years (versus the other age categories) was an independent protective factor for graft survival (Table 2).
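The survival rates above are Kaplan-Meier product-limit estimates, which can be reproduced with a small pure-Python sketch; the follow-up data below are hypothetical, not drawn from the study cohort.

```python
def kaplan_meier(times, events):
    """Product-limit survival estimator.
    times: follow-up in days; events: 1 = graft loss, 0 = censored."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival = 1.0
    curve = {}  # event time -> S(t)
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = removed = 0
        while i < len(data) and data[i][0] == t:  # group tied times
            deaths += data[i][1]
            removed += 1
            i += 1
        if deaths:
            survival *= (n_at_risk - deaths) / n_at_risk
            curve[t] = survival
        n_at_risk -= removed  # censored subjects leave the risk set after t
    return curve

# Hypothetical follow-up for 6 grafts (not study data)
curve = kaplan_meier([5, 12, 12, 30, 45, 90], [1, 1, 0, 1, 0, 0])
print({t: round(s, 3) for t, s in curve.items()})  # -> {5: 0.833, 12: 0.667, 30: 0.444}
```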
Table 2. Univariate Kaplan-Meier Log-Rank Testing and Multivariate Cox Regression Analysis for the 3 Variables With Statistical Significance for 30-, 60-, and 90-Day Overall Graft Survival
Day 7 Serum Bilirubin as a Clinical Indicator for Early Graft Outcomes
On the basis of the results of the univariate analysis and the strength of the multivariate analysis, a day 7 serum bilirubin level > 200 μmol/L was independently studied to determine the feasibility of using this parameter to predict early graft outcomes after PLT once other causes have been excluded. The day 7 serum bilirubin level was set at 200 μmol/L because this was the value from the ROC curve with the highest sensitivity and specificity (discussed later). Twenty-one patients (5%) had a day 7 serum bilirubin level > 200 μmol/L. A day 7 serum bilirubin level > 200 μmol/L produced AUROCs of 0.754 (95% CI = 0.52-0.98), 0.661 (95% CI = 0.5-0.82), and 0.635 (95% CI = 0.5-0.77) for 30-, 60-, and 90-day graft survival, respectively. The sensitivity and specificity of these serum bilirubin values for the prediction of 30-, 60-, and 90-day graft survival were 72.7% and 96.6%, 47.6% and 97.1%, and 34.5% and 97.1%, respectively (Fig. 3). The positive and negative predictive values of day 7 bilirubin levels for 30-day graft survival were 95.5% and 78%, respectively.
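The sensitivity, specificity, and predictive values quoted above all derive from a 2 × 2 confusion table. The following is a minimal sketch; the counts are illustrative only and are not reconstructed from the cohort.

```python
def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Sensitivity, specificity, PPV, and NPV from a 2x2 confusion table."""
    return {
        "sensitivity": tp / (tp + fn),   # true-positive rate
        "specificity": tn / (tn + fp),   # true-negative rate
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

# Illustrative table: test positive = day 7 bilirubin > 200 umol/L,
# condition positive = 30-day graft loss (counts are made up for the example)
m = diagnostic_metrics(tp=16, fp=5, fn=6, tn=395)
print({k: round(v, 3) for k, v in m.items()})
```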
Impact of Early Retransplantation
In our series, 35 patients (8.3%) underwent retransplantation with an overall 90-day graft survival rate of 82.9%. The reasons for retransplantation included primary nonfunction (9 cases or 2.1%), chronic rejection (9 cases or 2.1%), portal vein thrombosis (1 case or 0.2%), hepatic artery thrombosis (9 cases or 2.1%), biliary complications (5 cases or 1.2%), nonthrombotic infarction (1 case or 0.2%), and disease recurrence (1 case or 0.2%). Fourteen of the 35 retransplants were performed in the first 30 days with an overall 90-day graft survival rate of 78.6%. The rates of early graft loss in children were 1.7% and 47.6% when day 7 serum bilirubin levels were ≤200 μmol/L and >200 μmol/L, respectively (χ2 test, P = 0.001). This means that the NNT with early retransplantation for patients with initially impaired graft function would be 2.17 (unadjusted) or 2.76 (adjusted for graft survival).
Validation of the Sensitivity and Specificity of Posttransplant Blood Tests for the Prediction of Early Graft Loss
ROC curves were built for peak AST levels, INR values, and bilirubin levels after transplantation with the endpoints of 30-, 60-, and 90-day graft survival. The best AUROCs were obtained with the peak AST level (AUROC = 0.650), the day 2 INR value (AUROC = 0.697), and the day 7 bilirubin level (AUROC = 0.686). Using the product of the 3 best individual AUROC variables (the day 2 INR value, the peak AST level, and the day 7 bilirubin level), we obtained excellent prediction curves for 30- [AUROC = 0.774 (95% CI = 0.64-0.91)], 60- [AUROC = 0.752 (95% CI = 0.64-0.87)], and 90-day graft survival [AUROC = 0.715 (95% CI = 0.61-0.82)]. Excellent levels of sensitivity and specificity were obtained with this product with a cutoff of 95 × 10³ U at 30 (sensitivity = 71%, specificity = 76%), 60 (sensitivity = 67%, specificity = 76%), and 90 days (sensitivity = 59%, specificity = 77%). A score for analyzing early allograft dysfunction was recently reported by Olthoff et al.6 In this model, early allograft dysfunction is defined as the presence of 1 or more of the following postoperative laboratory results reflective of liver injury and function: a bilirubin level >10 mg/dL on day 7, an INR value >1.6 on day 7, and an alanine aminotransferase (ALT) or AST level > 2000 IU/L within the first 7 days. ROC curves were calculated for a day 7 bilirubin level > 200 μmol/L, the new variable from our own AUROCs, and the Olthoff criteria (Fig. 3). Although the more complex model (ie, the new variable) had better AUROCs, the day 7 bilirubin levels led to rates of prediction similar to those obtained with the Olthoff criteria.
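As an illustration, the Olthoff early-allograft-dysfunction criteria and the product score described above can be expressed as simple predicates. The thresholds come from the text; the function names, example values, and the bilirubin unit-conversion constant are our own additions.

```python
UMOL_PER_MG_DL = 17.1  # bilirubin: 1 mg/dL ~= 17.1 umol/L

def olthoff_ead(day7_bilirubin_mg_dl: float, day7_inr: float,
                peak_transaminase_u_l: float) -> bool:
    """Olthoff early allograft dysfunction: any single criterion suffices."""
    return (day7_bilirubin_mg_dl > 10
            or day7_inr > 1.6
            or peak_transaminase_u_l > 2000)  # peak ALT or AST in first 7 days

def product_score(peak_ast_u_l: float, day2_inr: float,
                  day7_bilirubin_umol_l: float) -> float:
    """Product of the three best individual predictors (cutoff 95 x 10^3 U)."""
    return peak_ast_u_l * day2_inr * day7_bilirubin_umol_l

# Example patient: peak AST 1500 U/L, day 2 INR 1.8, day 7 bilirubin 210 umol/L
score = product_score(1500, 1.8, 210)
print(score > 95_000)                                # -> True: flagged as high risk
print(olthoff_ead(210 / UMOL_PER_MG_DL, 1.4, 1500))  # -> True (~12.3 mg/dL > 10)
```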
DISCUSSION
This study represents the largest single-center analysis of factors affecting early graft survival in the PLT population. Our intention was to describe our experience in detail, identify risk factors, and highlight indicators that could help clinicians to identify children at high risk for graft loss and to prevent unnecessary delays in listing for retransplantation.
PLT is technically demanding, and multidisciplinary teamwork is required to manage early critical graft dysfunction and to potentially intervene to rectify any problems. Retransplantation is performed in 10% to 22% of PLT recipients worldwide across all ages,7 and the figure may be as high as 30% in some series (particularly older children8 and young infants9).
Davis et al.10 recently reported a prognostic scoring tool for pediatric recipients potentially needing retransplantation; it identified being on life support at the time of retransplantation, receiving a split liver graft, and having an original diagnosis of neonatal cholestasis, familial cholestasis, a paucity of bile ducts, or congenital abnormalities as factors adversely affecting the prognosis after regrafting. In contrast, an older age at the time of transplantation and acute rejection responding to steroids were protective factors. In their univariate analysis, a trend toward better outcomes with longer intervals between transplants was observed, with a significant protective effect from very late retransplants. This finding was consistent with the report from Ng et al.,11 who identified the highest risk of death after retransplantation in children receiving an allograft from a donor less than 12 months old or a reduced allograft and in recipients with a high INR at the time of retransplantation; there was a marked decrease in patient survival, particularly with early retransplantation (within 1 month of primary LT). The marked survival difference in favor of late retransplants likely reflects the fact that this population is less critically ill.
Little has been reported about early graft loss. Only Sieders et al.12 stated that the Child-Pugh score, the duration of the anhepatic phase, and urgent transplantation were risk factors for early graft loss (defined as loss within 1 month of transplantation). Unfortunately, a multivariate analysis could not be performed to assess the independent contribution of each variable, and because of the long inclusion period (1982-1999), their results may not reflect current practice. However, with respect to long-term outcomes after PLT, well-defined risk factors have been identified, and they include the recurrence of malignancies, sepsis, and posttransplant lymphoproliferative disease, which account for more than 65% of late patient deaths (occurring more than 1 year after LT). Late graft losses are mainly due to chronic rejection, late hepatic artery thrombosis, and biliary strictures.13
As for the peritransplant factors examined in this study, patients with ALF and patients aged 0 to 2 or 6 to 16 years were 2 clearly identified groups at risk of early graft loss according to the multivariate analysis. Although technical variants were significant in the univariate analysis, they lost significance in the multivariate analysis, and this may be related to recipient age. Children between the ages of 0 and 2 years tend to receive LLS or monosegmental grafts, and children between the ages of 6 and 16 years are more likely to receive reduced grafts; both have been associated with reduced survival.9, 10 It could be argued that stratification between ALF and CLD may be too simple; however, for the analysis of very early and early graft survival after LT, this classification is sufficient and simplifies the analysis.
Immediate postsurgical markers that clinicians have used to identify at-risk recipients include the 50-50 criteria (prothrombin level < 50%, serum bilirubin level > 50 μmol/L): after hepatectomy, the relative risk of death is increased up to 66-fold if these criteria are fulfilled on postoperative day 5 (sensitivity = 69.6%, specificity = 98.5%).14 In LT recipients, Briceño et al.15 reported that the daily posttransplant evolution of the Model for End-Stage Liver Disease (MELD) score showed an excellent exponential correlation according to a regression curve: for a hypothetical patient with a posttransplant MELD score of 30, the probability of dying within 1 week of LT was 26.19%, whereas for a patient with a posttransplant MELD score of 40, the probability was 84.38% (positive predictive value = 82.3%, negative predictive value = 33.1%). No reports performing a similar analysis in the pediatric setting have been identified. Recently, in adult recipients, Olthoff et al.6 reported that any of the following criteria was a good predictor of graft survival: a bilirubin level > 10 mg/dL on day 7, an INR value > 1.6 on day 7, and an ALT or AST level > 2000 IU/L within the first 7 days. Although the Olthoff criteria have been previously validated, we compared them with our own score, which was derived in a statistically more rigorous way. We compared the Olthoff criteria with 2 other criteria: a very strict posttransplant AST-INR-bilirubin combination and a day 7 bilirubin level > 200 μmol/L. Although the highest AUROC was obtained from the product of the peak AST level, day 2 INR value, and day 7 bilirubin level, a day 7 bilirubin level > 200 μmol/L led to AUROCs similar to those achieved with the Olthoff criteria, reflecting the fact that it is the simplest score; however, when the AST-INR-bilirubin combination was derived with proper statistical methods, it achieved more accurate AUROCs.
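The 50-50 criteria quoted above amount to a simple conjunction assessed on postoperative day 5; a minimal sketch (the function name is ours):

```python
def fifty_fifty_criteria(prothrombin_percent: float, bilirubin_umol_l: float) -> bool:
    """50-50 criteria on postoperative day 5:
    prothrombin < 50% of normal AND serum bilirubin > 50 umol/L."""
    return prothrombin_percent < 50 and bilirubin_umol_l > 50

print(fifty_fifty_criteria(40, 80))  # -> True (both criteria met)
print(fifty_fifty_criteria(60, 80))  # -> False (prothrombin above 50%)
```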
In our first analysis, serum AST levels and INR values after transplantation were not truly important markers of outcomes. In contrast, a serum bilirubin level > 200 μmol/L on day 7 was associated with a lower survival rate of 52.4% versus 98.2% for patients with lower bilirubin levels. The 30-day graft survival sensitivity and specificity values were 72.7% and 96.6%, respectively. Thus, if the day 7 serum bilirubin level after PLT is ≤200 μmol/L, the probability of graft loss in the first 3 months after transplantation is less than 3.5%. More than 70% of recipients with serum bilirubin levels > 200 μmol/L will lose their grafts in the first month, and interventions should be directed at identifying and correcting the causes of graft dysfunction. Early retransplantation in our series had excellent survival rates; if early retransplantation were performed for children with day 7 serum bilirubin levels > 200 μmol/L, 1 child on average would be saved for every 2.7 cases, and this would improve outcomes by approximately 33%. In our opinion, early retransplantation before deterioration may be associated with better outcome rates than the 50% survival rate after first retransplantation reported in the literature.10
A limitation of our study design was that only 16 variables were collected. In addition, a potential self-fulfilling prophecy bias must be considered. Nevertheless, a clear message emerges from our experience: early retransplantation is a feasible option that can prevent deterioration of the patient; the decision should therefore be considered before deterioration occurs, and our scores (the product of the peak AST level, day 2 INR value, and day 7 bilirubin level) may be helpful because no specific scores for the pediatric population have been reported. The main aim of this study was to identify pretransplant and peritransplant variables that were comprehensively collected and feasible to obtain in routine practice. Our analysis has several important advantages in comparison with previous reports. First, it is a single-center analysis that, unlike national database studies, avoids the inter-center discrepancies usually ignored in those studies. Second, this is the largest series reported to date from a single institution analyzing early graft outcomes. Third, the statistical method was consistent and powerful and allowed for univariate and multivariate analyses as well as robust sensitivity and specificity testing. Fourth, the recruitment period was recent, so confounding factors such as changes in immunosuppression and surgical techniques were largely avoided. Last, because this was an intention-to-treat analysis, we avoided exclusion criteria to make our results easily exportable to other PLT centers. Admittedly, this precludes subgroup analyses; however, it also strengthens our results because post hoc subgroup analysis is sometimes statistically questionable.
In conclusion, ALF and an age of 0 to 2 or 6 to 16 years are risk factors for early graft loss after PLT. Posttransplant INR values and transaminase levels are not useful for predicting early graft outcomes. The product of the peak AST level, day 2 INR value, and day 7 bilirubin level and a posttransplant day 7 bilirubin level > 200 μmol/L are clinically valuable tools with high accuracy for predicting early graft loss, and they should alert clinicians to potential problems that may require intervention. This raises the question of whether a more aggressive attitude to early retransplantation should be adopted to further reduce early mortality after PLT.