Meeting report of the 19th Annual International Congress of the International Liver Transplantation Society (Sydney Convention and Exhibition Centre, Sydney, Australia, June 12-15, 2013)

Authors

  • Gabriel C. Oniscu (corresponding author)

    1. Scottish Liver Transplant Unit, Royal Infirmary of Edinburgh, Edinburgh, United Kingdom
    • Address reprint requests to Gabriel C. Oniscu, M.D., F.R.C.S., Scottish Liver Transplant Unit, Royal Infirmary of Edinburgh, Little France Crescent, Old Dalkeith Road, Edinburgh EH16 4SA, United Kingdom. E-mail: gabriel.oniscu@ed.ac.uk

  • Geraldine Diaz

    1. Department of Anesthesia and Critical Care, University of Chicago Medicine, Chicago, IL
  • Josh Levitsky

    1. Division of Gastroenterology and Comprehensive Transplant Center, Northwestern University Feinberg School of Medicine, Chicago, IL

  • Abstracts mentioned in this article (which are cited in the O-# format) can be found in Liver Transplantation 2013;19(suppl 1):S86-S334.

  • The authors have no disclosures to make.

Abstract

The International Liver Transplantation Society held its annual meeting from June 12 to 15 in Sydney, Australia. More than 800 registrants attended the congress, which opened with a conference celebrating 50 years of liver transplantation (LT). The program included a series of featured symposia, focused topic sessions, and oral and poster presentations. This report is by no means all-inclusive and focuses on specific abstracts on key topics in LT. Like previous reports, it presents data in the context of the published literature and highlights the current direction of LT. Liver Transpl 20:7–14, 2014. © 2013 AASLD.

Abbreviations

AFP, alpha-fetoprotein; ALI, acute lung injury; DSA, donor-specific antibody; EVL, everolimus; HBIG, hepatitis B immunoglobulin; HCC, hepatocellular carcinoma; HCV, hepatitis C virus; ILTS, International Liver Transplantation Society; LAS, liver allocation system; LDLT, living donor liver transplantation; LT, liver transplantation; MELD, Model for End-Stage Liver Disease; PCAT1, prostate cancer–associated transcript 1; TAC, tacrolimus; TEE, transesophageal echocardiography.

The International Liver Transplantation Society (ILTS) held its 19th annual congress in Sydney, Australia. The meeting included several thematic symposia focused on advances in acute liver failure, hepatitis C virus (HCV) in the context of liver transplantation (LT), advances in liver regenerative medicine, the role of LT in alcoholic liver disease, the management of high-risk recipients, ways to improve renal outcomes after LT, and achieving tolerance. A wide range of topics was covered by oral presentations, poster sessions, videos, and interactive discussions. In the context of the published literature, this report focuses on several key topics: viral hepatitis, acute/chronic liver failure, perioperative management, immunosuppression, rejection, tolerance, renal/metabolic outcomes, liver cancer, expansion of the donor pool, living and split donor LT, other extended criteria donors, and allocation.

VIRAL HEPATITIS AND RECURRENT DISEASE

A number of meeting abstracts focused on clinical and biomarker predictors of more rapid HCV recurrence and associated complications. O'Leary et al. (O-46) showed that preformed and de novo donor-specific antibodies (DSAs) increased HCV fibrosis and mortality after LT. This correlated with their recent publication on the impact of DSAs on LT outcomes,[1] but here they made specific reference to the higher-risk HCV population. A novel cluster of differentiation antibody microarray was shown to help differentiate mild and severe HCV recurrence, with pretransplant cluster of differentiation antibody signatures interestingly being the strongest predictors of recurrence (O-115). Tripon et al. (O-48) presented interesting data on factors associated with ascites development early after LT in HCV+ recipients without cirrhosis on early biopsy. A multivariate analysis showed that pretransplant refractory ascites, stage 2 fibrosis 1 year after transplantation, perisinusoidal fibrosis, and cryoglobulinemia were tied to early ascites development, with the latter 2 factors suggesting microcirculatory changes leading to portal hypertension. These data support the concept that pressure gradients can increase after LT in the absence of cirrhosis and are predictive of worse complications.[2, 3] Finally, Song et al. (O-76) identified several variables associated with improved survival after retransplantation for HCV: HCV negativity before transplantation, antiviral therapy after LT, the absence of a split graft, a younger age, and a genotype other than 1. Thus, the careful selection of patients for retransplantation can help to prolong patient and graft survival.[4, 5]

Novel data on triple therapy (pegylated interferon, ribavirin, and a protease inhibitor) for HCV before transplantation and for recurrent HCV were presented at the meeting. During the Rising Star Symposium, Saxena et al. (O-6) showed that female sex was a strong predictor of adverse events (eg, anemia) during triple therapy. Later in the meeting, Saxena et al. (O-43) compared the outcomes of triple therapy for pretransplant patients with cirrhosis by Child class: A versus >A. The sustained virological response rate was better for Child class A patients (55%) than for non-A patients (42%), but an early rapid virological response was the most predictive of a sustained response. However, tolerability was generally worse with more advanced decompensation, a finding similar to the results of other recent studies.[6] In both pretransplant patients with cirrhosis and post-LT patients, Vinaixa et al. (O-44) similarly showed worse tolerability and outcomes in comparison with those for a noncirrhotic, nontransplant population.
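As background (the standard Child-Pugh definition, which the abstracts do not restate), the Child class is derived from a 5-component score:

\[
\text{Child-Pugh score} = \sum_{i=1}^{5} p_i, \qquad p_i \in \{1, 2, 3\},
\]

where points are assigned for total bilirubin, serum albumin, prothrombin time (or international normalized ratio), ascites, and hepatic encephalopathy. A total of 5 to 6 points corresponds to class A, 7 to 9 points to class B, and 10 to 15 points to class C; the ">A" group in O-43, therefore, comprises patients with more decompensated (class B or C) cirrhosis.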

As for hepatitis B, there were 2 studies of successful hepatitis B immunoglobulin (HBIG) withdrawal in conjunction with potent antivirals carrying minimal or no resistance risk (O-112 and O-117). No hepatitis B virus recurrence was seen, a finding similar to the results of other studies of either no HBIG or HBIG withdrawal in the setting of antiviral therapies with high genetic barriers to resistance.[7, 8] Another abstract (O-116) showed tremendous cost savings ($28,000 per year) from the cessation of HBIG in favor of tenofovir plus emtricitabine, supporting this practice and consistent with recent data.[9]

A few notable presentations focused on long-term recurrent disease and histology. Ravikumar et al. (O-110) showed significantly less recurrent primary sclerosing cholangitis (8% versus 25.8%) in patients undergoing pretransplant/intraoperative colectomy versus patients undergoing post-LT colectomy for inflammatory bowel disease. This supports the notion that the presence of colitis may be the major cofactor for the recurrence of primary sclerosing cholangitis and may be tied to innate immune mechanisms.[10] Kim et al. (O-118) reported the results of 200 post-LT protocol biopsies for patients who mainly underwent transplantation for hepatitis B virus cirrhosis and alcohol/fatty liver disease: 25% had steatosis (mostly mild), and only 8% had steatohepatitis. The risk factors identified for steatosis not surprisingly included a history of alcoholic cirrhosis, body mass index, and obesity. However, long-term follow-up biopsies and clinical data are needed to determine whether steatosis and, in particular, steatohepatitis progress to more advanced fibrosis in the LT population.

IMMUNOSUPPRESSION

Much of the ILTS data on immunosuppression in LT recipients focused on everolimus (EVL), a mammalian target of rapamycin inhibitor that was recently approved for use in this population. Schlitt et al. (O-8) showed 36 months of data from the PROTECT study, in which LT recipients were randomized at 4 to 8 weeks to EVL with tacrolimus (TAC) elimination or continuation.[11] The TAC withdrawal group was shown to have a persistent 9- to 10-mL/min improvement in the glomerular filtration rate at 36 months without an increased rate of rejection. This is in contrast to the recently published H2304 study (everolimus with reduced tacrolimus), in which the TAC withdrawal arm was terminated early because of a higher rate of rejection despite improved renal function.[12, 13] De Simone et al. (O-135) then presented 24-month data from the H2304 study and compared maintenance TAC with a combination of EVL and reduced (not withdrawn) TAC. The latter group experienced less rejection, and similarly to Schlitt et al.'s study, that group had better renal function than the standard TAC group at 2 years. Although there was more leukopenia, edema, hyperlipidemia, and proteinuria in the EVL group (O-78), the discontinuation rates were similar. Mild proteinuria was more common in the first 6 months but declined by the end of 24 months and had no obvious clinical significance. Cillo et al. (O-9) showed data from a study in which patients were randomized on day 7 after LT to EVL and reduced TAC (with a plan for TAC weaning on day 30) or standard TAC dosing. Although there was no difference in the primary endpoint (biopsy-proven rejection) at 3 months, those who underwent TAC weaning (40% of the group) had higher rejection rates. Thus, although it is clear that the use of EVL and early TAC minimization can improve renal function without an increased risk of rejection, this does not appear to be the case with TAC withdrawal, which carries an increased risk of rejection. Further clinical and biomarker predictors are needed to identify patients who can successfully be withdrawn from TAC and maintained on calcineurin inhibitor–free EVL regimens.

Two notable studies reported the impact of different immunosuppressive therapies on HCV fibrosis progression. Saliba et al. (O-47) presented data on the HCV population in the H2304 study and demonstrated a trend toward less fibrosis progression at 24 months in recipients randomized to EVL and reduced TAC versus TAC alone. Although these data are preliminary, other data have supported the finding of less fibrosis development with the use of mammalian target of rapamycin inhibitors versus calcineurin inhibitor therapy.[14] Levy (O-7) reported on a large randomized study of de novo cyclosporine versus TAC in LT recipients with HCV. Although there was no difference in the development of stage 2 or higher fibrosis, the cyclosporine patients not given corticosteroids had less recurrence at 2 years. This may simply reflect the impact of corticosteroids on HCV disease rather than the choice of calcineurin inhibitor therapy.[15]

IMMUNE COMPLICATIONS AND TOLERANCE

Biomarkers correlating with rejection or the risk of rejection were the focus of a number of ILTS abstracts. A group from Baylor (O-81) presented further data on the impact and importance of DSAs in LT recipients. Similarly to recent work,[16, 17] this group showed that, with or without induction therapy, preformed class I and II antibodies were associated with the development of liver fibrosis, rejection, and death in the LT population. The Baylor group presented another DSA abstract (O-85) and showed that African Americans receiving a race-mismatched graft had an increased risk of graft loss and death that was potentially due to the presence of preformed DSAs. Sood et al. (O-15) presented a novel immune monitoring assay (QuantiFERON Monitor) for liver recipients that may guide weaning from immunosuppression and the avoidance of rejection. Unlike the commercially available ImmuKnow assay (Cylex), this assay assesses both adaptive (CD3) and innate (toll-like receptor) responses, which are becoming increasingly important in the transplant setting.[18]

One of the highlights of the meeting was a presentation of data from the Enhancing Adherence to Immunosuppression After Liver Transplantation study (O-11). This was a large prospective, longitudinal assessment of self-reported nonadherence in liver recipients and included correlations with subsequent outcomes. Nearly one-third of the patients were considered nonadherent because they deviated from the immunosuppression dosing schedule. Nonadherence was predicted by a lower intention to adhere and greater barriers to adherence, and it correlated with the development of rejection over time in comparison with the adherent group. However, nonadherence did not have an impact on graft survival, and this may reflect the tolerogenicity of the liver, which handles underimmunosuppression and rejection better than other organs.[19]

In reference to LT tolerance, a number of exciting, cutting-edge abstracts were delivered at the meeting. One study demonstrated that the failure to wean 20 adult recipients of living donor liver transplantation (LDLT) from immunosuppression was predicted by high adenosine triphosphate levels in the ImmuKnow assay and donor-specific mixed lymphocyte reactions (O-13). During the Rising Star Symposium, Wozniak et al. (O-1) reviewed data from the University of California Los Angeles experience with 17 mostly male patients who either stopped or were weaned from immunosuppression. Although this group showed higher levels of regulatory T cells and regulatory B cells on immunophenotyping, more than half had abnormal liver biopsies in the long term. Other reports have shown similar long-term graft abnormalities in presumably tolerant patients, and this raises concerns about whether these patients are truly tolerant or are experiencing low-grade immune activation and injury.[20] Finally, Yamashita et al. (O-134) presented a novel regulatory T cell infusion protocol for achieving tolerance in 10 LDLT recipients. Using ex vivo–expanded, donor-specific regulatory T cells reinfused into the recipients postoperatively, they weaned 5 patients from immunosuppression. The infused cells inhibited donor/recipient (not third-party) mixed lymphocyte reactions. Although the report was preliminary (<1 year of follow-up), such protocols involving regulatory T cell infusion or other immune manipulations to achieve early, donor-specific tolerance may be more effective and beneficial than late, simple weaning.[19]

ALLOCATION

Liver allocation was a frequent topic, and authors from various countries examined the efficacy and equity of different liver allocation systems (LASs). Dhanireddy et al. (O-34) analyzed United Network for Organ Sharing data on regional variations within the United States with respect to wait-list times, dropout, transplant rates, and 1- and 3-year survival among patients with hepatocellular carcinoma (HCC). The authors identified significant regional variations in the time to transplantation, wait-list dropout, and transplant rates, with candidates in the most competitive regions being disadvantaged. The overall 1- and 3-year survival rates among HCC recipients were not significantly different. Allocation inequity in the United States has been widely recognized[21, 22]; however, Dhanireddy et al.'s report is unique in demonstrating the impact of a transplant center's selection bias on patients undergoing LT for HCC.

Vagefi et al. (O-38) identified allocation inequity among candidates with multiple listings at US transplant centers. The authors demonstrated that multiple listings provided a significant advantage as determined by the time to transplantation and the Model for End-Stage Liver Disease (MELD) score at transplant. Patients with multiple listings underwent transplantation shortly after listing at their second center, with a lower MELD score and a higher incidence of donation after cardiac death allografts. They were more likely to be male and Caucasian; to have a college education, blood type O, and private insurance; and to have HCC or an HCV infection. The authors noted that this competitive advantage was enjoyed by only a very small cohort of patients (<4%) but that the practice had quadrupled since 2005. This disturbing abstract suggests a clear advantage for candidates who are knowledgeable about US allocation practices and have access to the resources necessary to be listed multiple times. The authors concluded that this advantage can be negated only by the equalization of access. Allocation inequity is currently under review by the Liver and Intestinal Transplant Committee of the United Network for Organ Sharing.[23]
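For reference (the standard United Network for Organ Sharing definition, not restated in the abstracts), the laboratory MELD score used for allocation during this period was computed as

\[
\text{MELD} = 10 \times \left[ 0.957 \ln(\text{creatinine}) + 0.378 \ln(\text{bilirubin}) + 1.120 \ln(\text{INR}) + 0.643 \right],
\]

with creatinine and bilirubin in mg/dL, each laboratory value floored at 1.0, creatinine capped at 4.0 mg/dL (and set to 4.0 for patients on dialysis), and the final score rounded and capped at 40. The exception points and multiple listings discussed in this section act on this score-based ranking rather than on measured physiology, which is how they confer the competitive advantages described here.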

Cejas et al. (O-37) presented Argentinean Transplant Society data that identified a competitive advantage for candidates who received wait-list prioritization through MELD exception points. Argentina was the first country to follow the United States in implementing a MELD-based LAS.[24] In their study of all transplant candidates from 2005 to 2011, patients who received MELD allocation priority exceptions had increased access to transplantation, decreased wait-list mortality, and better posttransplant survival than patients who underwent transplantation with an equivalent physiological MELD score or had been denied prioritization. Vitale et al. (O-5) from Padua University reported similar findings among patients receiving HCC prioritization in the Italian MELD-based LAS. In their study, HCC recipients receiving prioritization demonstrated a significant survival benefit. The advantage of wait-list prioritization, particularly for the diagnosis of HCC, has been well documented and debated by numerous authors.[25-27] In their reports, Cejas et al. and Vitale et al. endorsed further modification of their current LASs to achieve equity.

Jacquelinet et al. (O-41) presented an update on the French LAS. Over the past 5 years, the French have been gradually moving away from geographic allocation to MELD-based allocation as reported at their most recent consensus conference.[28] This process has resulted in a series of LAS revisions focused on decreasing wait-list mortality and dropout through improved access. Jacquelinet et al. provided data from the most recent implementation that demonstrated increased access to transplantation and decreased wait-list mortality with no significant effect on posttransplant survival for patients with high MELD scores.

PREOPERATIVE MANAGEMENT

Candidate preparation during the wait for LT was explored through several presentations evaluating factors that affect transplant outcomes. Bucur et al. (O-35) from Paul Brousse Hospital presented data defining sarcopenia through preoperative computed tomography measurements of the psoas muscle area and the total skeletal muscle area at the level of the third or fourth lumbar vertebra; both were significant, independent predictors of survival for deceased donor allograft recipients. The skeletal muscle mass was associated with sex, body mass index, and the absence of ascites. Sex-specific cutoff values for sarcopenia were developed to identify recipients at high risk for posttransplant morbidity. The psoas muscle area was the easier indicator to evaluate, and it demonstrated greater statistical power in a multivariate analysis. Their findings correlate with a growing body of literature on the importance of nutritional therapy in patients awaiting LT.[29, 30] Completely new data on living donor recipients were reported by Kaido et al. (O-36) of Kyoto University, who used a body composition analyzer to define sarcopenia. In their landmark study, the authors demonstrated that sarcopenia strongly correlated with posttransplant mortality after LDLT. Furthermore, pretransplant nutritional replacement and rehabilitation improved overall survival. The authors' findings relied on the branched-chain amino acid/tyrosine ratio and body cell mass as indicators of sarcopenia.
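As an illustration (a common convention in the sarcopenia literature; the exact normalization used in these abstracts was not specified), cross-sectional muscle area on computed tomography is typically indexed to height to yield a skeletal muscle index:

\[
\text{Skeletal muscle index} \; (\text{cm}^2/\text{m}^2) = \frac{\text{muscle area at L3/L4} \; (\text{cm}^2)}{\left[ \text{height} \; (\text{m}) \right]^2},
\]

with sex-specific thresholds (such as those developed by Bucur et al.) then applied to define sarcopenia.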

Candidates with primary sclerosing cholangitis and other cholestatic liver diseases are widely believed to be disadvantaged by MELD-based allocation. Since the introduction of a MELD-based LAS, appropriate risk stratification for these patients has been widely debated.[31] Helmke et al. (O-39) of the University of Colorado moved this discussion forward through the introduction of a novel predictor of wait-list acuity for patients with primary sclerosing cholangitis. In their study, the authors identified cholate clearance from serum samples as a highly significant and, in comparison with the MELD score, more powerful predictor of wait-list mortality among patients with primary sclerosing cholangitis. Relative thresholds that accurately predicted hepatic decompensation and the need for LT were determined.

The pretransplant screening and management of coronary artery disease continue to evolve. The recognition of end-stage liver disease as a predictor of coronary artery disease (rather than a protector against disease progression) and the description of cirrhotic cardiomyopathy have increased efforts to reliably screen individuals undergoing an evaluation for LT.[32, 33] Using the Standard Transplant Analysis and Research database, Fukazawa et al. (O-71) from the University of Miami performed an analysis of 17,482 candidates to determine the prevalence of coronary artery disease by the etiology of liver disease. Their analysis demonstrated a widely varying prevalence of coronary artery disease by indication, ranging from >25% for those with nonalcoholic steatohepatitis to <7% for patients with cholestatic liver disease. Bezinover et al. (O-72) from Penn State Hershey Medical Center evaluated the efficacy of myocardial perfusion imaging with single-photon emission computed tomography in the pretransplant evaluation of 173 patients. They determined that myocardial perfusion imaging had positive and negative predictive values similar to those of dobutamine stress echocardiography, but it may be more applicable because of the potential for continuing beta-blockade therapy. A positive myocardial perfusion image strongly correlated with the presence of coronary artery disease.

ANESTHESIA AND CRITICAL CARE

The intraoperative monitoring of patients undergoing LT was a focal point of discussion, with new technologies becoming increasingly available. Soong et al. (O-128) from Northwestern University conducted a survey study to determine the utilization of transesophageal echocardiography (TEE) during LT by US transplant programs. The authors' data support a previous review by Della Rocca et al.[34] indicating that the overall utilization of TEE has markedly increased. Approximately 40% of US transplant centers currently employ TEE for all transplant procedures, 55% reserve the modality for specific indications, and only 5% of responding centers indicated no use of TEE. Center transplant volume was not a predictor of TEE utilization, but an anesthetic team that incorporated cardiac anesthesiologists was a highly significant predictor. The most frequently reported reason for not using TEE was the opinion that it was unnecessary, followed by insufficient training.

The incidence of morbidity due to invasive hemodynamic monitoring among more than 1200 LT cases was evaluated by Lu et al. (O-129) from the University of Pittsburgh. Invasive hemodynamic monitors were classified as radial artery catheters, femoral artery catheters, 9-Fr internal jugular venous catheters, 18-Fr venovenous bypass catheters in the internal jugular vein, TEE probes, and 17-Fr venovenous catheters in the femoral vein. The overall incidence of complications was <2%, with most classified as grade II or III according to the Clavien-Dindo system. Morbidity was most frequently associated with the insertion of femoral catheters (either venous or arterial). The authors proposed the brachial artery as a potential alternative to femoral artery catheters, but results are unavailable.

Zhao et al. (O-130) investigated the incidence, risk factors, and clinical outcomes of acute lung injury (ALI) after orthotopic LT. Postoperative ALI was observed in 4.3% of more than 1300 transplant procedures. A multivariate analysis identified encephalopathy, a requirement for pretransplant mechanical ventilation, total bilirubin, and intraoperative transfusion requirements as significant predictors of ALI. Posttransplant ALI was associated with prolonged mechanical ventilation, an increased hospital stay, and graft failure. These data demonstrated a decreasing incidence of ALI and improved survival in comparison with earlier reports.[35, 36]

Kaliamoorthy et al. (O-133) from Chennai, India, demonstrated improved outcomes in a developing program after the institution of a structured anesthesia and critical care team. The authors compared the outcomes of deceased donor and living donor recipients and demonstrated that a uniform clinical algorithm applied by a small, dedicated team improved outcomes despite a more complex and critically ill patient population. This practice also significantly decreased resource utilization. This study supports North American data demonstrating improved outcomes and reduced intraoperative resource utilization with dedicated anesthesia teams in established programs.[37, 38]

LIVER CANCER

The primary focus of the HCC sessions this year was the identification of molecular markers for diagnosis and the prediction of recurrence.

Ding et al. (O-80) presented the results of a study investigating the role of a long noncoding RNA [prostate cancer–associated transcript 1 (PCAT1)] in stratifying HCC risk. In this study of 184 HCC explant livers, a high level of PCAT1 expression appeared to be an independent predictive factor for HCC recurrence and survival after transplantation. Furthermore, even in patients within the Milan criteria, a high level of PCAT1 expression was associated with shorter recurrence-free survival. These results were confirmed in a multivariate analysis, which accounted for age; tumor size, number, and grade; and preoperative alpha-fetoprotein (AFP) levels. This novel biomarker may allow better stratification of transplant candidacy and could serve as a potential therapeutic target. Long noncoding RNAs have recently been recognized as novel therapeutic targets in human disease progression; in particular, their deregulation appears to be associated with an increased risk of cancer progression.[39-41] Guo et al. (O-26) reported that the expression of ZIP4 was increased in HCC tissue and that overexpression was an independent prognostic factor for poor survival after LT. Inhibition of ZIP4 expression resulted in decreased invasiveness, offering a potential therapeutic target for these patients.

In keeping with published data supporting the role of AFP,[42, 43] several abstracts discussed its role in selecting appropriate candidates for transplantation. Lai et al. (O-103) suggested that an increase in AFP corroborated by radiological progression would improve the selection of transplant candidates by excluding those with a significantly higher risk of recurrence. On a similar note, Yoo et al. (O-104) suggested that patients with a high pretransplant AFP level in whom the AFP level fails to normalize after transplantation are at a higher risk for tumor recurrence.

The role of LT for other malignancies such as intrahepatic cholangiocarcinoma/mixed hepatocellular-cholangiocarcinoma and metastatic neuroendocrine tumors remains controversial.[44-47] A multicenter case-control study from Spain (O-108) suggested that patients with incidental mixed hepatocellular-cholangiocarcinoma achieved transplant outcomes comparable to those of HCC patients, so a preoperative diagnosis of hepatocellular-cholangiocarcinoma should not exclude such patients from transplantation. A multicenter database report from the United States (O-109) suggested that in the modern era of transplantation, the overall survival of patients undergoing LT for neuroendocrine tumors approaches that for other indications, with a 1-year survival rate of 88%. Therefore, the authors suggested that these patients should be considered for transplantation under the MELD exception criteria.

EXPANDING THE DONOR POOL

Because of the increased use of donation after cardiac death and extended criteria donors and the advent of new perfusion technologies,[48, 49] the expansion of the donor pool was the topic of many clinical and experimental presentations.

In an experimental model, the Cleveland group (O-74) investigated the role of normothermic machine perfusion and explored the use of various perfusate solutions. Normothermic machine perfusion with blood provided better outcomes than the other perfusates and was associated with a lower incidence of biliary damage. This suggests that normothermic oxygenation may help to prevent biliary strictures in donation after cardiac death LT.

Continuing the theme, op den Dries et al. (O-91) presented the results of a study investigating the role of ex vivo normothermic machine perfusion in 4 discarded human livers. In this study, the livers were perfused for 6 hours with pulsatile arterial flow and continuous portal flow using red blood cells, fresh frozen plasma, nutrients, and trace elements. A histological examination revealed a preserved liver morphology with no evidence of ischemic damage. The authors concluded that normothermic machine perfusion enables graft assessment before transplantation and may allow organ reconditioning before transplantation.

Dutkowski et al. (O-75) presented the results of a pilot study using hypothermic oxygenated machine perfusion for donation after cardiac death human livers. In a paired comparison, the study showed no difference between recipients of hypothermic oxygenated machine–perfused livers and a matched cohort of donation after brain death recipients in terms of immediate graft function and renal function. The results are encouraging and suggest that ex situ oxygenated machine perfusion is safe. However, long-term follow-up data are required to fully assess the benefits of the technology.

Donor risk stratification remains a topical issue. In a study from Eurotransplant, Silberhumer et al. (O-84) compared a European risk nomogram with the US donor risk index. The study identified a weak correlation between the 2 scores, which may reflect differences in the donor pools of Europe and the United States. In the European model, only the cold ischemia time and the donor death modality significantly affected 12-month survival. The authors reported that the agreement between different prediction models was weak, and they suggested that any prediction model has a limited clinical impact for donor evaluation.

LIVING AND SPLIT DONOR LIVER TRANSPLANTATION

Living donors continued to be an important topic at the meeting. Adam et al. (O-16) presented a report from the European Liver Transplant Registry comparing 3090 adult living related LDLT cases to 71,307 donation after brain death full liver grafts. LDLT represented 4.3% of all LT activity in Europe and had outcomes comparable to those of donation after brain death LT at 1, 3, and 5 years. Right lobe grafts had better survival than left lobe grafts, and this was in keeping with previous reports.[50] The donor mortality rate was 0.2%, and the early donor morbidity rate was 13%. The authors identified 10 independent risk factors for graft survival, including age, ABO compatibility, disease stage, a cold ischemia time > 120 minutes, a donor age > 50 years, number of LT procedures per center, and HCC as an indication.

Because of the lack of deceased donor transplantation in Asian countries, retransplantation with LDLT has been considered, although it remains controversial. Dai et al. (O-17) reported the results of retransplantation with right lobe grafts in 10 patients and compared the outcomes to those for a contemporary cohort of retransplant patients with deceased donors. The indications were primarily related to hepatic artery thrombosis and biliary complications, and both groups had significant rates of complications and reinterventions. Re-LDLT appears to be associated with satisfactory long-term outcomes (a survival rate of 88.9% at 1, 3, and 5 years) but remains a high-risk operation with significant morbidity and the need for further interventions.

In keeping with data presented at this meeting for deceased donor transplantation, Joo et al. (O-19) reported on the impact of DSAs on the outcome of LDLT for 32 DSA-positive patients. Although the presence of DSAs did not affect the incidence of vascular or biliary complications, patients with multiple DSAs and a panel reactive antibody score > 30% had poorer outcomes than patients with a single DSA and a lower panel reactive antibody score.

Moon et al. (O-20) presented a comprehensive update of the experience of the Asan Medical Center, where more than 300 LDLT procedures per year have been performed over the last few years with a 5-year survival rate in excess of 88%.[51] The donor mortality rate has remained zero, and innovations such as laparoscopic hepatectomy for left lateral segments, dual grafts, and ABO-incompatible transplants are now common practice.

Split LT continues to be a significant component of clinical practice,[52] and this is reflected by the number of presentations at the meeting. Adam et al. (O-142) presented the outcomes of a 23-year study from the European Liver Transplant Registry. During this period, 4910 first split LT procedures were performed in Europe. Despite higher rates of graft loss and technical complications in comparison with full-size LT, the outcomes have improved significantly and are now almost equivalent to those of full-size LT.[53] Given these improvements, the authors proposed that centers should be incentivized to expand the use of split LT in pediatric and adult recipients.

These results were confirmed by a large single-center report from Birmingham, United Kingdom (O-61), where 10-year survival rates of 82% for pediatric recipients and 62% for adult recipients were achieved. Given these results, Maggi et al. (O-86) suggested that extended right grafts from split LT should no longer be considered marginal grafts and should be used routinely in all patients.

CONCLUSION

The 2013 ILTS meeting in Sydney, Australia was a productive conference with an emphasis on new viral hepatitis treatments, immunosuppression adherence, recipient selection, extended criteria donation, and ways of increasing the donor pool. The meeting also identified new directions in organ preservation and in the optimization of donor graft quality.
