In comparison with other surgical procedures, liver transplantation has a significant mortality rate in the early posttransplant period.1 Early posttransplant mortality is likely related to both the patient's pretransplant condition and the quality of the donor organ.2 Although the severity of preoperative liver disease, as defined by recognized clinical scoring systems [eg, the Model for End-Stage Liver Disease (MELD) and the UK Model for End-Stage Liver Disease (UKELD)], is a predictor of mortality without transplantation,2, 3 these systems do not consistently predict posttransplant survival.4 This has led to the development of multivariate prognostic models for predicting posttransplant mortality.5-9 However, a recent systematic review has demonstrated that none of these prognostic models discriminate well between survivors and nonsurvivors in the early posttransplant period.10 A more recent model, which was developed from European Liver Transplant Registry data, is more predictive but includes operative variables; this limits its usefulness for preoperative risk assessment.11
Cardiopulmonary exercise testing (CPET) is a safe, noninvasive, and dynamic measure of an individual's cardiorespiratory reserve.12 CPET has been used in a variety of surgical settings to predict the risks of perioperative morbidity and mortality.13-15 A previous study revealed that a low cardiorespiratory reserve, which was determined objectively through the calculation of the anaerobic threshold (AT), was associated with early mortality after liver transplantation.16 That study used population-based predicted values for cardiorespiratory fitness [eg, the percentage of the predicted AT value and the peak oxygen uptake (VO2)], but these values have not been validated in any other study investigating perioperative risk. Furthermore, the putative interaction between preoperative CPET and donor variables with respect to outcome predictions was not included in the analysis.
Because of the clinical importance of preoperative assessments in liver transplantation, we assessed the feasibility of determining the cardiopulmonary reserve by noninvasive CPET in an unselected group of patients being considered for liver transplantation. In the patients who underwent transplantation, the efficacy of CPET in the prediction of early posttransplant survival was compared with the efficacy of other preoperative measures, including the resting cardiorespiratory status (echocardiogram), liver disease severity (MELD/UKELD), and donor variables (donor risk index,17 donor age, and donor sodium levels).
AT, anaerobic threshold; AUROC, area under the receiver operating characteristic curve; CI, confidence interval; CPET, cardiopulmonary exercise testing; FFP, fresh frozen plasma; MELD, Model for End-Stage Liver Disease; ROC, receiver operating characteristic; UKELD, UK Model for End-Stage Liver Disease; VE, minute ventilation; VCO2, carbon dioxide production; VO2, oxygen uptake.
PATIENTS AND METHODS
Patient Population and Study Procedures
One hundred ninety-nine nonemergency patients were assessed for liver transplantation over a 3-year period at a single center (Freeman Hospital, Newcastle upon Tyne, United Kingdom). The study received a priori approval from our institutional research and development committee. One hundred eighty-two of these patients underwent CPET. Capacity issues prevented the remaining 17 patients from undergoing a CPET assessment. After written informed consent was obtained, the patients underwent CPET to be screened for asymptomatic heart disease and to provide objective data on their cardiopulmonary reserve. A positive test for significant cardiac ischemia led to a cardiological referral for further investigation and management. The transplant assessment process was blinded to information about the cardiopulmonary reserve, and the results did not influence the listing decision or the subsequent allocation of organs.
The patients underwent a maximal progressive exercise test on an electronically braked ergometer (Lode, Groningen, the Netherlands). During the test, expired gases were collected and analyzed off-line for minute ventilation (VE), VO2, and carbon dioxide production (VCO2; Scott Medical, Plumsteadville, PA), and cardiac function was monitored with 12-lead electrocardiography (Welch Allyn, New York, NY). Flow and gas calibrations were performed manually before each test. The increment in the work rate was predetermined with equations for an estimate of the expected work capacity, and the goal was a test duration of approximately 6 to 10 minutes.18 The test was terminated either because of symptoms (fatigue, pain, or light-headedness) or because of a failure to maintain the appropriate pedaling speed on the ergometer (revolutions per minute) for more than 30 seconds despite encouragement. The cardiopulmonary reserve was calculated with the V-slope AT method,19 and the peak oxygen consumption was taken as the highest VO2 value during the final 30 seconds of exercise.
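The V-slope method identifies the AT as the point at which the slope of VCO2 plotted against VO2 rises above unity. Purely as an illustrative sketch (this is not the study's software, and the function names are hypothetical), the breakpoint can be estimated by fitting two regression segments to the VCO2-versus-VO2 data and choosing the split that minimizes the total squared error:

```python
def fit_line(x, y):
    """Ordinary least squares fit; returns (intercept, slope, sum of squared errors)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    sse = sum((yi - (intercept + slope * xi)) ** 2 for xi, yi in zip(x, y))
    return intercept, slope, sse

def v_slope_at(vo2, vco2):
    """Estimate the anaerobic threshold (the VO2 at the breakpoint) by a
    two-segment piecewise-linear fit of VCO2 against VO2."""
    best_err, best_at = None, None
    for k in range(2, len(vo2) - 1):  # keep at least 2 points per segment
        _, _, e1 = fit_line(vo2[:k], vco2[:k])
        _, _, e2 = fit_line(vo2[k:], vco2[k:])
        if best_err is None or e1 + e2 < best_err:
            best_err, best_at = e1 + e2, vo2[k]
    return best_at
```

In practice the breakpoint is confirmed visually by two independent assessors; the automated split is only a starting point.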
Dobutamine Stress Echocardiography
The patients also underwent dobutamine stress echocardiography when it was indicated by predefined clinical guidelines. These guidelines included any patient with a previous history of ischemic heart disease, abnormal resting echocardiography findings, or an abnormal electrocardiogram. Dobutamine stress echocardiography was also performed for patients who were 55 years old or older and had 1 risk factor and for patients younger than 55 years who had 2 risk factors (the risk factors included hypertension, peripheral vascular disease, diabetes, renal insufficiency, previous cerebrovascular disease, and a body mass index > 35 kg/m2).
Clinical Data Collection
Data, including the patient demographics and the underlying disease types, were prospectively collected before CPET. The MELD/UKELD scores were calculated at the time of CPET and immediately before transplantation. Donor variables, including the donor risk index,17 the donor's age, the predonation sodium levels, and the length of time on the waiting list, were also collected at the time of transplantation. Perioperative variables, including blood loss and operative times, were collected as well.
For an assessment of mortality, all patients who underwent liver transplantation were prospectively followed up to postoperative day 90. The medical personnel performing this follow-up had no a priori knowledge of the CPET results. The CPET variables were compared with other preoperative measures, including the resting cardiorespiratory status (echocardiogram), liver disease severity at assessment and at transplantation (MELD/UKELD), and donor variables, for the prediction of early transplant survival. The secondary outcome measures were the length of the critical care stay and the length of the hospital stay. MELD exception points were not used in this study.
Statistical Analysis

Mann-Whitney tests (continuous, nonnormal distribution), t tests (continuous, normal distribution), and chi-square tests (categorical) were used to compare the demographic and CPET variables of survivors and nonsurvivors. A simple regression analysis was used to determine the association between the cardiopulmonary reserve (as defined by AT) and the severity of liver disease (MELD/UKELD).
Logistic regression was performed to determine the univariate predictive value of both cardiorespiratory and perioperative variables; the primary binary dependent variable was 90-day mortality. A significance value of P < 0.1 was accepted for the addition of variables to a multivariate regression model. The accuracy of the subsequent model for outcome predictions was determined with a receiver operating characteristic (ROC) curve analysis. The optimum level in terms of sensitivity and specificity was determined by the choice of the value closest to the upper left corner of the ROC curve.
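The closest-to-the-upper-left-corner criterion can be computed directly: for each candidate threshold, the sensitivity and specificity are calculated, and the threshold minimizing the distance from the ROC point (1 − specificity, sensitivity) to the corner (0, 1) is chosen. A minimal sketch of this selection rule (hypothetical function names; it assumes higher values of the predictor indicate survival and both outcomes are present in the data):

```python
def optimal_cutoff(values, survived):
    """Pick the threshold whose ROC point lies closest to the upper left
    corner (sensitivity = 1, 1 - specificity = 0).  `survived` holds
    booleans; a value >= threshold classifies a patient as low risk."""
    best_dist, best_t = None, None
    for t in sorted(set(values)):
        tp = sum(1 for v, s in zip(values, survived) if s and v >= t)
        fn = sum(1 for v, s in zip(values, survived) if s and v < t)
        tn = sum(1 for v, s in zip(values, survived) if not s and v < t)
        fp = sum(1 for v, s in zip(values, survived) if not s and v >= t)
        sens = tp / (tp + fn)  # assumes at least one survivor
        spec = tn / (tn + fp)  # assumes at least one nonsurvivor
        dist = ((1 - sens) ** 2 + (1 - spec) ** 2) ** 0.5
        if best_dist is None or dist < best_dist:
            best_dist, best_t = dist, t
    return best_t
```

For a perfectly separating predictor, this returns the lowest value observed among the positive class.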
Cox regression was used to assess the value of preoperative CPET and perioperative variables as predictors of the intensive care unit length of stay and the hospital length of stay. Kaplan-Meier statistics were used to determine the mean values for the lengths of stay of 2 groups of patients who were classified according to an acceptable level of cardiovascular fitness (11 mL/minute/kg), which was based on the Weber classification20 and other perioperative studies.13, 14 The results are expressed as means and standard deviations unless otherwise stated.
RESULTS

One hundred eighty-two unselected liver transplant candidates underwent a preoperative assessment for liver transplantation. No patient refused an attempt at CPET, and there were no adverse incidents or ischemic changes that required a referral to a cardiologist. One hundred sixty-five patients (91%) exercised sufficiently to reach a well-defined AT (Fig. 1). The mean AT in this group of patients was 11.3 mL/minute/kg (range = 5.6-20.2 mL/minute/kg). Only 1 of the 6 patients with severe liver disease (MELD score > 30 at assessment) was unable to complete a submaximal test. A regression analysis of all the assessed patients demonstrated only a weak association between the severity of liver disease (MELD scores) at the time of CPET and the cardiorespiratory reserve as defined by the AT (correlation coefficient r = −0.22; Fig. 2). For 11 of the 34 patients (32%) who required dobutamine stress echocardiography according to the standardized cardiovascular preassessment protocol, the tests were reported to be equivocal. Only 1 patient had a positive test. The corresponding exercise electrocardiogram that was performed during CPET failed to demonstrate any ischemic changes, and the subsequent coronary angiogram was normal.
Sixty-four of the 182 patients (35%) who were assessed and underwent CPET received a liver transplant. Sixty of these patients (94%) exercised sufficiently for AT calculations (4 patients had an indeterminate test, which was defined as a failure to reach AT). The overall 90-day mortality rate after transplantation was 10.0% (6/60). All 6 patients died in the intensive care unit, and the cause of all deaths according to case notes and death certificates was multiorgan failure. The nonsurvivors received a primary liver transplant for various diagnoses (3 for primary biliary cirrhosis, 1 for hepatitis B, 1 for primary sclerosing cholangitis, and 1 for autoimmune liver disease). The underlying diagnoses for the survivors were alcoholic liver disease (21), hepatocellular carcinoma (10), primary sclerosing cholangitis (7), primary biliary cirrhosis (5), viruses (5), nonalcoholic steatohepatitis (2), and other (4).
All transplant patients had normal left ventricular systolic function (ejection fraction > 45%) according to European cardiology guidelines.21 The age, body mass index, MELD and UKELD scores (at assessment and transplantation), time on the waiting list, and time between CPET and transplantation were not significantly different between the survivors and the nonsurvivors. Similarly, the donor characteristics were not significantly different between the 2 groups. The intraoperative times (the total operative, venovenous bypass, and warm ischemia times) were not different between the 2 groups. Patients who died had significantly more perioperative transfusions of fresh frozen plasma (FFP) and red blood cells than those who survived, and the expected significant correlation between the 2 variables was found (Table 1).
Table 1. Demographic Data for the Transplant Recipients: Survivors Versus Nonsurvivors

                                               Survivors (n = 54)    Nonsurvivors (n = 6)
Age (years)                                    53.8 ± 10.4           49.2 ± 12.5
Body mass index (kg/m2)                        26.2 ± 5.5            26.7 ± 6.9
MELD score at assessment                       15.7 ± 6.0            17.8 ± 10.1
UKELD score at assessment                      54.3 ± 5.6            53.5 ± 5.4
MELD score at transplantation                  16.9 ± 8.3            18.2 ± 9.4
MELD score at transplantation
  (excluding hepatocellular carcinoma patients)
Early retransplantation for hepatic artery
  thrombosis (n)

NOTE: The data are presented as means and standard deviations unless otherwise noted. The ideal body weight was used for ascitic patients.
The mean AT level for the survivors was 12.0 ± 2.4 mL/minute/kg, whereas the mean level for the nonsurvivors was 8.4 ± 1.3 mL/minute/kg (P < 0.001). Other variables that were derived from CPET and were related to the hemodynamic response to exercise (peak VO2, VE/VCO2, and VO2/heart rate) were impaired in the nonsurvivors, although the differences were not statistically significant (Table 1).
A univariate logistic regression analysis established donor age and blood and FFP transfusions during transplantation as significant perioperative predictor variables (Table 2). The AT value derived from CPET was also a significant univariate predictor of 90-day outcomes. These variables were introduced into the multivariate logistic regression model. In the multivariate analysis, only AT was retained as a significant predictor of outcome. An ROC analysis of this model demonstrated 90.7% sensitivity and 83.3% specificity with a high degree of accuracy [area under the receiver operating characteristic curve (AUROC) = 0.92, 95% confidence interval (CI) = 0.82-0.97, P = 0.001; Fig. 3]. The ROC analysis demonstrated that the optimal AT level for survival was >9.0 mL/minute/kg.
Table 2. Univariate and Multivariate Analyses for Predicting Survival

                              Odds Ratio (95% CI)
Blood transfusions (units)
FFP transfusions (units)
Donor age (years)

NOTE: There was a significant interaction between the amounts of blood and FFP that were administered.
Five patients had an AT level < 9.0 mL/minute/kg and survived. All these patients had refractory ascites as the primary reason for transplantation. Because the additional body weight associated with ascites acts as a confounding factor for the predictive value of AT (mL/minute/kg), we performed a secondary logistic regression analysis: AT was normalized to the ideal body weight (estimated with the Devine formula22) for patients with refractory ascites, but the actual weight calculations were kept for those patients without ascites. Donor age was also included in this model. With this approach, the predictive value and accuracy of the model were increased (AUROC = 0.98, 95% CI = 0.91-0.99, sensitivity = 98.2%, specificity = 83.3%, P < 0.001). The individual distribution of AT values (normalized to the ideal body weight) versus outcomes is presented in Fig. 4.
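The normalization described above can be made concrete: the Devine formula estimates the ideal body weight as 50 kg (men) or 45.5 kg (women) plus 2.3 kg for each inch of height over 5 feet, and the AT (in mL/minute) is then divided by the ideal rather than the actual body weight for patients with refractory ascites. A sketch under these assumptions (hypothetical function names; heights in centimeters):

```python
def devine_ibw_kg(height_cm, male):
    """Devine formula: ideal body weight (kg) from height."""
    inches_over_5ft = height_cm / 2.54 - 60
    return (50.0 if male else 45.5) + 2.3 * inches_over_5ft

def normalized_at(at_ml_min, actual_weight_kg, height_cm, male, refractory_ascites):
    """AT in mL/minute/kg, substituting the ideal body weight only when
    refractory ascites would inflate the measured weight."""
    weight = devine_ibw_kg(height_cm, male) if refractory_ascites else actual_weight_kg
    return at_ml_min / weight
```

For example, a 177.8 cm (70 in) man has a Devine ideal body weight of 73 kg, so ascitic fluid adding 20 kg to his measured weight no longer depresses the reported AT.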
Cox regression demonstrated that AT and recipient age were significant predictors of the critical care length of stay (Table 3). With an AT value of 11 mL/minute/kg used as a submaximal marker of acceptable cardiorespiratory fitness in survivors, the mean length of stay in critical care was 2.8 ± 2.9 days for patients with acceptable fitness and 8.1 ± 10.1 days for patients with unacceptable fitness (log-rank test P < 0.001, hazard ratio = 0.35; Fig. 5). In contrast, there were no significant predictors of the hospital length of stay.
Table 3. Association Between AT and Length of Stay

                                       AT < 11 mL/minute/kg (n = 20)    AT ≥ 11 mL/minute/kg (n = 40)
Critical care length of stay (days)    8.1 ± 10.1                       2.8 ± 2.9
Hospital length of stay (days)         27.1 ± 12.7                      25.4 ± 12.9

NOTE: The data are presented as means and standard deviations.
Three patients required early retransplantation for hepatic artery thrombosis within the immediate postoperative recovery period. Two of these 3 patients survived, and both had an AT in excess of 9.0 mL/minute/kg (12.9 and 14.4 mL/minute/kg). The patient who died had an AT of 6.3 mL/minute/kg. Four patients did not exercise sufficiently to reach their AT. All these patients survived, and 3 of the 4 patients reached a peak VO2 > 9.0 mL/minute/kg. Therefore, their AT values by definition would have been greater than 9.0 mL/minute/kg, and this would have placed them in a low-risk category.
DISCUSSION

The present study has demonstrated that CPET holds clinically meaningful information for understanding and potentially reducing the perioperative risk of liver transplantation. This is the first study to demonstrate the high positive predictive value of the cardiopulmonary reserve (AT) for early posttransplant survival. These results also demonstrate that reduced preoperative cardiopulmonary function in survivors is associated with an increased requirement for postoperative critical care. Furthermore, this study shows that CPET is a safe and well-tolerated method for assessing the cardiorespiratory reserve in patients who are being assessed for liver transplantation. In combination, these results suggest that the measurement of the cardiopulmonary reserve before liver transplantation may allow the identification of a population with an improved chance of early posttransplant survival, and it may also minimize the duration of the critical care required after transplantation.
A preoperative assessment of the cardiorespiratory function of patients before liver transplantation provides important and potentially modifiable information about postoperative outcomes. Preoperative recipient factors are likely to be relevant to outcomes after transplantation, with comorbidities being important for predicting which patients will have an increased risk of late posttransplant mortality.23 The measurement of the cardiopulmonary reserve as an overall measure of cardiorespiratory fitness has been demonstrated to be important in determining the outcomes of patients undergoing other forms of major surgery both in the short term and in the medium term. A recent cohort study used logistic regression and ROC analysis to better understand the functionality of the cardiopulmonary reserve.15 This analysis showed a significant value of submaximal cardiopulmonary testing in the prediction of postoperative morbidity after major nontransplant surgery.
In the liver transplant population, the prediction of early postoperative mortality is particularly important. The appropriate allocation of limited donor organs and (more recently) marginal grafts is critical. The results of this study support previous studies that have investigated the use of CPET parameters in liver transplant patients and increase their clinical utility. The only other study evaluating the cardiopulmonary reserve in liver transplant patients used population-based normal values.16 These normal values have not been validated in hepatic or major surgical populations for the assessment of surgical risk, and this limits their use in routine care. In the present study, we chose to use logistic regression in combination with ROC curve analysis to investigate the predictive value of CPET for early liver transplant outcomes. We have demonstrated that in this cohort of patients, an AT value > 9.0 mL/minute/kg reliably predicted which patients had a low mortality risk. This was further improved when the ideal body weight was substituted for the actual body weight of patients with intractable severe ascites. The level of accuracy compares favorably with that of scores related to longer term mortality after transplantation (eg, MELD/UKELD scores).4 It is important to recognize that this value of 9.0 mL/minute/kg was determined from this cohort and was not set as an a priori measure of a high-risk or low-risk patient population. Interestingly, the 2 patients who did not conform to this AT cutoff value received organs with diametrically opposite donor ages. Because transfusion requirements (blood and FFP) and donor age were significantly related to outcomes in the univariate regression analysis, these variables may be relevant as potential markers of the severity of surgical procedures and donor organ quality, respectively. 
However, these 2 factors were not independently related to outcomes (as demonstrated by the multivariate analysis), and this further confirms the importance of preoperative cardiorespiratory function to outcomes.
An AT level of 11 mL/minute/kg has been used repeatedly to identify high-risk groups among patients with chronic heart failure20 and in multiple studies of perioperative mortality.13, 14 In comparison with patients with a good cardiorespiratory reserve, patients with a reduced cardiorespiratory reserve spent a mean of 5.3 days more in critical care. The association between a reduced preoperative reserve and a prolonged critical care stay is important in terms of informed consent and critical care resource allocation. Where postoperative complications are not measured in a structured manner, factors such as social and discharge arrangements consistently limit the hospital length of stay. Other important factors specific to transplantation that were not measured in this study, including immunosuppressive regimen stabilization, may explain the similar hospital lengths of stay in the present cohort. Despite these limitations, a higher AT seems to provide benefits that extend beyond survival alone and into the costs of managing patients after transplantation, and this should be explored further.
These findings may lead to interventions to improve patients' cardiorespiratory reserve before transplantation. Patients who are identified as high risk may benefit from the opportunity to improve their chances of survival through preoperative interventions such as alterations in drug therapy (increases or decreases in beta-blockers or statins), exercise therapy, and nutritional intervention. The findings may also inform postoperative care pathways, including the fast-tracking of low-risk patients through critical care areas and the focusing of outreach services on high-risk patients once they are discharged from critical care.
There are limitations to the present study. Although the findings are statistically significant, the low number of deaths means that the results could be altered by a small number of additional events. The information derived from this study would undoubtedly benefit from evaluation in a larger multicenter trial. Other studies have identified a number of variables associated with worse outcomes after liver transplantation.5-11 Interactions between the cardiorespiratory reserve and variables that were not measured in this study warrant further investigation.
In summary, our results demonstrate that patients with good cardiopulmonary reserve (according to an exercise test before liver transplantation) have a higher survival rate and use less critical care resources postoperatively. The role of the cardiopulmonary reserve as a potentially modifiable risk factor in liver transplant candidates requires further investigation.
The authors acknowledge Chris O'Neil for performing CPET and Joyce Curwen and Sandra Latimer (liver transplant coordinators) for their help with collating the data.