Recently, Walker and colleagues1 published a retrospective dual-center study of the impact of serum ferritin (SF) on the mortality of candidates for orthotopic liver transplantation (OLT). They found baseline SF levels greater than 200 μg/L to be an independent predictor of waiting-list mortality, and they showed that the addition of SF to Model for End-Stage Liver Disease (MELD) parameters increased prognostic accuracy. Because additional parameters that improve MELD-based organ allocation are generally desirable, the authors proposed incorporating SF into the MELD-based allocation system (as is currently being discussed for serum sodium2).
However, elevated SF not only reflects increased hepatic iron deposition but also indicates iron accumulation in extrahepatic sites. For example, cardiac iron deposition was found in transvenous endomyocardial biopsy samples of 64% of patients with substantial hepatic iron staining.3
Hence, impaired iron homeostasis not only may be predictive of preoperative mortality but also may have a negative impact on the outcome after OLT. Several studies have addressed this question by comparing post-OLT survival between patients with normal hepatic iron content and patients with hepatic iron overload in their explanted organs: Tung et al.4 reported significantly (P = 0.0009) reduced 5-year post-OLT survival of only 40% in 37 patients with hepatic iron overload versus 62% in age-matched controls. In 35 patients with hepatic iron overload (>40 μmol/g), the Queensland group5 found reduced (although not significantly so; P = 0.27) unadjusted 1- and 5-year post-OLT survival of 74% and 63% versus 80% and 72% in 178 patients with normal iron content (<40 μmol/g). In a multicenter study including 235 patients with hepatic iron overload not associated with hereditary hemochromatosis, Kowdley et al.6 observed reduced 5-year post-OLT survival of 63% versus 72% in the overall population undergoing OLT (P = 0.003).
Although these data suggest that elevated SF before OLT may also affect posttransplant outcomes, no study has so far specifically addressed this issue.
Therefore, we reviewed all adult patients who underwent transplantation at the Hannover Medical School Transplant Center between January 1, 2004 and March 30, 2008. We excluded patients with acute liver failure, recipients of living donor OLT or combined liver-heart or liver-lung transplants, and nine patients with hemochromatosis. Pretransplant SF levels were available for 92.2% of the remaining 346 patients. In a Kaplan-Meier analysis with a mean follow-up of 1535 days, survival was significantly reduced (61.1% versus 71.9%; P = 0.038, log-rank test) for patients with an SF level greater than or equal to 365 μg/L (Fig. 1).
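For readers less familiar with the method, the survival comparison above rests on the Kaplan-Meier product-limit estimator (group differences being tested by the log-rank test). A minimal sketch of the estimator is given below; the follow-up times and event indicators are invented for illustration only and are not drawn from our cohort.

```python
# Illustrative Kaplan-Meier product-limit estimator (sketch).
# times: follow-up in days; events: 1 = death, 0 = censored.
# All data below are hypothetical, NOT the Hannover cohort.

def kaplan_meier(times, events):
    """Return a list of (time, survival probability) at each observed event time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = removed = 0
        # Pool all subjects sharing this follow-up time.
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            removed += 1
            i += 1
        if deaths:
            # Product-limit step: multiply by (1 - deaths / number at risk).
            surv *= 1 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= removed  # deaths and censored subjects leave the risk set
    return curve

# Hypothetical example group (e.g., high-SF patients):
example_times  = [120, 300, 300, 800, 1500, 1500]
example_events = [1,   1,   0,   1,   0,    0]
print(kaplan_meier(example_times, example_events))
```

In practice one would compute such curves for both SF strata and compare them with a log-rank test, as implemented in standard statistical software rather than by hand.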
An ideal allocation parameter would predict waiting-list mortality while having little impact on posttransplant mortality, so that the benefit of OLT is maximized. SF, by contrast, reflects a variety of clinical problems that can limit outcomes and that are likely not all remedied by OLT.
In summary, the available data and our experience suggest that the use of SF as an additional allocation parameter may shift mortality to some degree to the period after OLT. Further studies should therefore analyze the influence of SF not only on waiting-list mortality but also on posttransplant outcomes.