Professor Dr A. K. Burroughs, Department of Liver Transplantation and Hepatobiliary Medicine, Royal Free Hospital, Pond Street, London NW3 2QG, UK. E-mail: firstname.lastname@example.org
Background and Aim The potential prognostic value for survival of nutritional status in cirrhotics, after adjusting for the Child–Pugh classification and the Model for End-Stage Liver Disease score, has not been evaluated.
Methods We used Kaplan–Meier and Cox proportional hazards regression models to identify factors associated with mortality in a cohort of 222 cirrhotics [M/F: 145/77, median age 52 (range 18–68) years] with prospectively collected nutritional parameters, including a modified subjective global nutritional assessment, the Royal Free Hospital-Subjective Global Assessment index. Follow-up was censored at the time of transplantation. Other variables evaluated were those in the Child–Pugh and Model for End-Stage Liver Disease scores, age, aetiology of cirrhosis and renal function.
Results Pretransplant mortality (Kaplan–Meier) was 21% by 2 years (135 patients were transplanted). Among the nutritional parameters, only the Royal Free Hospital-Subjective Global Assessment remained significantly associated with mortality in multivariable models (P = 0.0006). The final model included the following variables: urea (P = 0.0001), Royal Free Hospital-Subjective Global Assessment (P = 0.003), age (P = 0.0001), Child–Pugh grade (P = 0.009) and prothrombin time (P = 0.003). The results were similar when the Child–Pugh grade was replaced by the Model for End-Stage Liver Disease score in the model, and when a competing risks model was used.
Conclusions Nutritional indices add significantly to both the Child–Pugh grade and the Model for End-Stage Liver Disease score when assessing patient prognosis.
The Child–Pugh (CP) classification is a modification of the Child–Turcotte classification, which has been widely used as an index of disease severity for patients with end-stage liver disease since 1973.1 Whilst the original Child–Turcotte classification included nutritional status, this was replaced with prothrombin time (PT) in the CP classification.1–3 However, the presence of protein–calorie malnutrition has been shown to be associated with increased short- and long-term mortality in patients with acute and chronic liver disease.4–7 In the study by Alberino et al.,5 malnutrition was an independent predictor of survival, and the inclusion of mid-arm muscle circumference (MAMC) and triceps skinfold thickness (TST) improved the prognostic accuracy of the CP score. Thus, it is likely that nutritional status could be a useful addition to the CP classification when assessing the prognosis of cirrhotic patients. Abbott et al.3 investigated the relationship between the CP classification and nutritional indicators and found that advanced CP classification was associated with diminished muscle status and greater early post-operative morbidity after liver transplantation.
Although the CP classification has been the most widely used model for assessing prognosis in cirrhotics, its use is limited as individuals with very similar laboratory markers may be classified very differently using this score, and because of the subjective nature of measures for quantitation of ascites and encephalopathy.1, 2 Thus, an alternative classification for assessing prognosis, the Model for End-Stage Liver Disease (MELD), has been proposed.2 MELD can discriminate more effectively between those who are likely to die and those who will survive at least 3 months.2, 8–10 Similar to the CP classification, MELD does not incorporate any measures of nutritional status.
The identification of an optimal method of nutritional assessment in patients with cirrhosis is difficult because many of the traditionally measured parameters of nutritional status, such as weight, anergy panels and biochemical values, vary with the severity of liver disease independently of nutritional status.4 Subjective Global Assessment (SGA) uses clinical criteria to determine nutritional status without the use of objective measurements. It is more useful than objective measures alone for identifying individuals at nutritional risk because of its ability to encompass the multitude of factors influencing nutritional status.11–13 The SGA has been validated in liver transplant candidates.12, 13 We have evaluated a modified SGA, the Royal Free Hospital (RFH)-SGA index, which combines a subjective assessment of nutritional status with BMI, TST, MAMC and a subjective override.14, 15 In this index, patients were classified prospectively into three groups: (i) well nourished, (ii) mildly or moderately malnourished or (iii) severely malnourished, according to BMI, TST, MAMC and nutrient intake. Other factors, such as recent weight loss or severe steatorrhoea, which might have additional effects on nutritional status, are taken into account by incorporating a subjective override, which allows the assessor to change the nutritional class of a patient by a single category only. The potential prognostic value of adding nutritional status to either the CP or the MELD classification is not known. We aimed to investigate whether nutritional status, as assessed using the RFH-SGA, provides additional information to either the CP or MELD score when assessing the prognosis of cirrhotic patients.
A consecutive series of 222 hospitalized patients with chronic liver disease [145 (65%) male, 77 (35%) female, median age 52 (range 18–68) years] who were initially considered for orthotopic liver transplantation (OLT) at the Liver Transplantation and Hepatobiliary Unit of the RFH between 1994 and 2000 were included in the study. The diagnosis of cirrhosis was based on the medical history, physical examination, clinical-biochemical findings and liver biopsy.
The aetiology of liver disease was alcoholic in 84, chronic hepatitis B (CHB) in 19, chronic hepatitis C (CHC) in 50, primary biliary cirrhosis in 28, primary sclerosing cholangitis in 12, combined CHC and alcohol in 10, autoimmune cirrhosis in six and cryptogenic in 13. Hepatocellular carcinoma (HCC) superimposed on cirrhosis was diagnosed in 37 patients (12 of them alcoholic, 10 had CHC and 15 had CHB).
All patients had a systematic work-up on admission; all data were collected prospectively, and the analysis was performed retrospectively on these data. Clinical variables recorded were degree of encephalopathy (none, grade 1–2, grade 3–4), degree of ascites (absent, slight or moderate) based on clinical or ultrasound data, and a history of bleeding varices. The laboratory data collected included bilirubin, albumin, PT, International Normalized Ratio (INR), urea and creatinine, which were used to calculate the CP and MELD scores when the patients were haemodynamically stable; all markers were measured by established standard laboratory methods.
The standard SGA comprised a nutritionist's evaluation of height, weight (current, before illness, and weight range in the previous 6 months), nutritional history (appetite, intake, gastrointestinal symptoms), physical appearance (subjective assessment of fat loss, muscle wasting, oedema and ascites) and existing conditions (encephalopathy, infections, renal insufficiency). Based on this evaluation, patients were classified prospectively into three groups: (i) well nourished, (ii) mildly or moderately malnourished or (iii) severely malnourished. Recent dietary intake was assessed using an established diet history method16 supplemented, where necessary, with additional information from relatives, nursing staff and food record sheets. Details of dietary restrictions and oral, enteral or parenteral nutritional support were recorded. The data obtained were not intended to provide a quantitative evaluation of intake but rather to give an idea of the overall adequacy of the diet in relation to estimated requirements, which were assessed using Schofield's modification of the Harris–Benedict equations.17, 18 Intakes were categorized as adequate if they met estimated requirements, inadequate if they failed to meet estimated requirements but exceeded 500 kcal/day, or negligible if they provided <500 kcal daily.
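The intake categories above reduce to a simple rule. As a minimal sketch (function and parameter names are ours, not from the study protocol; the handling of an intake of exactly 500 kcal is our assumption, as the text leaves that boundary ambiguous):

```python
def categorize_intake(kcal_per_day: float, estimated_requirement_kcal: float) -> str:
    """Classify daily energy intake relative to estimated requirements.

    Follows the three categories described in the Methods; treating an
    intake of exactly 500 kcal as negligible is our assumption.
    """
    if kcal_per_day >= estimated_requirement_kcal:
        return "adequate"
    if kcal_per_day > 500:
        return "inadequate"
    return "negligible"
```

For example, a patient consuming 1200 kcal/day against an estimated requirement of 2000 kcal/day would be classed as having inadequate intake.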
A detailed nutritional assessment was performed by a single dietitian specialized in liver disease (SJ), and a separate record of the nutritional assessment was kept in addition to the hospital notes. Nutritional assessment included dry weight [estimated by subtracting the total paracentesis volume, 1 L = 1 kg], BMI, anthropometric indices (TST, mid-arm circumference [MAC] and MAMC),19, 20 body fat, weight change, dietary intake, the use of a dietary supplement and the RFH-SGA. Reference values were derived from Bishop's study21 according to age and gender. The RFH-SGA index adds BMI, TST and MAMC to the standard SGA, all of which are variables that can be measured objectively and reproducibly. In this index, as with the standard SGA, patients were classified prospectively into three groups: well nourished, mildly or moderately malnourished, and severely malnourished. The classification of patients into these three groups according to the RFH-SGA is summarized in Figure 1. Other factors, such as recent weight loss or severe steatorrhoea, which might have additional effects on nutritional status, were taken into account by incorporating a subjective override, which allows the assessor to change the nutritional class of a patient by a single category only, within the three described above.14, 15
The CP classification was determined by the method outlined by Pugh et al.1 and the MELD score was calculated as 6.43 + 9.57 × ln(creatinine) + 3.78 × ln(bilirubin) + 11.2 × ln(INR), rounded to the nearest integer.9
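The MELD calculation is straightforward to express in code. A minimal sketch, assuming creatinine and bilirubin in mg/dL as in the cited derivation; note that published implementations floor laboratory values below 1.0 at 1.0 before taking logarithms, a detail not stated in the text:

```python
import math

def meld_score(creatinine_mg_dl: float, bilirubin_mg_dl: float, inr: float) -> int:
    """MELD score as given in the Methods, rounded to the nearest integer.

    The bilirubin coefficient 3.78 follows the cited derivation (ref. 9).
    No flooring of values below 1.0 is applied here, as the text does not
    mention it.
    """
    raw = (6.43
           + 9.57 * math.log(creatinine_mg_dl)
           + 3.78 * math.log(bilirubin_mg_dl)
           + 11.2 * math.log(inr))
    return round(raw)
```

With all three inputs at 1.0 the logarithmic terms vanish and the score reduces to the rounded intercept.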
The primary endpoint of this study was all-cause mortality. For the analyses reported in this paper, patient follow-up was considered from the date of assessment until the date of death, transplantation or last follow-up visit, as applicable. Factors associated with survival in univariable analyses were identified using Cox proportional hazards regression. This approach considers outcome over each patient's entire follow-up, rather than arbitrarily classifying patients at a single time point (e.g. 3 months, 1 year) as is often done in logistic regression analyses. All factors associated with survival (P < 0.1) in univariable analyses were then included in a multivariable model using a two-stage selection procedure. Initially, demographic and liver-related parameters were selected for inclusion in the final model using a backwards selection procedure to identify the factors that were independently associated with survival. Once this model had been selected, each of the nutritional parameters was tested to see whether its inclusion significantly improved the fit of the model. All analyses were performed using the PHREG procedure in the statistical software package SAS. Demographic factors considered were age (continuous), gender and aetiology of disease. Severity of liver disease was assessed using bilirubin, albumin, INR, PT, CP score, urea and creatinine (all continuous markers), ascites (none/slight/moderate), encephalopathy (none/grade 1–2/grade 3–4), CP classification (A/B/C) and a history of bleeding varices (no/yes).
Finally, the nutritional parameters evaluated were weight, dry weight, height (estimated), BMI, MAC, MAMC, TST and weight change (all continuous), body fat (adequate/inadequate), dietary intake (adequate/inadequate/negligible), special diet (whether given advice on increasing calorie and/or protein intake) (no/yes), dietary supplement (no/yes) and RFH-SGA (well nourished/mildly or moderately malnourished/severely malnourished).
One of the limitations of studying mortality in a cohort such as this is that a large number of patients will undergo transplantation, which may then affect any relationships between potential prognostic markers and survival. In this analysis, in order to remove any effects of transplantation, all patient follow-up was right-censored at the time of transplant. Thus, we have considered the cause-specific hazard: the analysis focuses on pretransplant mortality only, and any deaths that occur after transplant are not included. However, this approach may underestimate the mortality rate in the group, as a high proportion of those who undergo transplantation would be likely to have died had they not been transplanted (thus transplantation acts as a ‘competing risk’ for mortality and the assumption of non-informative censoring may be violated). Use of a combined endpoint of death or transplant in this setting is of limited value, as the factors associated with transplant and mortality will differ and the results will be difficult to interpret. An alternative approach that attempts to take account of competing risks is to censor the follow-up of those undergoing transplant not at the time of transplant but at the end of the study;22 this is known as a competing risks model. The analyses were repeated using this approach; as the results were similar, only the results from the cause-specific models have been reported.
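The two censoring schemes can be made concrete. A minimal sketch under assumed column names (days from assessment to each event, NaN where the event did not occur); this is our illustration, not the study's code:

```python
import numpy as np
import pandas as pd

def follow_up_times(df: pd.DataFrame, study_end_day: float):
    """Return (time, event) pairs under the two schemes described above.

    Cause-specific: transplanted patients are censored at transplant.
    Competing risks: transplanted (pretransplant-death-free) patients are
    censored at the end of the study instead.
    Assumed columns: days_to_death, days_to_olt, days_to_last_visit.
    """
    olt = df["days_to_olt"]
    death = df["days_to_death"]
    exit_day = death.fillna(df["days_to_last_visit"])
    pretransplant_death = death.notna() & (olt.isna() | (death < olt))
    # cause-specific hazard: follow-up stops at transplant
    t_cs = np.fmin(exit_day, olt.fillna(np.inf))
    e_cs = pretransplant_death.astype(int)
    # competing risks: transplanted, event-free patients censored at study end
    t_cr = t_cs.where(olt.isna() | pretransplant_death, study_end_day)
    return (t_cs, e_cs), (t_cr, e_cs)
```

The event indicator is identical in both schemes (pretransplant death); only the censoring time for transplanted patients changes, which is why similar results across the two analyses support the non-informative-censoring assumption.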
Basic clinical details and nutritional parameters measured in the patients are summarized in Tables 1 and 2, respectively. Overall, 91 (42%) of the patients were well nourished, 90 (40%) were mildly or moderately malnourished and 37 (17%) were severely malnourished according to the RFH-SGA. Two of these patients, who were moderately malnourished, were reclassified as severely malnourished through the subjective override because of massive weight loss. Individuals who were severely malnourished were more likely to have moderate ascites (P = 0.0002) and higher CP scores (P = 0.007) compared with those who were well nourished or only mildly/moderately malnourished. Those with mild/moderate malnourishment were most likely to have grade 1–2 encephalopathy (P = 0.02), but there were no significant differences between the groups with respect to any of the other clinical parameters measured (P > 0.05 in each case). As expected, global nutritional status was significantly associated with all of the other nutritional parameters measured (P = 0.0001 for all variables), with the exception of height (P = 0.76) and whether or not the patient was receiving a special diet (P = 0.14).
Table 1. Clinical details of patients at initial assessment, including the Royal Free Hospital (RFH)-Subjective Global Assessment (SGA). Values included in the table are median (range) and n (%), as appropriate. [Table content not reproduced.]

Table 2. Nutritional parameters measured at initial assessment: number of patients, dry weight (estimated), body mass index (kg/m2), mid-arm circumference (cm), mid-arm muscle circumference (cm), triceps skinfold thickness (mm), receipt of a special diet and receipt of a diet supplement. Values included in the table are median (range) and n (%), as appropriate. [Table values not reproduced.]
Over the study period, 135 patients (61%) underwent OLT, a median of 146 days after assessment. Vital status was known for 217 individuals at the end of follow-up, of whom 48 (22%) had died; five patients were lost to follow-up. The median (range) time to death in these 48 patients was 155 (5–1825) days. Thirty-one deaths occurred in individuals who had not undergone OLT. In all patients, the last follow-up (or transplantation/death) occurred a median of 636 (range 0–2220) days after assessment. Fifty-one patients were alive at last follow-up without transplantation; 24 of these had alcoholic liver disease, and nearly 50% of them had abstained during follow-up. Overall, the Kaplan–Meier pretransplant death rate was 19%, 21% and 41% by 1, 2 and 6 years after initial assessment, respectively.
In univariable analyses of the non-nutritional parameters, older age [relative hazard (RH) 1.42 per 5 years older, 95% CI 1.19–1.69, P = 0.0001], presence of HCC [2.56 (0.99–6.63), P = 0.05], higher bilirubin [1.06 (1.02–1.09) per mg/dL higher, P = 0.0003], lower albumin [0.91 (0.86–0.96) per g/dL higher, P = 0.001], higher INR [1.15 (1.06–1.26) per unit higher, P = 0.0009], longer PT [1.10 (1.05–1.14) per second, P = 0.0001], higher urea [1.19 (1.13–1.25) per mmol/L, P = 0.0001] and higher creatinine [8.76 (4.05–18.94) per mg/dL higher, P = 0.0001] were all associated with shorter survival. In addition, the presence of either slight [8.2 (1.06–63.58), P = 0.04] or moderate [13.6 (1.82–101.65), P = 0.01] ascites and grade 3–4 encephalopathy [3.73 (1.25–11.18), P = 0.02] were also associated with shorter survival. Both the CP score [1.42 (1.2–1.67) per unit increase, P = 0.0001] and CP grade [7.00 (2.56–19.14) per increase in stage, P = 0.0002] were associated with shorter survival, but a history of bleeding varices and aetiology of liver disease were not. Multivariable analyses of the non-nutritional parameters identified higher CP grade, higher urea, older age and longer PT as being independently associated with poorer survival (Table 3). Creatinine levels were also significantly associated with survival, but creatinine was not included in the final model because of its strong collinearity with urea and the variability of creatinine measurements at high bilirubin values.23 The results from models that included the CP score rather than the grade were similar.
Table 3. Factors independently associated with mortality from multivariable models. Model including age and liver parameters only: CP score (per unit higher), urea (per mmol/L higher), age (per 5 years older) and PT (per second higher). Model including age, liver parameters and nutritional status: the same variables plus RFH-SGA (reference category: well nourished). [Table values not reproduced.]
In univariable analyses of the relationships between nutritional parameters and mortality, only weight (weight with ascites) [RH 1.04 (1.00–1.07) per kg greater, P = 0.04], height [1.66 (0.98–2.8) per metre taller, P = 0.06], adequate body fat [0.26 (0.08–0.83), P = 0.02], the receipt of a dietary supplement [3.32 (1.35–8.15), P = 0.009] and the RFH-SGA [3.05 (1.01–9.21) for those with mild/moderate malnourishment, and 6.39 (2.03–20.15) for those with severe malnourishment, P = 0.003] were associated with survival. Of these, only the RFH-SGA added significantly to the multivariable model containing the liver parameters (Table 3), with a poorer prognosis in those with the severest malnourishment [adjusted RH 5.26 (1.55–7.86)]. After adjusting for nutritional status, the RH associated with the CP grade was reduced although it remained significant, suggesting that some of the apparent relationship between CP grade and mortality could be explained by the poorer nutritional status of those with worse CP grades. However, all other parameters in the model were only marginally changed and remained significantly associated with mortality after adjusting for nutritional status. Cumulative pretransplant mortality (with transplantation as a censoring point), stratified by the RFH-SGA, is shown as a Kaplan–Meier plot (Figure 2).
When the analysis was repeated replacing the CP grade with the MELD score, very similar results were obtained for the RFH-SGA [mild/moderate malnourishment: 2.1 (0.68–6.49), severe malnourishment: 8.29 (2.5–27.45), P = 0.0001]. Whilst PT was no longer associated with survival in this model, the MELD score [1.12 (1.07–1.17), P = 0.0001], urea [1.08 (1.01–1.14), P = 0.02] and older age [1.65 (1.32–2.06), P = 0.001] were all significantly associated with mortality. Very similar results were obtained when the analyses were repeated after censoring those who underwent transplant at the end of the study rather than at the time of transplant (i.e. the competing risks model).
Nutritional assessment as defined by a validated protocol14, 15 provided additional prognostic information on outcome in cirrhotic patients beyond that provided by routinely measured clinical indicators (including CP grade, urea and PT). Specific individual nutritional variables, which have previously been found to correlate less well with malnutrition in cirrhotic patients than in other groups,4, 24, 25 did not significantly add to the prognostic information provided by the liver parameters, but were captured by the SGA, which is in effect a ‘nutritional review’ and has been shown to be more useful in identifying those at risk of malnutrition.12, 13 Although the assessment of this parameter is as subjective as that of ascites and encephalopathy, it is clinically recognized that malnutrition is important in the prognosis of cirrhotic patients,5–7, 26, 27 as well as in the outcome of liver transplantation.3, 4, 26 Thus, SGA was an independent predictor of outcome after transplantation,13 and SGA, as well as serum proteins,27 correlated with the severity of liver disease. Handgrip strength and MAMC were crucial nutritional parameters in detecting body cell mass depletion, which has been associated with adverse outcomes in patients with end-stage liver disease,6, 26 with handgrip strength being a better predictor than standard SGA or the prognostic nutritional index. Lower pre-operative handgrip strength was also associated with poor survival after OLT.4 Abbott et al.3 found that reduced muscle mass was associated with advanced CP classification and with increased and earlier post-OLT morbidity. The addition of MAMC and TST to the CP score in cirrhosis improved its prognostic accuracy in one study.5 One study reported that Shaw's risk score (which later incorporated a nutritional assessment) did not influence survival at 6 months after liver transplantation.28 Another reported a weak but statistically significant correlation between death and MAC after liver transplantation.29
Although the CP classification has been the most widely used tool for prognosis in cirrhotic patients, it uses discrete cut-offs to move from one class to the next, resulting in limited discriminatory ability.2, 8, 30 Even when the CP score, as opposed to the CP grade, is used, there are only 10 different points between the least sick (CP score = 5) and the most advanced (CP score = 15) potential transplant candidates.30 In contrast, the MELD score, which uses continuous measurements of its variables, may provide better discrimination of mortality over a 3-month period.2, 8, 30 Our analyses were repeated using the MELD score in place of the CP grade, with similar results.
In this study, both urea and creatinine, known markers of renal function, also added prognostic power to the CP classification and the RFH-SGA. In Papatheodoridis’ study, creatinine-modified CP was better than CP alone, but was similar to MELD for predicting survival in patients with decompensated cirrhosis.31 Renal function is a well recognized predictor of survival in patients with liver disease and of outcome following liver transplantation.22, 32–35 Nair et al.35 found that a creatinine clearance <40 mL/min at the time of transplant was associated with significantly lower short- and long-term graft and patient survival rates. Whilst the MELD score also incorporates serum creatinine as a parameter of disease severity for patients with chronic liver disease, we found that urea levels were also independently associated with outcome in analyses that incorporated the MELD score rather than the CP grade. Our study therefore confirms the importance of renal function for predicting the survival of patients with chronic liver disease.
There are some limitations that should be noted with our study. Firstly, the RFH-SGA contains a subjective measurement, and its assessment, as with ascites and encephalopathy, may differ amongst individual clinicians or centres. Nevertheless, the recognition of severe malnutrition vs. other states by the standard SGA does have reasonable discrimination, and has been validated in liver transplant candidates,12, 13 the cohort in which nutrition is usually assessed as a routine. Thus, despite the partly subjective nature of the RFH-SGA index, this study does provide evidence for nutrition as an independent prognostic parameter in late-stage cirrhosis and should stimulate therapeutic studies in this area, including its significance for survival after transplantation. Such studies would also be needed to test the generalizability of the RFH-SGA index when compared with the standard SGA index,12, 13 although the added parameters of BMI, TST and MAMC (RFH-SGA) are not subjective measurements. In addition, as MAMC is derived from MAC and skinfold thickness, there may be little to gain by using both TST and MAMC, but both were included in the original published RFH-SGA index,14, 15 which we used unchanged.
The second limitation is that the RFH-SGA index achieved statistical significance in the multivariable analysis only in the severely malnourished group and not in the less severely malnourished group, in which other variables appear to have ‘sufficient’ prognostic value. However, the significant association in the univariable analysis suggests that this statistical effect is related to sample size; this will need evaluation in a validation cohort. It is also possible that, with new reference values for the general population (those in Bishop's study21 are from 25 years ago), all the cirrhotic patients would appear more malnourished.
The third limitation is that our cohort was not sufficiently large to split randomly into a training sample and a test sample. Thus, our model has not been validated in this paper, but the fact that the model censoring at the time of transplantation and the competing risks model yielded similar results supports its robustness. Nevertheless, validation of these findings is particularly important, as the value of the RFH-SGA index may vary from centre to centre. We therefore encourage other authors to validate our findings in their own patient groups, particularly where these patients are in different settings. We could not find any correlation between the aetiology of liver disease and survival. The alcoholic cirrhotics in our cohort (42%) were active drinkers close to the first evaluation. Some became abstinent during follow-up, which could not be taken into account, as only variables recorded at, or prior to, the initial assessment were evaluated. However, despite the fact that these abstainers may well have improved their nutritional status with time, and thus did not fulfil the adverse prognosis suggested by their worse baseline nutritional assessment, our models showed that the baseline assessment of nutrition gave added prognostic value.
In conclusion, nutritional indices, particularly the RFH-SGA, as well as creatinine and urea, add significantly to existing markers of prognosis, such as the CP grade and MELD, when assessing prognosis in cirrhotic patients. We suggest that future studies assess nutritional indices together with the CP criteria, creatinine and urea, and also with MELD, as prognostic indicators for end-stage liver disease.