Management of major trauma haemorrhage: treatment priorities and controversies


Ross Davenport, Trauma Clinical Academic Unit, The Royal London Hospital, Whitechapel Road, London E1 1BB, UK.


The severely injured trauma patient often arrives in the emergency department bleeding, coagulopathic and in need of a blood transfusion. The diagnosis and management of these patients have improved substantially with a better understanding of acute traumatic coagulopathy (ATC). In the emergency setting, traditional laboratory coagulation screens are of limited use in the diagnosis and management of life-threatening bleeding. Whole blood assays, such as thrombelastography (TEG) and rotational thrombelastometry (ROTEM), provide a rapid evaluation of clot formation, strength and lysis. Rapid diagnosis of ATC and aggressive haemostatic transfusion strategies utilizing early high doses of plasma are associated with improved outcomes in trauma. At present there is no accurate guide for transfusion in trauma, and blood and clotting products are therefore administered on an empiric basis. Targeted transfusion therapy for major trauma haemorrhage, based on comprehensive and rapid measures of coagulation (e.g. TEG/ROTEM), may lead to improved outcomes while optimizing blood utilization. Evidence for the clinical application of TEG and ROTEM in trauma is emerging, with a number of studies evaluating their ability to diagnose coagulopathy early and facilitate goal-directed transfusion. This review explores current controversies and best practice in the diagnosis and management of major haemorrhage in trauma.

Uncontrolled haemorrhage is responsible for 40% of early in-hospital trauma mortality (Sauaia et al, 1995) and is a principal cause of preventable death (Gruen et al, 2006; Kauvar et al, 2006). Up to 25% of severely injured patients arrive at hospital with significant derangements in blood coagulation (Hess et al, 2008). Patients presenting with this Acute Traumatic Coagulopathy (ATC) have a mortality of 19–62% and significantly greater transfusion requirements, organ injury, septic complications and critical care stay (Brohi et al, 2003, 2007a; MacLeod et al, 2003; Maegele et al, 2007; Frith et al, 2010). Management strategies targeting ATC may allow significant improvement in outcomes (Gunter et al, 2008; Holcomb et al, 2008; Maegele et al, 2008). These ‘damage control’ resuscitation approaches require early identification of ATC to allow rapid activation of transfusion protocols. In the absence of rapidly available diagnostics, current ATC management relies on empiric transfusion strategies that are activated on the basis of clinical surrogates of haemorrhage or physician gestalt (Malone et al, 2006; Geeraedts et al, 2007; Holcomb et al, 2007).

Mechanism of haemorrhage following severe trauma

Active haemorrhage from direct injury to major blood vessels and organs produces hypovolaemic shock and exsanguination if not treated immediately. Trauma-Induced Coagulopathy (TIC) is multi-factorial in origin and includes processes such as dilution, acidaemia and hypothermia (Schreiber, 2005) (Fig 1). Recent studies have identified an early endogenous coagulopathy (ATC), which appears to be driven by the combination of tissue trauma and systemic hypoperfusion (Hess et al, 2008; Chesebro et al, 2009; Frith et al, 2010). ATC is characterized by systemic anticoagulation and hyperfibrinolysis, putatively through endothelial activation of Protein C (Brohi et al, 2007b, 2008; Davenport et al, 2011a). Uncontrolled haemorrhage, with associated physiological disturbances such as hypothermia and acidosis, complicated by the dilutional effects of intravenous fluid resuscitation and ongoing blood loss, may exacerbate ATC and produce TIC (Fig 1).

Figure 1.

 Multi-factorial drivers of trauma-induced coagulopathy. Acute Traumatic Coagulopathy (ATC).

Diagnostic challenges of acute traumatic coagulopathy

At present there are no rapidly available and validated coagulation assays that can reliably diagnose coagulopathy in the bleeding trauma patient (Segal & Dzik, 2005; Toulon et al, 2009; Curry et al, 2011). Traditional coagulation screening tests, such as prothrombin time (PT), partial thromboplastin time (PTT), platelet count and fibrinogen levels, are of limited value in acute haemorrhage. Laboratory-based assays in platelet-poor plasma (PT, PTT, fibrinogen) require standard processing, with results often not being available to the treating clinician for in excess of 30 min (Toulon et al, 2009; Davenport et al, 2011b). PT and PTT provide only partial information on clot initiation (Mann et al, 2003) and neither can quantify the relative activity of pro-coagulants versus anti-coagulants (Tripodi et al, 2009). Platelet counts are often normal in trauma and fail to provide any measure of the platelet dysfunction secondary to the physiological derangement evident in TIC (Hess et al, 2008). The Clauss fibrinogen assay is unable to assess the cross-linking of polymerized fibrin by factor XIII. None of these assays can evaluate the rate of clot propagation versus clot lysis or overall clot strength.

Conventional coagulation screens are poor predictors of the need for massive transfusion (MT) and have limited ability to direct on-going blood component therapy (Dzik, 2004; Kashuk et al, 2010; Davenport et al, 2011b). Most published guidelines for the diagnosis and management of major haemorrhage are based on arbitrary laboratory triggers for fresh frozen plasma (FFP) administration, e.g. PT >1·5 × normal (Stainsby et al, 2006; Rossaint et al, 2010). More recent work has demonstrated that a PT ratio (PTr) >1·2 is associated with worse outcomes and higher transfusion requirements, and therefore represents a more clinically relevant cut-off for the diagnosis of ATC (Frith et al, 2010). However, coagulation screens have questionable diagnostic sensitivity (Lucas & Ledgerwood, 1981; Ciavarella et al, 1987; Yuan et al, 2007) and predictive power (Miller et al, 1971; Counts et al, 1979) to guide transfusion, particularly when blood loss is rapid. The need for a global assessment of the clotting process in trauma patients has renewed interest in viscoelastic coagulation tests (VCTs) of whole blood, specifically thrombelastography and rotational thrombelastometry (Johansson et al, 2009).
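These cut-offs lend themselves to a simple illustration. The sketch below (Python; the function names are ours and it is illustrative only, not a clinical tool) encodes the PTr thresholds quoted above.

```python
# Illustrative encoding of the PT-ratio cut-offs quoted in the text:
# PTr > 1.5 x normal is the traditional laboratory trigger for FFP;
# PTr > 1.2 is the more clinically relevant cut-off for ATC.
# Function names are hypothetical; this is not a clinical tool.

def pt_ratio(patient_pt_s: float, mean_normal_pt_s: float) -> float:
    """Prothrombin time ratio: patient PT divided by mean normal PT."""
    return patient_pt_s / mean_normal_pt_s

def classify_by_ptr(ptr: float) -> str:
    """Classify a PT ratio against the cut-offs discussed above."""
    if ptr > 1.5:
        return "ATC (meets traditional FFP trigger)"
    if ptr > 1.2:
        return "ATC"
    return "no ATC by PTr criterion"
```

For example, a patient PT of 15·6 s against a mean normal of 12 s gives a PTr of 1·3: coagulopathic by the ATC definition, yet below the traditional transfusion trigger.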

Point-of-care diagnostics

Both Thrombelastography (TEG®; Haemoscope Corp, Niles, IL, USA) and Rotational Thrombelastometry (ROTEM®; Pentapharm GmbH, Munich, Germany) assess the viscoelastic properties of blood samples under low shear conditions to provide a graphical representation of clot formation and fibrinolysis. VCTs measure the ‘modulus of elasticity’ or ‘clot firmness’ during the transition of blood from a fluid to semi-solid state and exhibit good correlation with thrombin generation assays (Rivard et al, 2005; Johansson et al, 2008). Blood is incubated at 37°C in a series of heated cups with activators of coagulation (e.g. tissue factor, ellagic acid, kaolin). The relative contribution of fibrinogen and platelets to clot strength can be evaluated through inhibition of thrombocytes in ROTEM (FIBTEM) or via agonists in TEG (Platelet Mapping). In TEG, a pin attached to a torsion wire is immersed in an oscillating cup containing the blood sample. In ROTEM the cup is stationary and the oscillating pin is attached to an optical detector. As clot forms, the gap between cup and pin is bridged and the oscillation is transmitted from the cup to the pin (TEG), or impedes rotation of the pin (ROTEM). Motion is detected and a characteristic trace generated (Fig 2).

Figure 2.

 Haemostatic profile generated by TEG or ROTEM. Main viscoelastic coagulation test (VCT) parameters: (A) r time or clotting time (CT); (B) K time or clot formation time (CFT); alpha angle (α); clot amplitude (CA) at fixed time points, e.g. 10 min (CA10); maximum amplitude (MA) or maximum clot firmness (MCF); clot lysis (CL) or lysis (LY) at fixed time, e.g. 45 min (CL45 or LY45).

For more than 20 years, VCTs have been utilized in both cardiac surgery and liver transplantation to detect abnormal coagulation and guide transfusion (Luddington, 2005). More recent studies have evaluated the role of thrombelastography in bleeding trauma patients (Johansson et al, 2009; Leemann et al, 2010; Davenport et al, 2011b,c). Simpler tests of coagulation, such as Point of Care (PoC) PT instruments, have been appraised in trauma and bleeding surgical patients but have been found to have limited sensitivity at transfusion threshold values e.g. PTr >1·5 (Toulon et al, 2009). In a prospective study of 300 trauma patients there was reduced accuracy in patients with ATC (PTr >1·2) and only 71% agreement between laboratory and point of care PT values (Davenport et al, 2011b). The discrepancy between laboratory and PoC measures increases as the haematocrit falls, limiting the application of PoC PT measures in trauma haemorrhage (Davenport et al, 2011b).

Both VCTs can be operated by trained clinical staff and are able to deliver a dynamic display of clot initiation, quality and lysis. Maximum clot firmness is realized within 30–45 min but clot initiation and propagation parameters are obtained within a few minutes. VCT results are available in real-time at the bedside enabling the clinician to modify therapy accordingly. Intra-operator variability is low for both devices. One small UK multicentre study of VCTs with lyophilized plasma found TEG to have superior repeatability compared to ROTEM for CT and CFT (Jennings et al, 2007). These findings were not supported by a larger multinational ROTEM study of whole blood, which reported good precision (coefficient of variation <15%) for all test measurements (Lang et al, 2005) and <6% for a single centre study (Theusinger et al, 2010). Factors for consideration with any near-patient testing are training for clinical staff, frequency of maintenance and the need for robust quality assurance. VCT users need to fully engage with external quality assessment (EQA) providers, such as the UK National External Quality Assessment Scheme (NEQAS), to enable performance monitoring of the devices. In addition, suitable material must be made available by EQA providers to facilitate this process.

High staff turnovers and a relatively low incidence of TIC in some centres may limit clinician competency in the timely analysis of VCT graphs for bleeding trauma patients. Major trauma centres that treat a high number of severely injured patients are likely to develop expertise in the VCT-guided management and diagnosis of TIC similar to that seen in liver and cardiac units. Some hospitals have adopted a remote system, utilizing trained technicians in the laboratory to perform VCT (Johansson, 2007) with results transmitted in real time to monitors in the clinical environment for correlation. Educational programmes with training certification and telephone support from transfusion experts have been shown to assist physicians in the interpretation of VCT and the clinical implications of abnormal results (Johansson, 2007).

A major limitation of TEG in the emergency setting has been the sensitivity of the torsion pin to vibration and shock. The oscillating pin in ROTEM is believed to confer greater stability to the device, such that the UK Defence Medical Services have been able to deploy a machine in a forward medical unit in Afghanistan (Camp Bastion) with good effect (Doran et al, 2010). Central laboratory VCT processing provides the opportunity for high standards of quality control. The disadvantage is the inevitable delay in result availability, although this can be mitigated by satellite units within the Emergency Department (ED) or operating theatres. Remote VCT aids interpretation for the treating clinician and overcomes quality assurance problems of near-patient testing.

Management principles for life-threatening traumatic haemorrhage

Initial treatment of major trauma haemorrhage is directed towards restoring end organ perfusion; optimizing ventilation and oxygenation; correcting coagulation status and rapid control of haemorrhage – splintage, compression (limb tourniquets), surgical or endovascular procedures. Large volume resuscitation is associated with a transient rise in blood pressure, which may dislodge tenuous clot, precipitate dilutional coagulopathy and compound the hypothermic effects of intravenous fluid replacement. Precise resuscitation targets for patients without traumatic brain injury are controversial; for penetrating trauma (knife, gun) it is reasonable to aim for a systolic blood pressure (SBP) that is associated with cerebration (70–90 mmHg) until haemostasis is achieved (permissive or hypotensive resuscitation) (Bickell et al, 1994). Although the evidence base is weaker for bluntly injured shocked patients (Dutton et al, 2002), a similar approach is justified at least until definitive haemorrhage control is achieved.

Major trauma haemorrhage is typically associated with profound physiological derangement – tissue hypoxia, hypothermia, acidosis and coagulopathy. The conventional surgical approach of ‘early total care’ with anatomical fixation of all injuries may further exacerbate these problems and lead to a fully repaired but deceased patient (Shapiro et al, 2000). As a result, operative treatment of the multiply injured patient has undergone a paradigm shift in the last 25 years, with trauma surgeons utilizing a ‘damage control’ strategy. This is defined by an abbreviated resuscitative surgical procedure, i.e. a laparotomy consisting of rapid control of haemorrhage and contamination, temporary closure and resuscitation to normal physiology (in intensive care) before return to the operating theatre for definitive repair (Johnson et al, 2001). More recently the term Damage Control Resuscitation (DCR) has been used to describe aggressive control of hypothermia, acidosis and coagulopathy with early and high dose administration of clotting products (Holcomb et al, 2007). Outcomes in severely injured and shocked patients are superior where damage control surgery and resuscitation are utilized in combination (Duchesne et al, 2010), but this requires a collaborative approach from specialists in trauma surgery, critical care, anaesthesia and transfusion medicine to facilitate rapid reversal of abnormal physiology.

Viscoelastic coagulation tests for diagnosis of ATC

The speed of detection, whole blood assay and evaluation of clot dynamics make VCT an attractive option for the rapid identification of ATC. Trauma patients with ATC (defined as PTr >1·2) have a ‘signature’ thrombelastogram (Fig 3A). Compared to patients with PTr ≤1·2, the ATC trace is characterized by a reduction in clot strength with much smaller changes in laboratory clotting times, and can be identified within 5 min using a threshold of clot amplitude at 5 min (CA5) ≤35 mm (Davenport et al, 2011b). ATC appears to be primarily a problem of clot strength, adding further weight to the case against using traditional clotting screens that only measure the initial stages of fibrin formation. Earlier animal and human studies comparing VCTs with coagulation screens have shown similar correlations. In a porcine model of hypothermia and haemorrhage, TEG was shown to be superior to conventional clotting assays in differentiating clotting abnormalities (Martini et al, 2008). PT and PTT were less sensitive, whereas Activated Clotting Time was not specific, in detecting coagulation defects in comparison to TEG. Hagemo et al (2010) directly compared TEG with ROTEM in a small group of trauma patients in the ED. The authors found poor correlation between the two devices, and ROTEM parameters were better correlated with standard clotting assays (Clotting Time with International Normalized Ratio; Clot Formation Time with fibrinogen and PTT; alpha angle with platelets and fibrinogen). Platelet count was well correlated with both TEG maximum amplitude (MA) (r = 0·44) and ROTEM maximum clot firmness (MCF) (r = 0·53).
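The 5-min screen described above reduces to a single threshold comparison. A minimal sketch, assuming the CA5 ≤35 mm cut-off from Davenport et al (2011b); the helper name is ours:

```python
# Rapid ROTEM-based ATC screen as described in the text: a clot amplitude
# at 5 minutes (CA5) of 35 mm or less flags the ATC 'signature' trace.
# Hypothetical helper, for illustration only.

CA5_ATC_THRESHOLD_MM = 35.0

def atc_screen_positive(ca5_mm: float) -> bool:
    """True if the 5-min clot amplitude meets the ATC threshold."""
    return ca5_mm <= CA5_ATC_THRESHOLD_MM
```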

Figure 3.

 Coagulopathic traces produced by ROTEM in acute traumatic coagulopathy (ATC). (A) ATC signature trace: comparison of ROTEM traces from 300 trauma patients with PTr >1·2 (ATC) vs. PTr ≤1·2 (non-ATC) (Davenport et al, 2011c). (B) Hypocoagulability. (C) Hyperfibrinolysis. (B, C) Traces seen in ATC demonstrating slower clot formation and reduced clot strength (B, hypocoagulability) and rapid clot breakdown (C, hyperfibrinolysis).

Doran et al (2010) showed, in a small cohort of severely injured combat casualties, that admission samples analysed by ROTEM were more sensitive than PTT and PT in detecting laboratory defined coagulopathy [PT >18 s, activated partial thromboplastin time (APTT) >60 s]. In patients who received a MT, defined as 10+ units of packed red blood cells (PRBCs), the PT or PTT was abnormal in 21% of cases compared with 64% for ROTEM. Using British National Blood Service Guidelines (Stainsby et al, 2006) only 10% of patients who required a MT had prolongation of PT or PTT, which met diagnostic criteria for coagulopathy. Of note the authors highlighted the potential use of initial ROTEM parameters as early indicators of poor clot strength. Seven out of nine patients with an abnormal MCF had reduced clot amplitude at 10 min (CA10) and no patient with a low CA10 had a normal MCF.

An earlier, larger study of 90 civilian trauma patients with a 28% incidence of ATC (PT >1·5 × normal) (Rugeri et al, 2007) showed that ROTEM can detect early coagulation abnormalities in the severely injured, with hypocoagulable traces (Fig 3B). The authors reported a significant relationship between PT and clot amplitude at 15 min, and between APTT and the maximum rate of clot formation. In addition, there was good correlation between fibrinogen level and CA10 in the platelet-inhibited assay (FIBTEM). Furthermore, investigators were able to show evidence of hyperfibrinolysis (Fig 3C), with a concomitant increase in D-dimers, within a subgroup of the coagulopathic patients (Levrat et al, 2008). The investigators reported good correlation between euglobulin lysis times (a crude marker of clot breakdown) and ROTEM parameters of clot strength and lysis in five patients.

A larger study that analysed the ROTEM pattern of hyperfibrinolysis (clot lysis >15%) in 33 trauma patients found an association between degree of clot lysis, injury severity and mortality (Schochl et al, 2009). Fulminant hyperfibrinolysis (complete clot breakdown <30 min) can be detected early but intermediate or delayed hyperfibrinolysis may take 30–60 min to be visible. In the absence of routine plasma assays of elevated fibrinolytic activity, these findings are of interest. Further research is required to identify initial test parameters associated with hyperfibrinolysis if VCTs are to serve as early triggers for anti-fibrinolytic therapy. Early coagulopathy following severe injury appears to give rise to a hypocoagulable VCT trace with evidence of excessive activation of the lytic pathway in select cases although the true incidence of hyperfibrinolysis in ATC has yet to be established.

The strongest and most consistent evidence comparing VCT analysis and routine laboratory tests is the correlation between clot strength (MA/MCF) and the platelet count (Kang et al, 1985; Orlikowski et al, 1996; Plotkin et al, 2008). In vitro platelet depletion studies using ROTEM have shown a strong correlation of platelet count with maximum velocity of clot formation (Sorensen & Ingerslev, 2004). Other platelet studies evaluating thrombin generation by ROTEM and thrombin/antithrombin complex formation (Rivard et al, 2005) suggest that VCTs may be a good surrogate of endogenous thrombin potential, i.e. overall coagulation capability. VCTs may not be able to accurately assess platelet activation and aggregation but may be a useful guide to the adequacy of both platelet count and function in major trauma.

Global tests of coagulation are candidates to be new gold standards in haemostasis, particularly in the acute setting of a multi-factorial coagulopathy such as TIC. Platelet-poor plasma assays that evaluate only the earliest stages of clot initiation provide limited information on clot dynamics. VCT is a good marker of thrombin generation, capable of evaluating clot formation and breakdown in their entirety. Direct comparison between TEG and ROTEM may not be possible, and each device requires separate validation as a diagnostic tool in trauma haemorrhage. Whole blood assays provide more information on coagulation but cannot assess the important contribution of endothelial injury to ATC. VCTs may offer greater scope for rapid detection of ATC and TIC than conventional measures of coagulation, subject to large studies fully assessing their reproducibility in trauma resuscitation.

Transfusion strategies for trauma haemorrhage

Massive transfusion and plasma ratios

Eighty percent of all trauma deaths that occur in the operating theatre are the result of haemorrhagic shock and exsanguination (Kauvar et al, 2006). Large volumes of PRBC and clotting product replacement are required to maintain oxygen carrying capacity and augment clot formation. No prospective randomized controlled trial (RCT) has compared restrictive and liberal transfusion regimens in trauma, but current guidelines recommend a target haemoglobin of between 70 and 90 g/l (Rossaint et al, 2010). Where possible, cell salvage should be employed to facilitate auto-transfusion, but few centres are able to offer robust and timely provision of this service for trauma resuscitation. Definitions of MT vary (10+ units PRBC/24 h, 100% blood loss in 24 h, blood loss of 150 ml/h) and lack supporting clinical evidence. Rates of MT in civilian trauma practice are <5%, but these transfusions account for 75% of blood utilization in busy Level 1 trauma centres (Como et al, 2004).

Transfusion medicine has undergone a paradigm shift in recent years, following numerous reports of improved outcomes with early high dose plasma therapy (Stansbury et al, 2009). Massive-transfusion protocols (MTPs) were designed to address the dilutional complications resulting from large volumes of PRBCs and were therefore reactive to ATC. Until recently, most published guidelines advocated the sequential replacement of PRBCs with crystalloids prior to clotting products and platelets, and only after a laboratory-confirmed derangement, e.g. PT >1·5 × normal, platelet count <50 × 10⁹/l (Stainsby et al, 2006; Rossaint et al, 2010). An international multi-centre retrospective analysis of over 3000 trauma patients was unable to identify a definitional threshold at which MT specifically results in worse outcomes (Stanworth et al, 2010). Patients receiving 6–9 units of PRBCs had nearly 2·5 times the mortality of patients receiving 0–5 units. Management strategies targeted at patients receiving a threshold of 10 or more PRBC units would therefore exclude a large proportion of patients who receive fewer transfusions but still have a significant mortality. The concept of DCR has been introduced to expeditiously address rapid exsanguination and ATC through targeted delivery of blood components in predefined ratios (Holcomb et al, 2007). DCR describes a formulaic protocol activated on the basis of clinical need, not dependent on laboratory triggers (e.g. PT results) that delay initiation of therapy. Massive Haemorrhage Protocols (MHPs) conceptually better describe algorithms that deliver blood and clotting products immediately and simultaneously (not following a predefined number of PRBC transfusions) to achieve haemostatic control.

Computer modelling of major haemorrhage and MTPs was the first to question the rationale and efficacy of restrictive transfusion practice. Ho et al (2005) demonstrated excessive coagulation factor dilution with delayed low dose FFP that required at least 1:1 therapy to reverse the coagulopathy. Using an alternative simulation, Hirshberg et al (2003) reported optimal replacement ratios of 2:3 and proposed concurrent use of plasma with initial PRBC transfusion to prevent prolongation of clotting times. A large number of retrospective clinical studies have supported the results of these theoretical models with survival benefits seen following the adoption of aggressive transfusion strategies although the quality of evidence is limited (Murad et al, 2010).
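The logic of these simulations can be conveyed with a deliberately crude exchange-transfusion model. The sketch below is our own illustration, not a reproduction of the cited models: it assumes a constant circulating volume, a fixed fraction of that volume exchanged per step, PRBC units carrying no clotting factors and FFP carrying 100% of normal factor activity.

```python
# Toy dilution model of clotting-factor activity during blood replacement.
# Assumptions (ours, for illustration): constant circulating volume; each
# step exchanges a fixed fraction of that volume; PRBCs contribute no
# clotting factors; FFP contributes factors at 100% of normal activity.

def factor_activity(n_steps: int, step_fraction: float,
                    ffp_fraction: float) -> float:
    """Fraction of normal clotting-factor activity after n exchange steps.

    step_fraction -- fraction of circulating volume exchanged per step
    ffp_fraction  -- FFP share of each replacement volume
                     (0.5 approximates 1:1 FFP:PRBC; 0.0 is PRBC only)
    """
    activity = 1.0
    for _ in range(n_steps):
        # Shed blood carries factors at the current activity; only the
        # FFP component of the replacement carries factors.
        activity = activity * (1 - step_fraction) + ffp_fraction * step_fraction
    return activity
```

In this toy model, PRBC-only replacement decays geometrically towards zero factor activity, whereas any fixed FFP fraction pulls the steady state up to that fraction, echoing the conclusion of the cited models that delayed, low dose FFP cannot reverse dilutional coagulopathy.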

Borgman et al (2007) were the first to show an association between the ratio of FFP:PRBCs and mortality. Casualties who received ratios of <1:4 had a 65% mortality rate compared to 19% in those who received >2:3 FFP:PRBC. Similar analyses have been replicated using both military and civilian trauma registries, the majority of which have advocated ratios approaching 1:1 (Rossaint et al, 2010). Survivor bias is a major confounding factor in these retrospective studies (Scalea et al, 2008; Snyder et al, 2009), as patients have to live long enough to receive a high ratio. Interestingly, recent work has shown maximal benefit with ratios of between 1:3 and 2:3 FFP:PRBC, with no additional outcome benefit from 1:1 transfusions (Kashuk et al, 2008; Davenport et al, 2011c). Formula-driven MTPs are becoming commonplace in the management of trauma haemorrhage (Callum et al, 2009); however, in the absence of any prospective RCTs, the optimal ratio of FFP:PRBC for DCR remains unclear. The PROspective Plasma and Platelet ratio (PROPPR) study is a large multicentre RCT due to begin in North America with the aim of addressing the issue of blood product ratios in trauma. In an era where intervention strives to be goal-directed and individualized, it remains to be seen whether a universal approach to resuscitation in traumatic haemorrhage is valid.

Fresh frozen plasma

Fresh frozen plasma has been in use since the 1940s for the correction of coagulopathy associated with haemorrhage; however, the evidence for its efficacy in managing massive haemorrhage is limited (Stanworth et al, 2004). Furthermore, its use for emergency transfusion is complicated by the 10–15 min delay incurred whilst the units are thawed. Some centres have moved to ready-thawed AB plasma to facilitate rapid provision in the ED and operating theatres for exsanguinating patients requiring high dose therapy. FFP contains all the clotting factors, fibrinogen (400–900 mg/unit), physiological anticoagulants (e.g. protein C and protein S) and plasma proteins (O’Shaughnessy et al, 2004), but precise quantities vary as there is no standardization for individual units. Large quantities are recommended (10–20 ml/kg) to achieve an effect in major bleeding, but there are no published evidence-based guidelines (Stanworth et al, 2007). Newer treatment algorithms proposing FFP at ratios of 1:1 with PRBC necessitate higher than previously administered doses, which are not without risk (Gonzalez et al, 2007; Sperry et al, 2008; Watson et al, 2009). Rates of immunomodulation and nosocomial infection are higher in transfused trauma patients (Moore et al, 1997; Sarani et al, 2008), and in one study FFP was found to be an independent risk factor for acute lung injury and multi-organ failure (Watson et al, 2009). However, some authors have questioned whether a more liberal transfusion policy actually controls bleeding earlier and therefore reduces exposure to blood and blood products (Zink et al, 2009).
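The quoted weight-based dose converts into whole units as follows; a sketch with an assumed average unit volume (FFP units are not standardized, but UK units are commonly around 250–300 ml) and hypothetical names:

```python
import math

# Convert the 10-20 ml/kg FFP dose quoted in the text into whole units.
# The unit volume is an assumption (FFP units are not standardized and
# commonly range from roughly 250 to 300 ml); names are hypothetical.

ASSUMED_FFP_UNIT_ML = 275.0

def ffp_dose_units(weight_kg: float, dose_ml_per_kg: float = 15.0) -> int:
    """Round the weight-based plasma dose up to whole units."""
    return math.ceil(weight_kg * dose_ml_per_kg / ASSUMED_FFP_UNIT_ML)
```

On these assumptions, an 80 kg patient at the mid-range 15 ml/kg dose would need five units.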

Increased demands on both the availability and quantity of FFP are likely to impact heavily on transfusion services. Furthermore, high dose FFP therapy may theoretically potentiate ATC. The endogenous coagulopathy is putatively mediated through the Protein C pathway via the thrombin switch mechanism (Brohi et al, 2007a; Davenport et al, 2011a). Thrombin production is driven by the assembly and cleavage of clotting factors on the platelet membrane. The addition of clotting factors to the system via plasma transfusion may increase activated Protein C production as more thrombin is complexed to thrombomodulin. The rapid uptake of newer transfusion strategies without high-level evidence should therefore be treated with caution pending definitive aetiological studies and prospective confirmation of clear outcome benefits from high dose (1:1) plasma therapy. Nevertheless, current evidence supports at least a more liberal approach to FFP transfusion, with improved outcomes compared to traditional restrictive practices.


Platelets

Existing guidelines for platelet therapy in severe haemorrhage are based on the platelet count, with consensus to maintain >50 × 10⁹/l in polytrauma or >100 × 10⁹/l in central nervous system injury (Davenport & Brohi, 2009). MT will undoubtedly cause dilutional thrombocytopenia (Hiippala, 1998) but some studies have shown that the platelet count is maintained in the early stages following severe injury (Brohi et al, 2007b; Borgman et al, 2011). Most current resuscitation algorithms advocate the initial use of PRBCs, followed by FFP and platelets only in patients who have received >6 PRBC units, display evidence of ongoing bleeding and have documented derangement in clotting function (PT/APTT >1·5 × normal).

In agreement with previous reports of high dose plasma therapy in trauma, a recent study has shown similar benefits with aggressive use of platelet transfusions. Holcomb et al (2008) analysed data from 466 civilian patients admitted to 16 US trauma centres who had received a MT. Plasma:RBC and platelet:RBC ratios were independent predictors of death at 6 h, 24 h and 30 d. Patients with high platelet:RBC ratios (≥1:2) had significantly increased 30-d survival compared to those with low platelet:RBC ratios (59·9% vs. 40·1%). In addition, the combination of high plasma and high platelet to RBC ratios was significantly associated with increased intensive care-, ventilator- and hospital-free days. Gunter et al (2008) found similar results in a retrospective comparison of outcomes in 259 MT patients pre- and post-implementation of a trauma exsanguination protocol. Patients receiving a platelet:RBC ratio ≥1:5 had significantly lower 30-d mortality compared to those with lower ratios (38% vs. 61%). Although convincing, the current evidence for increased plasma and platelet:RBC ratios is all retrospective and therefore at risk of confounding. The rate of bleeding and the precise temporal ratio of blood component transfusion (i.e. within the first 30 min, first hour etc.) may be important determinants of outcome (Geeraedts et al, 2007; Kashuk et al, 2008; Snyder et al, 2009). In addition, the role of the plasma contained within platelet transfusions was not addressed by any of these studies and may account for some of the reductions in mortality and morbidity.


Fibrinogen

Fibrinogen levels appear to be maintained during ATC (Brohi et al, 2007a), although there is controversy over the critical level for adequate haemostasis (Meyer et al, 2011). Profound acidosis, severe hypothermia (<32°C) and haemodilution have been shown to lower fibrinogen (Fries & Martini, 2010), although in the early post-injury ATC phase of TIC these factors are not significant. Studies in other surgical fields have shown increased blood loss when fibrinogen levels fall below 1·5–2 g/l (Blome et al, 2005; Charbit et al, 2007). The lack of Level 1 evidence for fibrinogen replacement parallels that of FFP ratios, with no RCTs to date. One retrospective review of 252 combat casualties showed that, in patients receiving a MT, high fibrinogen:PRBC ratios (≥0·2 g fibrinogen per unit of PRBC transfused) were independently associated with improved survival rates (Stinger et al, 2008).

Two observational studies have examined outcomes in trauma patients receiving fibrinogen and prothrombin complex concentrates (PCC) guided by clot strength on ROTEM. In the first, a retrospective analysis of 131 patients, all of whom received ≥5 units of PRBC, observed mortality was significantly lower than that predicted by trauma scoring methodology (24% vs. 34%) (Schochl et al, 2010). It is impossible to determine the true impact of this particular haemostatic treatment algorithm, as a number of confounding factors may have produced the favourable survival rate, e.g. prompt coagulation assessment with ROTEM and changing transfusion practice over the duration of the 4-year study. The second study, which compared trauma patients who received FFP (≥2 units) with those treated with fibrinogen and/or PCC (no FFP), found reduced PRBC and platelet requirements in the fibrinogen-PCC group. Of the patients who did not receive FFP, 29% avoided any allogeneic blood products. Speed of initiation of haemostatic therapy may be a significant confounding influence, as there is no thaw time associated with fibrinogen or PCC, which can typically be administered in <10 min. The relevance of this to UK practice is limited at present, as fibrinogen concentrate is not currently licensed for clinical use. Further in vitro study and clinical trials are warranted, as any avoidance of exposure to blood products is clearly of benefit to the patient and to transfusion practice as a whole.

Current guidelines for fibrinogen replacement in trauma are unclear and lack a strong evidence base; however, most suggest a critical value of <1 g/l (O’Shaughnessy et al, 2004; Rossaint et al, 2010) and recommend supplementation with either fibrinogen concentrate (3–4 g) or cryoprecipitate (50 mg/kg or 15–20 units). Of note, Lang et al (2009) have shown that thrombocytopenic patients demonstrate greatly improved clot strength with fibrinogen concentrate, which could potentially mitigate some of the problems associated with platelet transfusion (Norda et al, 2006). Further work is required to ascertain the true effect of fibrinogen replacement. Clinical studies in isolation are unlikely to answer this question, as MTPs typically use platelets and FFP, both of which contain fibrinogen.
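The weight-based cryoprecipitate figure above implies a simple dose calculation. The following sketch is purely illustrative (the 70 kg patient weight is a hypothetical example, not from the text); it shows that 50 mg/kg is consistent with the 3–4 g fibrinogen concentrate dose quoted in the guidelines:

```python
def cryoprecipitate_fibrinogen_dose_g(weight_kg, dose_mg_per_kg=50.0):
    """Total fibrinogen dose in grams for a weight-based regimen (mg -> g)."""
    return weight_kg * dose_mg_per_kg / 1000.0

# For a hypothetical 70 kg adult, 50 mg/kg gives 3.5 g of fibrinogen,
# in line with the 3-4 g fibrinogen concentrate dose cited above.
print(cryoprecipitate_fibrinogen_dose_g(70))  # 3.5
```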

Adjuncts to massive transfusion

The immediate availability, standardized concentrations, low volume and superior safety profile with respect to pathogen transmission and transfusion-related acute lung injury make factor concentrates and other pharmacological agents an attractive option. When first used for traumatic haemorrhage, recombinant activated Factor VII (NovoSeven; Novo Nordisk, Copenhagen, Denmark) was initially hailed as a magic bullet against ATC, with numerous anecdotal reports of its efficacy. However, two large RCTs failed to show any outcome benefit, demonstrating only modest reductions in blood usage (Boffard et al, 2005; Hauser et al, 2010). Desmopressin has produced equivocal results in the treatment of microvascular bleeding following cardiac surgery; with no studies evaluating its effect in trauma patients, it is not currently recommended for routine use (Rossaint et al, 2010). PCC has been shown to reduce blood product requirements in surgical patients (Bruce & Nokes, 2008) but is not currently recommended or licensed for use in trauma patients due to the potential risk of thromboembolism (Rossaint et al, 2010). Newer products containing coagulation inhibitors appear to have an improved safety profile, and with dynamic monitoring of coagulation status the thrombotic risk may be reduced to acceptable levels (Sorensen et al, 2011). Further study is required to weigh the real risk of thrombotic complications against the need for rapid and effective correction of coagulopathy in traumatic haemorrhage.

Other pharmacological agents, such as the anti-fibrinolytic tranexamic acid (TXA), have been comprehensively shown to reduce blood loss in elective surgery (Henry et al, 2007). The recent multicentre CRASH-2 (Clinical Randomization of an Antifibrinolytic in Significant Haemorrhage) trial, which evaluated TXA in over 20 000 cases of bleeding trauma, found a significant reduction in the risk of death (relative risk 0·85) in those receiving the drug (Shakur et al, 2010). Subgroup analysis demonstrated that TXA was most effective in patients with SBP <75 mmHg, as might be expected, since this cohort of patients is likely to have maximal activation of fibrinolysis. The precise mechanism of action in trauma haemorrhage is unclear, but intuitively it seems that timely administration of TXA for ATC would switch off hyperfibrinolysis in the early stages, augment clot formation and reduce blood requirements. A subsequent study from the CRASH-2 collaborators has shown that early treatment (≤1 h from injury) significantly reduced the risk of death due to bleeding (5·3% in the TXA group vs. 7·7% in the placebo group), with similar findings between 1 and 3 h but not when treatment was given >3 h after injury (Roberts et al, 2011).
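The early-treatment subgroup figures quoted above can be converted into the standard effect measures. The sketch below is illustrative only (the trial publications remain the authoritative source); it derives the relative risk, absolute risk reduction and number needed to treat from the 5·3% vs. 7·7% mortality figures:

```python
def risk_metrics(risk_treated, risk_control):
    """Relative risk, absolute risk reduction and number needed to treat."""
    rr = risk_treated / risk_control        # relative risk
    arr = risk_control - risk_treated       # absolute risk reduction
    nnt = 1.0 / arr                         # number needed to treat
    return rr, arr, nnt

# Death due to bleeding with early (<=1 h) treatment: 5.3% TXA vs. 7.7% placebo
rr, arr, nnt = risk_metrics(0.053, 0.077)
print(round(rr, 2), round(arr, 3), round(nnt))  # 0.69 0.024 42
```

On these figures, roughly 42 patients would need early TXA to prevent one death from bleeding.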

Prediction of massive transfusion in trauma

Major haemorrhage in trauma is a logistical challenge, and to be effective MHPs must be activated early. Clinical parameters are generally of low predictive value, whilst coagulation screens have limited application in patients with rapid blood loss. DCR with defined MHPs is in use across many centres, but the activation of such protocols is user-dependent with little standardization (Malone et al, 2006). Scoring systems have been developed in an attempt to guide the activation of MHPs and reduce wastage [McLaughlin et al, 2008; assessment of blood consumption (ABC; Nunez et al, 2009); trauma-associated severe haemorrhage (TASH; Yucel et al, 2006)]. All require laboratory results, injury severity scores or complex calculations, which restricts their predictive power in the early stages (first 15 min) of trauma resuscitation. Comparison of three such scoring systems (TASH, ABC and McLaughlin) reveals varying levels of precision, with between 66% and 88% of patients correctly classified. Inaba et al (2010) have shown that inappropriate activation of MHPs may cause harm: higher levels of acute lung injury, multi-organ failure and sepsis were recorded in patients who did not receive an MT but had received high volumes of plasma (false positives for MHP activation). Predictive models may be of use in distinguishing patients who are likely to derive survival benefit from DCR from those at risk of transfusion-associated complications (Borgman et al, 2011). MT as a concept in trauma has limited utility, and emphasis should instead be placed on identifying patients with massive haemorrhage. The detection of ATC, rather than surrogates of haemorrhage, using a global test of coagulation may serve as the optimal trigger for initiation of MHPs.
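As an illustration of how such a trigger score works at the bedside, the sketch below implements the ABC score as commonly described (one point each for penetrating mechanism, SBP ≤90 mmHg, heart rate ≥120 bpm and a positive FAST scan, with ≥2 points triggering MHP activation). These cut-offs are taken from the published description of the score (Nunez et al, 2009) rather than from the text above, so treat them as assumptions:

```python
def abc_score(penetrating_mechanism, systolic_bp_mmHg, heart_rate_bpm, fast_positive):
    """ABC (assessment of blood consumption) score: one point per criterion, range 0-4.
    Cut-offs are assumptions based on the commonly published description."""
    score = 0
    score += bool(penetrating_mechanism)
    score += systolic_bp_mmHg <= 90
    score += heart_rate_bpm >= 120
    score += bool(fast_positive)
    return int(score)

def activate_mhp(score):
    """Score >=2 is the usual threshold for activating the MHP."""
    return score >= 2

# Hypothetical example: hypotensive, tachycardic blunt-trauma patient, positive FAST
s = abc_score(False, 85, 130, True)
print(s, activate_mhp(s))  # 3 True
```

Note that the score needs no laboratory results, which is exactly what makes it usable in the first 15 min of resuscitation, at the cost of the limited precision discussed above.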

Only a handful of hospitals use VCT to trigger MHPs in trauma (Fries et al, 2009; Stahel et al, 2009; Theusinger et al, 2009); plasma, platelets and fibrinogen are therefore usually administered on an empirical or clinical basis (Malone et al, 2006; Hoyt et al, 2008). In combat casualties, Plotkin et al (2008) found that TEG may be a more sensitive test of coagulation than standard assays, with better correlation with blood product use. Comparable findings with ROTEM have subsequently been reported by the British military in Afghanistan (Doran et al, 2010). The original small retrospective study of 44 patients with penetrating injury assessed TEG parameters a median of 4·5 h after admission (Plotkin et al, 2008). In the prediction of transfusion requirements TEG was superior to PT and PTT, but it is difficult to translate these findings into civilian practice given the unique case mix. Potential confounders include time to surgical haemorrhage control, intravenous fluids and blood component therapy before TEG analysis, none of which were reported. A recent single-centre study of 161 severely injured civilian patients reported significant differences in Platelet Mapping (MAADP) values between survivors and non-survivors, and between those requiring transfusion and those who did not (Carroll et al, 2009). The significance of this finding is unclear, as the authors failed to report the relative contribution of fibrinogen to clot strength or provide specific details of transfused blood products, and surprisingly were unable to demonstrate any association of MA or MAADP with platelet count.

Recently, we have taken steps to validate ROTEM in a study of 300 trauma patients and shown that CA5 ≤35 mm has a detection rate of 77% for ATC (PTr >1·2) (Davenport et al, 2011b). Patients with CA5 ≤35 mm were significantly more likely to receive PRBC (46% vs. 17%) and FFP (37% vs. 11%) transfusions and, at 5 min, this threshold identified patients who would require MT (detection rate 71%, vs. 43% for PTr >1·2). Similarly, Leemann et al (2010) demonstrated the prognostic value of MCF: as a single variable predicting MT, it had an area under the Receiver Operating Characteristic curve of 0·824. Near-patient tests such as ROTEM and TEG may be able to identify patients with ATC, predict the need for MT and allow early activation of an MHP. The speed of analysis, PoC capability, global assessment of coagulation and ability to detect hyperfibrinolysis are all advantageous for the diagnosis of ATC. In the absence of information on platelet function, however, the predictive accuracy of VCT may be limited. Further prospective studies in trauma are required to clarify precisely which parameters and platelet function assays correlate best with transfusion requirements.
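The detection rates quoted above are sensitivities of a dichotomised viscoelastic parameter. The sketch below, using entirely synthetic patient data (not figures from the study), shows how such a rate is computed for a CA5 ≤35 mm threshold:

```python
def detection_rate(values, outcomes, threshold=35):
    """Sensitivity: proportion of positive outcomes flagged by value <= threshold."""
    positives = [v for v, outcome in zip(values, outcomes) if outcome]
    if not positives:
        return 0.0
    flagged = sum(1 for v in positives if v <= threshold)
    return flagged / len(positives)

# Synthetic CA5 values (mm) and whether each patient went on to require MT
ca5 = [30, 42, 33, 28, 40, 36, 25, 45]
mt  = [True, False, True, True, False, True, True, False]
print(detection_rate(ca5, mt))  # 0.8
```

Varying the threshold trades sensitivity against specificity, which is exactly the trade-off summarized by the AUROC figure quoted for MCF.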

Goal-directed therapy in trauma haemorrhage

A recent Health Technology Assessment recommended the intra- and post-operative use of TEG/ROTEM in liver transplant and cardiopulmonary surgery (Craig et al, 2008). A lack of robust data from RCTs prevented the authors from fully endorsing its use in other settings, such as trauma. Johansson and Stensballe (2009) have shown survival benefits from a TEG-guided transfusion algorithm in a mixed cohort of emergency patients with massive haemorrhage. This retrospective study compared 442 patients who had received >10 units of PRBC with a historical control group of MT recipients. The pre-emptive and immediate use of clotting products in tailored transfusion packs, given when massive haemorrhage occurred or upon arrival at the trauma centre, makes it difficult to ascertain the real impact of TEG in targeted transfusion.

Rugeri et al (2007) identified various cut-off values of early ROTEM-derived clot strength as good correlates of standard transfusion triggers (e.g. PT >1·5 × normal, APTT >1·5 × normal, fibrinogen <1 g/l). Threshold values for FFP and platelet transfusion were associated with positive predictive values <25% and were deemed unreliable guides to transfusion. The authors reported high negative predictive values, indicating that VCTs may be able to ‘rule out’ the need for MHPs and have a role in blood conservation strategies. Numerous centres have published ROTEM algorithms to guide transfusion therapy in trauma haemorrhage, but none has been formally validated. Some European centres have derived goal-directed treatment strategies aimed at normalizing ROTEM parameters with factor concentrate therapy alone (Fries et al, 2009; Theusinger et al, 2009), reporting significant reductions in the amount of blood components transfused, together with decreased blood loss and overall costs.

In summary, current measures of coagulation are of limited value in the management of post-traumatic coagulopathy. Extrapolation of results from other surgical disciplines and early studies in trauma reveals a potential role for VCTs in the rapid diagnosis of ATC during life-threatening traumatic haemorrhage. However, a lack of validation and standardization of sampling protocols has limited the widespread uptake of VCTs in trauma resuscitation. Anti-fibrinolytic therapy has been shown to have clear benefits in major trauma haemorrhage, and goal-directed transfusion using ROTEM/TEG appears feasible. Haemostatic control resuscitation with early high-dose FFP and other blood products is associated with improved outcomes, although the optimal ratio is currently unknown and definitive prospective studies are awaited. Current evidence suggests ROTEM and TEG are of benefit in managing trauma haemorrhage and, pending further validation, may in future replace conventional tests of haemostasis for the diagnosis and treatment of ATC.


Both RD and SK contributed to the writing of this paper.


The department has received support from Pentapharm GmbH (Munich, Germany) in the form of ROTEM reagents and equipment on an unrestricted basis. RD has received honoraria from Novo Nordisk, Bayer Healthcare and CSL Behring for lecturing.