History of lower limb reconstruction after trauma


  • M. Wagels BMedSci, MBBS; D. Rowe MBBS, FRACS; S. Senewiratne MBBS, FRACS; D. R. Theile MBBS, MS, FRACS.
  • Michael Wagels is RACS-CONROD Trauma Fellowship recipient 2010.
  • This paper was presented at the RACS Annual Scientific Congress 2010, Perth.
  • This study was generously supported by the 2010 RACS-CONROD Trauma Fellowship and the NHMRC.


Dr Michael Wagels, Department of Plastic and Reconstructive Surgery, Princess Alexandra Hospital, Ipswich Road, Woolloongabba, QLD 4102, Australia. Email: michaelwagels@hotmail.com



The principles guiding reconstruction of the lower limb after trauma have become established over 300 years through advances in technology and studies of epidemiology. This paper reviews how these principles came about and why they are important.


This is a structured review of historical and recent literature pertinent to lower limb reconstruction. The outcomes assessed in the pre-modern era were wound mortality, amputation mortality and amputation rate. In the modern era, infection and non-union emerged as measures of outcome, which are morbidity- rather than mortality-based. Indications for amputation published during the eras are taken to reflect the reconstructive practices of the time.


Amputation and wound mortality fell throughout the pre-modern era, from 70% and 20% to 1.8% and 1.8%, respectively. Amputation rates peaked in the American Civil War (53%) but have remained less than 20% since then. Infection and non-union rates in the modern era have fluctuated between 5% and 45%.


Priority areas for research include refinement of soft tissue reconstruction, injury classification, standardization of outcome measures and primary prevention. The impact of débridement and antisepsis on outcomes should not be forgotten as progress is made.


The reconstruction of lower limb trauma can be complex. To understand contemporary problems and target areas for progress, its history must be examined. Aldea and Shaw reviewed the subject up to the Korean War.[1] Much has changed since then, particularly fracture management, soft tissue reconstruction and nerve repair.[2-7] These elements as they relate to the reconstruction of lower limb trauma have not been collated to date and their origins are unexplored. This review aims to update the history of lower limb trauma reconstruction, to look for patterns of change and to define priorities for research.


A structured literature review was undertaken using the search terms history, lower limb, reconstruction, salvage, débridement, antisepsis, vascular, fracture fixation, soft tissue and nerve. MEDLINE, OLDMEDLINE, PubMed and Google were the search engines of choice. The modern era begins with the Vietnam War, after which the quality and volume of research in lower limb trauma increased considerably.

In the pre-modern era, outcome measures were amputation rate, mortality from wounding and mortality from lower limb amputation. These indices were consistently recorded throughout. Published indications for amputation inversely reflect the capacity for lower limb reconstruction. In the modern era, mortality from isolated lower limb trauma reached negligible levels and a shift from mortality- to morbidity-based outcomes evolved. The most morbid of these were infection and non-union.


Amputation and débridement

Amputation was the earliest form of lower limb reconstruction if recognized as a means to a well-fitted prosthesis. Among the pioneers of this philosophy was the French barber-surgeon Ambroise Paré in the 16th Century.[1] Despite common use, 200 years would pass before the first indications for amputation were published in Benjamin Bell's book ‘A System of Surgery’ (1796). Those indications specific to trauma were ‘bad’ compound fractures, extensive lacerations or contusions, and mortification. Bell recognized different indications for civilian and military trauma because of the decreased violence of injury and better access to care in the former; for civilian trauma, uncontrolled bleeding, extensive necrosis and infected non-union were also indications.[8]

Sage Sushruta described débridement around 3000 BC in India. This went unrecognized in the West until after the technique was discovered there independently.[9] Pierre-Joseph Desault was the chief surgeon of the Hôtel-Dieu Hospital in Paris during the French Revolution. He coined the term débridement and popularized its use in the late 18th century.[1] Dominique Jean Larrey, a student of Desault, recognized that the results of débridement were best if performed early and were limited by what could be tolerated without anaesthetic.[1]

Anaesthesia and antisepsis

In 1846, Robert Liston performed the first operation under ether anaesthesia. Fittingly, it was an amputation for tibial osteomyelitis. Ether was revolutionary because it made surgery safer, more efficacious and humane.[1] In 1864, Louis Pasteur showed that bacteria were the exogenous cause of infection.[10] Joseph Lister, a student of Liston and witness to his first amputation under ether anaesthesia, applied Pasteur's findings clinically by observing antiseptic principles with carbolic acid (phenol). In an 1867 Lancet publication, he reported a series of 11 compound tibial fractures without septic complications, which was unprecedented.[11]

Although it was unpopular in the broader scientific community, the Germans embraced antisepsis during the Franco–Prussian War (1870). Consequently, their wounds healed faster and with fewer complications. French soldiers injured towards the end of the War would surrender to the Prussians, knowing that the risk of morbidity and mortality in doing so was considerably lower than that of treatment by their own surgeons.[1] The antibiotic era was still 60 years away and, though important, antibiotics would improve outcomes by only a fraction of what antisepsis had achieved.

Fracture fixation

Louis Léopold Ollier described the closed plaster technique for managing infected compound tibial fractures in the Franco–Prussian War.[1] He did not embrace antiseptic principles. H. Winnett Orr re-discovered the technique and coupled it with antisepsis during World War I. With Alexis Carrel, he helped to create Carrel-Dakin solution, which superseded carbolic acid.[12, 13] During the Spanish Civil War (1936–1939), Josep Trueta treated 1073 compound tibial fractures with nothing more than débridement, antisepsis and plaster immobilization. He reported mortality and morbidity of 0.5% and 8.5%, respectively; results that compare favourably with the modern era.[1]

Remarkably, attempts at fracture instrumentation pre-date the closed plaster technique. Joseph-François Malgaigne started work on the first external fixator in 1840, although it was not until 1942 that a reliable device evolved. Still used today, it bears the name of its inventor Raoul Hoffmann.[7] Danis and Müller developed internal fixation together in the 1950s after an earlier period of haphazard innovation. Müller, with Weber, Willenegger and Allgöwer, then went on to found The Association for the Study of Internal Fixation (The AO) in 1958. This organization has coordinated the development of internal fixation devices ever since.[14]

Vascular repair and reconstruction

The mastery of antisepsis paved the way for complicated reconstructive surgery. John B. Murphy successfully restored arterial circulation following resection of a traumatic femoral aneurysm in 1896.[15] Alexis Carrel recorded the earliest descriptions of vascular anastomosis in 1902. He was inspired by the unsuccessful attempt to repair the portal vein of French President Sadi Carnot, assassinated by stabbing in 1894.[16] For his work, Carrel won the Nobel Prize for Medicine in 1912. Segmental vascular defects remained problematic until Erich Lexer successfully used a saphenous vein graft in 1907.[17, 18] Arthur Voorhees recognized that finding a donor vein for interposition grafting was not always possible – particularly in trauma – and used the first vascular prosthesis in 1952.[19]

Outcomes of the pre-modern era

Understanding the innovations of the pre-modern era helps explain how the indications for amputation evolved. The first change occurred 16 years after Bell first published them. In a report by Dominique Jean Larrey, extensive necrosis and contusion were removed as indications because they could now be débrided. A retained foreign body and articular trauma became new indications: the former probably because the extent of débridement was limited without anaesthesia, the latter perhaps reflecting an early focus on functional outcomes.

Lockwood further updated the indications for amputation after World War I. Now, two compound fractures of the same bone and irreparable injury to major nerves or blood vessels were independent indications.[20] To these were added systemic shock by the Surgeon General's Office and the extent of soft tissue injury by Babcock.[21] Prophylactic amputations for open fractures were abandoned. These changes may reflect a decrease in infective complications as a result of antisepsis and more aggressive débridement afforded by anaesthesia. The last set of indications for the pre-modern era came from the Emergency War Surgery-NATO Handbook published jointly by French, American and British surgeons in 1958. Shock and sepsis were removed because they were now deemed treatable with blood products, vascular reconstruction and antibiotics. The handbook declared amputation as ‘surgical defeat’ and recommended it be delayed as long as possible for the patient's state of mind.

As the morbidity and mortality of amputation fell, it became the reconstructive mainstay in limb trauma. During the American Civil War, more limbs were lost than in any other American conflict before or since (Fig. 1), but mortality improved. In the Union Army of 1865, amputation mortality averaged 40% and wound mortality was 13.3%; a considerable improvement on the Crimean experience of 1856 (62% and 20%, respectively).[1] Sepsis remained a problem and suppuration was still regarded as a normal part of wound healing.[22]

Figure 1.

Pre-modern era outcomes.

By the First World War (1914–1918), antisepsis was practiced widely and amputation mortality fell to 12.4%. Wound mortality too fell to 8%, prior to the advent of the antibiotic era.[1] Leading up to the Vietnam War (1953), ballistic technology continued to improve, but surgeons now understood the basic principles of managing lower limb trauma, and that they held true regardless of the violence of injury. With the transition to the modern era, the driving force for progress shifted from war surgery to civilian trauma. This offered a more consistent flow of cases, such that innovations could occur independent of periods of war.

Reconstruction of bone

The first bone graft was a dog-to-human xenograft performed in 1668 by Job Janszoon van Meekeren. The procedure was promptly declared by the influential Christian community to be contrary to the will of God, and it was abandoned. In 1861, Ollier demonstrated experimentally that bone fragments could survive as a graft,[6, 23] although the first human autograft had been performed before this by von Walter (1820). William Macewen performed the first human allograft of bone in 1880. However, bone transfers remained unpopular well into the 20th century. Religion was not entirely to blame. A lack of understanding of transplant biology produced spectacular failures, including one by a surgeon named Phelps. He transferred a pedicled bone flap from a dog into the tibial defect of a boy. Animal and child remained connected by the pedicle for 2 weeks. The failed flap was removed 5 weeks later.[6]

Bone grafting finally achieved popularity after F. H. Albee published a book on the subject in 1915.[6] With increased use came recognition of its limitations, including long-segment defects of the lower limb. The void was filled when the evolution of bony reconstruction and microsurgery crossed paths in 1975, courtesy of Taylor.[24] Autotransplantation of vascularized bone was an improvement on grafting but involved functional downgrade of the donor limb by virtue of leg weakness, ankle instability and great toe contracture.[25] This problem was obviated with the introduction of distraction osteogenesis.

The procedure was first described by Alessandro Codivilla in 1905[26] but was plagued by treatment failures. Ilizarov overcame these by accident in 1943. His device was originally designed to treat hypertrophic non-union with compression osteosynthesis. One day, the nuts on the device were turned the wrong way, such that the fracture was distracted. New bone was seen on radiograph between the distracted bone ends.[26, 27] The device was patented in 1951 but was prevented from reaching the West by the Iron Curtain until the early 1980s. The Italian explorer Carlo Mauri had suffered a chronically infected tibial non-union refractory to treatment in his home country; Ilizarov cured him of the ailment when Mauri travelled east.[27] Distraction osteogenesis was a revolution because it provided stable fixation and autogenous bone without donor site morbidity.

Soft tissue reconstruction

Soft tissue reconstruction emerged in 1854 when Frank Hastings Hamilton performed the first cross-leg flap. The original case report spares no detail, including the need to supplement the anorexic patient's diet with beer. Also included was the first description of surgical flap delay.[28-30] Split skin grafting as a reconstruction in its own right followed 15 years later. Although probably first described by Sushruta, it was Hamilton again who pioneered skin grafting on the American side of the Atlantic; first on the secondary defect and subsequently on the chronic leg ulcers that the cross-leg flap had been designed to treat in the first place. Both reigned as the mainstays of soft tissue reconstruction for over 100 years.

This changed when Jacobson and Suarez used an operating microscope on an animal microvascular model in 1960.[3, 31] Microsurgery was first applied clinically in upper limb replantation. Thirteen years after the introduction of the microscope to surgery, Daniel and Taylor reported the first free tissue transfer. They transplanted a groin flap onto a compound distal tibial fracture.[32] Muscle was transferred as a free flap 3 years later by Harii.[3] Through Taylor's angiosome concept, tissue transfer options expanded. Microsurgery flourished and its impact extended beyond free tissue transfer.

Reconstruction of nerve

The first descriptions of nerve repair appeared in the 16th century.[33] Poor outcomes dissuaded surgeons from attempting the procedure until well into the 19th century. In 1873, Hueter reported epineural repair of a transected nerve. Fascicular repair was described in 1917 by Langley and Hashimoto, before the technology necessary to perform the procedure existed.[4] Segmental nerve loss was first addressed by Phillippeaux and Vulpian in a dog model in 1870. The first autograft in humans was performed 8 years later as a single nerve trunk, sacrificing a large nerve elsewhere. Inspired by fascicular repair, Bunnell and Boyes introduced cable grafting in 1939, wherein several strands of sacrificable donor nerve were used to reconstruct fascicles. Sir Sydney Sunderland studied the topography of upper limb nerves in 1945, but it would take until 1972 for topographic cable-grafted nerve reconstruction to emerge through Millesi.[4] Nerve grafting has changed little since then.[34]

It was hoped that vascularized nerve transfer, introduced by Strange in 1947, would improve results. Prior to the microsurgical era, this technique was limited by donor nerve options.[4] Taylor and Ham introduced free vascularized nerve transfer in 1976, but the accumulated evidence to date does not favour the use of vascularized nerve other than for large nerve gaps.[35] Most subsequent nerve reconstruction research has focused on sources of donor nerve, non-nerve conduits and nerve regeneration biology. Results of nerve reconstruction in the lower limb remain mixed and are usually contingent upon factors beyond the surgeon's control.[36] Terzis et al. opined that ‘at any time a landmark development in the foundation of knowledge may forever alter the approach to nerve repair’.[4] An insensate foot remains a compelling indication for lower limb amputation in the setting of trauma.[37]

Outcomes of the modern era

Wound and amputation mortality became so low in the modern era that different outcome measures were needed. In 1976, Gustilo and Anderson began grading compound tibial fractures according to infective complications.[38, 39] Byrd et al. proposed an alternative, less popular classification based on wound biology the following year and introduced the idea that early reconstruction could generate better infective and union outcomes.[40] Godina famously reiterated this point 1 year later.[41] An evolving attitude of reconstructive urgency in lower limb trauma began to conflict with the time-honoured tenet of adequate débridement. In homage to the latter, Yaremchuk et al. emphasized the importance of wound stabilization with serial débridement prior to reconstruction, regardless of the passage of time.[42]

Because a salvaged limb did not always function as well as a prosthesis, function became increasingly recognized as an independent outcome. Accordingly, in the modern era the indications for amputation were replaced by Lange with a list of ‘limb salvage decision-making variables’.[37] In 1985, authors such as Gregory et al. tried to distil this complicated list into a scoring system that could predict poor function in a reconstructed lower limb,[43] an approach that was fraught with problems. There was, and remains, no consensus on how to measure function, and scoring systems could not be validated outside their study populations. Finally, in 1993, Bonanni et al. concluded that scoring systems could not reliably determine the need for amputation after lower limb trauma.[44, 45]

Rehabilitation medicine became a specialty in its own right in New York in 1950 and quickly exerted influence on the decision to amputate.[1] In 1978, Friedmann noted that prolonged attempts at reconstruction were deleterious to the amputee.[46] With this, amputation shook off the mantle of ‘treatment failure’. By 2007, the first meta-analysis of long-term outcomes was published. It found little functional difference between amputees and reconstructed patients, concluding that both treatments had a role to play. Indeed, the cost benefit of amputation early in the course of treatment becomes negligible with rehabilitation and the passage of time, particularly in the elderly.[47]

Figure 2 summarizes the modern era outcomes. Between 1969 and 1979, Gustilo reported declining infection rates in three series of grade III tibial fractures;[38, 48, 49] perhaps a consequence of free soft tissue transfer. Between 1987 and 2007, another three studies were published by different authors.[42, 50, 51] The trend changed to an increase in infections, which is difficult to rationalize. It may represent the emergence of antibiotic-resistant organisms, or advances in motoring technology producing more severe injuries. Between these two groups of studies, the infection rate fell, which may have resulted from the influence of Godina on the timing of soft tissue cover.[41]

Figure 2.

Non-union and infection over time.

Bony union is a function of the fracture, the soft tissues[52] and treatment complications. In 1980, Enneking et al. used bone graft for long-segment tibial defects with a non-union rate of 32%.[53] In 1983, Wood used vascularized bone on the same population and observed a non-union rate of 9%.[5] The non-union rate fell again in 1987, 2 years after the introduction of distraction osteogenesis to the West. The incidence of non-union as a complication appears to have been increasing since then; perhaps a result of attempts at bony reconstruction in increasingly complicated injuries. Interpretation of modern outcome data is confounded by multiple variables, with complicated relationships that cannot be controlled.


Basic life- and limb-preserving surgical techniques evolved in the pre-modern era. Specific tissue reconstructions followed in the modern era. Epidemiological studies emerged more recently and dictated the timing and aggression with which principles were applied. There are several innovations that, with the benefit of hindsight, can be regarded as the pillars of lower limb reconstruction after trauma. They are débridement, antisepsis, reconstruction of the vasculature, bone, soft tissue and nerve, and various adjuncts (anaesthesia, transfusion and rehabilitation). History has shown that credit for a discovery is not always bestowed upon the discoverer. This may seem unfair, but discoveries must not merely be made; they must be introduced in a conducive environment. The skill of popularization is at least as valuable as the discovery itself.

Refinement of soft tissue reconstruction is emerging as an area of interest with potential to improve outcomes of lower limb trauma management. Contemporary debate centres on the choice between muscle and fasciocutaneous flaps.[54] One consideration is the perceived inability of a muscle flap to develop perfusion independent of the anatomical vascular pedicle,[55] which has implications for revision surgery.[56] Also, reconstruction of nerve awaits a major revolution.[4] Reliable restoration of sensation to the foot is likely to alter the contemporary indications for amputation. Accurate injury classification and robust measures of long-term outcome would be easier to achieve in the first instance. The former must precede the latter because it is difficult to properly analyse outcomes without accurate stratification for severity of injury. The utility of the Gustilo classification has been shown to be limited in this regard.[57]


Our reconstructive armamentarium, developed over 300 years, is impressive. Historically, progress was driven by conflict. The increasing violence of civilian trauma in the modern era has allowed progress in the absence of conflict. Through a review of this progress, priority areas for research can be defined and should include refinement of soft tissue reconstruction, injury classification, standardization of outcome measures and primary prevention. In the meantime, the pillars that have elevated the practice of lower limb reconstruction to its current position should not be forgotten. This is particularly true for débridement and antisepsis; techniques whose impact to date has been, and is likely to remain, greater than that of any other.