Prescribed burning is an important but often controversial fire-management tool in fire-prone regions of the world. Here, we explore the complex challenges of prescribing fire for multiple objectives in the eucalypt forests of southwestern Australia, which could be regarded as a model for temperate landscapes elsewhere. Prescribed fire has been used in a coordinated manner to manage fuels in Australia's eucalypt forests since the 1950s and continues to be an important tool for mitigating the impacts of unplanned wildfires on human society and on a broad range of ecosystem services. Prescribed fire is increasingly being used to manage fire regimes at the local and landscape scales to achieve biodiversity outcomes through maintenance of spatial and temporal patterns of post-fire seral stages. The prescribed burning program in southwestern eucalypt forests has been informed by a long-term program of applied research into fire behavior and fire ecology. To remain successful in the future, the prescribed burning program in this region will need to adapt to changing expectations of government and the community, emerging land-use issues, resource limitations, and a drying climate.
Fire management in biodiverse and densely populated southern Australia is a highly contested issue, sharing many features with scientific and public debates in other fire-prone regions of the world (Keeley et al. 2012; Fernandes et al. 2013; Ryan et al. 2013). In southern Australia, fire is an inevitable, natural, and vital element of the environment (Pyne 1998, 2006), and thus fire management is integral to sustainable ecosystem management (Burrows 2008; Bradstock et al. 2012). Many Australians choose to live in fire-prone, temperate regions, leading to conflict between public safety and ecological management objectives, and requiring politicians and land managers to strike a balance between ecological, economic, and social values.
In a nutshell:
Forest landscapes in southwestern Australia are prone to wildfires
It is important to manage fire to meet multiple objectives, including the protection of human society and the conservation of biodiversity
Prescribed fire is an effective tool for managing fuels to reduce the impact of unplanned fires on human communities as well as on vegetation, soils, and ecosystem services
Management of prescribed fire for multiple objectives requires successful integration of scientific knowledge and practical experience, underpinned by an organizational commitment to adaptive management
Recent major wildfire events and subsequent parliamentary, judicial, or coroner's inquiries in the states of Victoria (Teague et al. 2010) and Western Australia (Keelty 2011) have highlighted community protection as the primary goal of fire management in populated agricultural and forested landscapes of southern Australia. Fire managers are continually faced with the challenge of meeting expectations for community protection while simultaneously conserving biodiversity and preserving the natural environment. The relative importance placed on these conflicting objectives by the community and by fire managers varies somewhat with time since the previous bushfire disaster, so that fire regimes appropriate for meeting multiple objectives continue to be debated (Pyne 1998, 2006; Bradstock et al. 2012; Attiwill and Adams 2013).
Here, we explore these complex issues through a detailed study of the management of fire in the forests of southwestern Australia, which form part of one of the world's biodiversity hotspots and are also home to more than 90% of the human population of the state of Western Australia. A relatively long history of prescribed burning to achieve multiple objectives, supported by applied research into fire behavior and ecology (Abbott and Burrows 2003; Burrows 2008), makes fire management in the southwest an exemplar for fire management in southern Australia. The Western Australian experience is also relevant to fire-prone, forested landscapes in temperate environments around the world.
Biophysical environment of southwestern Australian forests
Forested lands and associated ecosystems between the cities of Perth and Albany (latitude 32–35°S) extend over an area of about 2.5 million ha; these are predominantly public lands that are managed for conservation, sustainable timber production, and water catchment protection by agencies of the Western Australian government. Forest ecosystems occur primarily on undulating land surfaces and nutrient-poor soils derived from Precambrian granite and gneiss (coarse-grained rock, typically consisting of feldspar, quartz, and mica) substrates that have undergone prolonged leaching, erosion, and deposition (Wardell-Johnson and Horwitz 1996). Dry sclerophyll forests on uplands are dominated by jarrah (Eucalyptus marginata) and marri (Corymbia calophylla), averaging around 20–30 m in height (Figure 1). Wet sclerophyll forests dominated by karri (Eucalyptus diversicolor) may reach heights of up to 85 m at maturity on more fertile sites. Forest ecosystems are a major element of the Southwest Australian Floristic Region, which includes approximately 8000 plant taxa and exhibits a very high level of endemism (> 75%) (Yates et al. 2003; Hopper and Gioia 2004).
Southwestern Australia has a Mediterranean-type climate with cool moist winters and warm dry summers (Gentilli 1989). Western parts of the jarrah forest along the Darling Range escarpment receive mean annual rainfall amounts in excess of 1200 mm; this is due to orographic uplift of moist air masses associated with winter storms arising in the Indian Ocean. Areas within about 50 km of the south coast also receive more than 1000 mm of rainfall annually, but rainfall declines rapidly with increasing distance from the coast. The eastern margin of the forest corresponds broadly with the 600-mm rainfall isohyet, although this boundary is now defined artificially by the interface with cleared agricultural lands. Typical of Mediterranean-type environments, rainfall in this region is strongly seasonal, with more than 80% of annual rainfall recorded during the six consecutive wettest months (May–October), while mean monthly rainfall averages less than 25 mm during the three driest months (December–February). Consequently, vegetation is dry enough to burn for 6–8 months of the year. Maximum temperatures regularly exceed 35°C during the summer, while winter maximum temperatures are cool (<18°C) and nights may be frosty, occasionally dropping to –5°C. For much of the year, prevailing winds are generally easterly and of moderate strength (20–30 km h−1) with southerly sea breezes extending up to 50 km inland (McCaw and Hanstrum 2003). Destructive gale force winds and widespread bushfires have resulted from periodic incursion of decaying tropical cyclones below latitude 30°S, notably in 1937 and 1978. Lightning storms provide a potential source of ignition for bushfires throughout the southwestern forests between October and March in most years.
The climate of southwestern Australia has been in a persistent drying phase since the mid-1970s, with annual rainfall reduced by as much as 20%, primarily due to a decline in autumn and early winter rainfall (Bates et al. 2008). This trend has intensified and expanded over an increasing area since 2000 and is expected to continue in the future (Indian Ocean Climate Initiative 2012), with potentially severe consequences for ecosystem health and water resources in forested catchments (Kinal and Stoneman 2012). Declining autumn rainfall has also extended the length of the high-risk bushfire period into April and occasionally to the beginning of May, with the result that opportunities for safe and effective prescribed burning in autumn may be limited to only a handful of days in some years.
Prior to European colonization of the southwest of Australia in 1828, fires ignited by lightning and by the indigenous Noongar people maintained a patchwork of vegetation at different stages of post-fire development, from recently burned to long unburned (Hallam 1975; Abbott 2003). Burning by the Noongar occurred mainly during the dry summer months, from December to March (Abbott 2003), coinciding with the peak period of lightning ignition (McCaw and Read 2012). The preference of the Noongar to burn throughout the dry summer months indicates a profound understanding of fire behavior and ecosystem responses to fire, and suggests that the fuel age mosaic maintained by this burning limited the impact of intense bushfires that might otherwise have threatened the safety of the Noongar and the resources on which their communities depended (Gammage 2011).
Prescribed burning for fuel management
Stand flammability and bushfire behavior depend on the structure, amount, distribution, and dryness of the fuel. Fine fuels (<6 mm diameter) in eucalypt forests are composed predominantly of leaf litter, twigs, and bark, and can be categorized according to their vertical structure and composition (Gould et al. 2011). Bark on standing trees is also an important fuel component in eucalypt forests because it facilitates the vertical extension of flames into the forest canopy and provides a source of firebrands that can propagate spot fires ahead of the flame front, a process known as "spotting" (McCaw et al. 2012). The rate at which fuels accumulate following a fire event is determined by the density of the forest canopy and the composition of the understory shrub layer, which are strongly coupled to the amount of annual rainfall and site fertility (Burrows 1994; Sneeuwjagt and Peet 1998). Fine fuels accumulate for several decades after fire, by which time fuel loads may approach 20 metric tons ha−1 in jarrah forest and 40–50 metric tons ha−1 in karri forests that have a dense understory of woody shrubs. Rates of fine fuel accumulation may be modified by outbreaks of defoliating insects and drought that reduce canopy density, and by timber harvesting and silvicultural treatments that alter stand structure and fuel arrangement (McCaw 2011). Bark on standing trees and coarse woody fuels that are highly resistant to decomposition may continue to accumulate over much longer periods than is typical of fine fuels.
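The accumulation dynamics described above are often summarized with an Olson-type negative-exponential curve, in which fuel load approaches a steady-state maximum as litterfall and decomposition come into balance. The sketch below illustrates the shape of such a curve; the steady-state loads and rate constants are illustrative assumptions chosen to echo the figures quoted above, not measured values.

```python
import math

def fuel_load(t_years, x_max, k):
    """Olson-type negative-exponential fuel accumulation: the fine
    fuel load rises from zero after fire and approaches the
    steady-state load x_max at a rate governed by the constant k."""
    return x_max * (1.0 - math.exp(-k * t_years))

# Illustrative (assumed) parameters only: jarrah forest approaching
# ~20 t/ha and shrubby karri forest ~45 t/ha at steady state.
for t in (2, 5, 10, 25, 50):
    jarrah = fuel_load(t, x_max=20.0, k=0.12)
    karri = fuel_load(t, x_max=45.0, k=0.10)
    print(f"{t:>3} yr after fire: jarrah {jarrah:5.1f} t/ha, karri {karri:5.1f} t/ha")
```

A curve of this shape is consistent with the observation that fuel reduction loses most of its effect within roughly 6 years, since the early part of the curve is where loads rebuild fastest.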
Prescribed fire has been used extensively to manage fuels in southwestern forests since the 1960s (see Figure 2; see also Panel 1). Reducing fuel loads and altering fuel structure mitigate key aspects of wildfire behavior, including the rate of spread, flame dimensions, spotting, and fireline intensity (Byram 1959; McCaw et al. 2012). The contribution of prescribed fire to mitigating the effects of extensive, high-intensity fires can be quantified in a variety of ways, using basic combustion science, well-documented case studies, analysis of fire statistics, and computer simulations (Underwood et al. 1985; Fernandes and Botelho 2003; Cheney 2010). Fuel reduction can improve the safety, efficiency, and effectiveness of fire suppression, although these effects may be subtle and difficult to quantify for fires burning during severe weather conditions and in eucalypt forest fuels older than about 5 years (McCaw 2013).
Panel 1. Origins of prescribed fire in forest management
Systematic protection of forests against fire in southwestern Australia began with the formation of the Western Australian Forests Department in 1918. Up to that time, intense bushfires often followed in the wake of unregulated exploitation of forests for timber, seriously damaging extensive tracts of forest regenerated after logging (Kessell 1920; Burrows et al. 1995). Colonial foresters were trained in Europe and viewed fire as a threat to the forest, depleting soil nutrients and organic matter and slowing the growth of trees (Kessell 1920). Foresters argued that fire prevention and suppression were entirely possible, while practitioners experienced in local conditions advocated a return to the Aboriginal practice of frequent "light" burning of the forest. By the late 1920s, fire policy and practice sought to exclude fire from young regrowth forests by installing firebreaks and burning narrow strips between protected compartments, with limited prescribed burning carried out in older regrowth and uncut forests. Extensive and damaging wildfires in 1949 and 1950 led to the realization that excluding fire from the forest ecosystem was impractical and unsustainable over the longer term (Wallace 1965). In 1954, broad-area prescribed burning to reduce fuel loads was endorsed as a fundamental component of fire management. However, limited resources and a rudimentary understanding of weather and fire behavior proved a daunting challenge to implementation. In the summer of 1961, multiple wildfire outbreaks resulting from dry lightning storms burned approximately 150 000 ha of forest under severe fire weather conditions. Although the area treated with prescribed fire was limited in extent, the reduction in fire intensity and damage to the forest was clearly apparent (McArthur 1962).
A Royal Commission inquiry into the fires recommended that the Forests Department make every endeavor to improve and extend the practice of controlled burning to ensure that the forests receive the maximum protection practicable consistent with silvicultural requirements (Rodger 1961). The Forests Department rapidly expanded its prescribed burning program, supported by research to develop reliable fire-behavior guides and aerial ignition techniques that enabled large areas of forest to be ignited in a day, taking maximum advantage of suitable weather conditions required for low-intensity, cost-effective, and low-risk prescribed burning (McCaw et al. 2003). The 1960s also saw the beginnings of an understanding of fire ecology, which expanded in scope and became integrated into fire-management practice in subsequent decades.
At the regional scale, application of prescribed burning over broad areas of southwestern forests since the 1960s has reduced the area burned by wildfire. Boer et al. (2009) showed a strong inverse relationship between the extent of prescribed burning and unplanned fire in a 0.93 million ha forested region of the southwest over a period of 45 years, during which the fraction of the study area burned annually by prescribed fire varied from 4% to 11%. Prescribed burning reduced the mean number, extent, and frequency–size distribution of unplanned fires; furthermore, over the period of the study, the length of time that sites remained unburned by wildfire doubled to approximately 9 years. Fuel reduction had a detectable effect on the incidence and extent of unplanned fires for up to 6 years after prescribed burning, consistent with scientific knowledge of fuel dynamics and field observations of the contribution of fuel-reduced areas to fire suppression (Burrows 1994; Cheney 2010; Gould et al. 2011). The extent to which fuels older than 6 years were spatially connected had a significant effect on the annual extent of wildfire (Boer et al. 2009).
An inverse relationship between the extent of prescribed burning and unplanned fire is also evident at the whole-of-forest scale over six decades (Figure 2). Over the period 1951 to 2012, the average annual area burned by prescribed fire and unplanned fire was ~291 000 ha (~11.6% of the forest region) and ~7140 ha (~0.3%), respectively. From 1962 to 1990, the proportion of the region burned by prescribed fire each year ranged from 7.6% to 18.1%, with an annual average of 12.5%. Over the same period, the proportion of forest burned each year by wildfire ranged from 0.1% to 1.1%, with an annual average of 0.3%. Since the 1990s, the area burned intentionally through the use of prescribed fire has been reduced, leading to longer intervals between fires and increased mean fuel ages (Figures 2 and 4). From 1991 to 2012, the proportion of the region burned annually by prescribed fire ranged from 4.1% to 9.2%, averaging 6.6%, and this has been accompanied by an increase in the annual extent of unplanned fire, which ranged from 0.2% to 4.7% and averaged 1.1%. Individual forest fires >20 000 ha were uncommon in the 1970s and 1980s but have occurred every second or third fire season since 1997.
Current fire-management policy has a prescribed burning target of 200 000 ha per year for southwestern forests, representing ~8% of the public forest estate. Based on the finding of Boer et al. (2009) that each unit area reduction in unplanned fire required about four units of prescribed fire, a prescribed burning program of this scale would be expected to reduce the area of unplanned fire by about 50 000 ha. A reduction of this scale is substantial, both in terms of the area burned by unplanned fire and the expected impact of fire, given the potential of intense summer wildfires to damage human communities and ecosystem services.
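The arithmetic behind this expectation can be made explicit. The sketch below simply applies the ~4:1 leverage ratio reported by Boer et al. (2009), assuming it scales linearly to the program level; that linearity is an assumption for illustration, since leverage may vary with fuel connectivity and fire weather.

```python
def expected_wildfire_reduction(prescribed_ha, leverage=4.0):
    """Estimated annual reduction in the area of unplanned fire,
    assuming each unit-area reduction in wildfire requires about
    `leverage` units of prescribed burning (Boer et al. 2009)."""
    return prescribed_ha / leverage

target_ha = 200_000  # current annual prescribed burning target
print(expected_wildfire_reduction(target_ha))  # 50000.0 ha
```

Under this simple linear model, halving the burn program to 100 000 ha would halve the expected wildfire reduction to about 25 000 ha.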
Prescribed burning for biodiversity conservation
While the use of prescribed burning to mitigate wildfire risk focuses on managing the accumulation of fuel, application of fire for biodiversity conservation focuses on managing components of the fire regime considered important for maintenance of ecosystems and selected species. These components include the interval between fires together with seasonality, intensity, scale, and patchiness of burning. Fire management for biodiversity conservation outcomes is guided by biodiversity conservation objectives operating at a range of spatial and temporal scales (Burrows unpublished). These objectives have a foundation in ecological theory and are based on knowledge gained through experiments, retrospective studies, and monitoring in the forest ecosystems of the southwest. Responses to fire have been documented for many species of flora and fauna – including threatened taxa – in these forests and associated ecosystems (Friend and Wayne 2003; van Heurck and Abbott 2003; Burrows 2008; Wittkuhn et al. 2011; Pekin et al. 2012; Burrows 2013).
Populations of many plants and animals require sufficient time between successive fires to attain reproductive maturity (Whelan et al. 2002). Plants that are obligate seeders are vulnerable when intense fires recur at short intervals because seed banks cannot be replenished. Conversely, too long an interval between fires may eliminate plant taxa that rely on fire for reproduction, particularly obligate seeders that store seed in woody capsules (serotinous taxa) and lack the capacity to maintain a store of viable seed once the parent plant dies and seed is released (Whelan et al. 2002; Keith 2012). Burrows et al. (2008) compiled a database of the post-fire regeneration requirements of some 700 species of vascular plants, representing about one-third of the known flora found in southwestern forests. From this database it was determined that approximately 97% of understory species reach flowering age within 3 years of the occurrence of fire and all species reach flowering age within 5–6 years of a fire; about 3% of species are fire-sensitive obligate seeders that have primary juvenile periods longer than 3 years. Most of these latter taxa inhabit areas that are less prone to fire because they remain moist for a longer period each summer or because surface fuels tend to be sparse and discontinuous due to low site productivity or extensive rock outcropping (Burrows 2013).
Fauna with low fecundity and poor dispersal abilities are vulnerable to large-scale, high-intensity fires that occur at short intervals (Friend and Wayne 2003). Threatened mammals and birds have attracted considerable attention in terms of species conservation status and fire ecology research, including life history studies. With small- and medium-sized mammals (< 5500 g) in particular, there is considerable evidence that loss of habitat due to land clearing for agriculture and predation by introduced species such as the red fox (Vulpes vulpes) are important causes of population declines (Christensen 1980; Kinnear et al. 1988). However, there are threatened species that have special habitat requirements with respect to post-fire seral (ie developmental) stages (Panel 2). These include mammals, birds, and amphibians that depend on fires being infrequent, as well as on older and more complex mosaics of seral stages, ranging from recently burned to unburned for many decades (Friend and Wayne 2003; Burbidge et al. 2005). Knowledge of the life histories of select threatened fauna – especially fauna that are recognized as fire-sensitive or exhibit specific fire-regime habitat requirements – can be used to understand species' responses to fire and to plan fire regimes that will best support their conservation (Burrows and Friend 1998; Friend and Wayne 2003).
Panel 2. Managing quokka habitat using prescribed fire
The quokka (Setonix brachyurus; Figure 3) is a small marsupial endemic to the southwestern part of Western Australia. While a large population persists on Rottnest Island, there is evidence that the mainland population has declined since European settlement, especially in the northern jarrah forest (Hayward et al. 2004). The species is declared threatened under Western Australia's Wildlife Conservation Act 1950.
The conservation status of the quokka requires that particular attention be paid to protecting extant populations and managing habitat. Controlling introduced predators, especially the red fox, and the judicious use of fire are fundamental to quokka conservation. Mainland quokkas inhabit more mesic parts of the landscape, such as swamps and creeks that support dense vegetation. Fire plays an important role in protecting and maintaining quokka habitat but inappropriate fire regimes, including intense wildfires, can threaten their populations.
Fire management for quokka conservation is based on using prescribed burns either to protect healthy habitat and populations from harmful wildfires or to regenerate senescent habitat that is no longer occupied by quokkas. Habitat protection burns are carried out in spring, when upland forests are sufficiently dry to burn but when creeks and swamps – that is, quokka habitat – are too moist to burn. Burning the more flammable parts of the landscape extends some protection to quokkas and their habitat against summer wildfires. Creek-line vegetation that has died back and collapsed, usually by about 20–25 years after fire, is unsuitable habitat for quokkas. Habitat regeneration burns are then carried out in autumn, when the entire landscape, including creek systems, is dry enough to burn. The vegetation regenerates vigorously and within a few years it is again suitable for quokkas. For the next 25 years or so, mild spring burns are implemented every 5–7 years to protect riparian and other mesic ecosystems embedded in the forest. These ecosystems provide habitat for a variety of fire-sensitive organisms, including the quokka. Desired fire intensity outcomes can be achieved within a prescribed burn unit by manipulating burning conditions and the pattern of ignition (Figure 4).
To properly interpret life history attributes of plants and animals as a guide to fire management for biodiversity conservation objectives, it is necessary to understand fire regime characteristics. For example, experimental studies on small plots have shown that repeated burning at 3–4-year intervals over a 30-year period has not reduced species richness but has substantially reduced the abundance of some obligate seeding shrubs in jarrah forests (Burrows and Wardell-Johnson 2003). This finding is consistent with other studies that have reported changes to floristic composition as a result of single or repeated short-period fire intervals in Australian ecosystems where obligate-seeders are prominent (Morrison et al. 1995; Bradstock et al. 1997; Watson et al. 2009; Russell-Smith et al. 2012). However, Wittkuhn et al. (2011) concluded that occasional short (3–5-year) intervals between fires were unlikely to have a persistent effect on plant community composition in jarrah forests and their associated shrublands, based on the findings of landscape-scale studies on fire interval sequences over a 32-year period. This apparent difference in results can be explained by the fact that experimental studies tend to impose a high degree of uniformity in burning treatments at local scales that is rarely encountered in fires at landscape scales.
The primary objectives of fire management for conserving biodiversity at the landscape scale are (1) to maintain a diverse representation of ecosystem seral states and habitat conditions and (2) to protect fire-sensitive and fire-independent ecosystems and niches, including riparian zones, aquatic ecosystems, and peat wetlands. Fire-sensitive ecosystems are characterized as: those containing obligate seeder plant species with long maturation periods; those that provide critical habitat for fauna with low fecundity, low dispersal capacity, and a preference for vegetation in post-fire seral stages older than the typical fire return interval of the surrounding landscape; communities that take decades or longer to recover to their pre-fire state, such as peat wetlands (Horwitz et al. 2003); and vegetation types that have a lower likelihood of burning because they either occupy mesic habitats or have sparse amounts of ground fuels.
Strategies to achieve these landscape-scale objectives include maintaining a mosaic of fire-management units (see below) within the landscape at different times since the last fire, including recently burned and long unburned units, and units burned in different seasons. Ideally, the mosaic should include three biologically important fire regime components: (1) time since last fire, (2) fire frequency, and (3) fire season. The geographic extent of these components that are ecologically sustainable can be determined based on knowledge of the region's fire ecology and the life history characteristics of taxa occurring within the landscape (Burrows and Friend 1998; Burrows 2008). Prescribed fire plays a critical role in managing fire regimes but unplanned fires are also important and may dominate the occurrence of fire in areas subject to very high rates of deliberate ignition (such as the forest zone adjacent to the city of Perth) and in remote areas, where poor access limits the opportunity for rapid initial responses to lightning ignitions (Plucinski et al. in review).
Fire regimes in forests in southwestern Australia have been characterized in a variety of ways, including from maps showing the spatial pattern of time-since-fire (Hamilton et al. 2009) and by using landscape metrics derived from spatial statistics (Faivre et al. 2011). Recognizing patterns in the landscape is important for biodiversity conservation (Forman 1995; Wardell-Johnson and Horwitz 1996). Operational planning for fire and other management activities is undertaken at the scale of the Landscape Conservation Unit (LCU) (Figure 5; Mattiske and Havel 2002). Climate, landforms, soil types, assemblages of local flora and fauna, and disturbance regimes are similar within each LCU, which range between 10³ and 10⁵ ha in size. Periodic reporting of the distribution of time-since-fire provides a meaningful way of measuring progress towards both biodiversity conservation and fuel management objectives at the scale of the LCU (Figure 6), and has been adopted as a reporting protocol for a key performance indicator in Western Australia's Forest Management Plan 2004–13 (Conservation Commission of Western Australia 2004). The example in Figure 6 shows how the proportion of older fuels in the Central Jarrah LCU has increased since 2004, leading to greater potential for large wildfires. The theoretical distribution of fuel age classes under a fire-management zoning strategy, assuming an annual prescribed burning program of ~200 000 ha, is shown in Figure 7.
Each LCU is further subdivided into smaller fire-management units that are bounded by roads, tracks, or natural fuel breaks (such as extensive sand dunes) to limit fire size. Fire-management units typically vary in size from 10² to 10³ ha and include a variety of landforms, ecosystems, and vegetation complexes (Mattiske and Havel 1998) representative of the LCU in which they occur. Management objectives for individual units include using prescribed fire to maintain a variety of habitats, seral states, and vegetation structures through time (using patchy burning), and to protect fire-sensitive and fire-independent ecosystems and niches within the unit from frequent fire and large, high-intensity wildfires. Strategies used to achieve these objectives include:
Varying the season, frequency, and interval of fire application to a unit within fire regime bounds based on knowledge of fire responses and life history characteristics of key fire regime indicator taxa (Burrows and Abbott 2003; Burrows 2008).
Implementing mostly patchy burns within the unit to maintain a mosaic of time-since-fire. Small-grained, fire-induced mosaics can be created in several ways, including (but not limited to) introducing fire into the landscape where fuel flammability differentials exist as a result of variability in fuel moisture or continuity. This will also serve to protect fire-sensitive and fire-independent ecosystems, which are usually found in less flammable parts of the landscape (Burrows et al. 2008).
Burning flammable, drier, and fire-resilient habitats at intervals ranging from frequent to infrequent, depending on life history characteristics of key taxa (Burrows and Friend 1998; Burrows 2008).
Burning less flammable habitats (eg riparian zones, some swamps, valley floors, granite outcrops) less frequently and by exploiting flammability differentials that exist across the landscape in different seasons, based on life history characteristics of key taxa (Burrows 2008).
Sporadically applying moderate-intensity fires under dry conditions to promote regeneration of many obligate and facultative seeder species and their associated habitats (Keith et al. 2002; Burrows 2008).
Integrating community protection and biodiversity conservation imperatives
The management of fire in fire-prone landscapes requires that multiple objectives be met, including the protection of human communities, the environment, and biodiversity. Mitigating the impacts of unplanned fires is an essential prerequisite to managing fire for other outcomes. No single fire regime will achieve all management objectives but the experience of managing fire in southwestern Australia has shown that a sound understanding of fire ecology can provide the basis for fire regimes that achieve both biodiversity conservation and fuel management objectives. When implemented within an adaptive management framework, these regimes provide opportunities for continuous learning and better fire-management outcomes.
An important strategic issue for fire and land managers is the extent to which the perceived wildfire threat to humans, which is highly variable in both space and time, overrides biodiversity conservation objectives. The planning approach adopted in the southwest is to devise prescribed burning programs based on ecological principles, followed by a systematic risk analysis to determine the threat posed by these regimes to human communities and environmental values. The Wildfire Threat Analysis tool (Muller 1993) is a structured process for considering the threat from and response to wildfires. It provides a framework for analyzing available information on all factors contributing to the wildfire threat and allows for the evaluation of alternative responses. Fire management can then be modified where the risk or threat of wildfire, and potential damage, is deemed unacceptable.
Prescribing fire in a changing world
Implementing prescribed burning over the areal extent necessary to protect communities, the environment, and biodiversity is becoming increasingly challenging, and the annual target of burning 200 000 ha in the southwest has only been achieved twice since 2000 (Figure 2). This decline in prescribed burning has been driven by constraints on burning brought about by increased population size at the peri-urban interface, concerns from communities and some industries about poor air quality and smoke (Reisen et al. 2011), and by land-use changes, including greater fragmentation of forest areas due to bauxite mining and the establishment of even-aged regrowth stands following timber harvesting. The changing climate of southwestern Australia has also shifted the timing of, and reduced the opportunities for, undertaking prescribed burning at a level of risk acceptable to government and the community (Keelty 2012). Following recent bushfire inquiries and concerns raised by communities, prescribed burn planning, risk management, and decision making have become more complex, further impeding the implementation of prescribed burns.
Because of these constraints, it is unlikely that the current target of 200 000 ha will be reached in most years. Given this and the renewed emphasis on the protection of human communities, it is prudent to consider an alternative approach for deciding which areas are to be burned as a matter of priority. A zoning approach based on values at risk has been employed in the state of Victoria (Department of Sustainability and Environment 2012); here we present an example of similar zoning for risk management and resource allocation in southwest forests:
Zone 1: Community protection zone, where fuels are maintained at <4 years old, established within a 5-km radius of towns and other human settlements.
Zone 2: Bushfire modification zone, where fuels are mostly maintained at 5–7 years over a further 20-km radius to modify bushfire behavior, reduce damage, and increase likelihood of suppression. The fuel age class distribution would take the form of a negative exponential, with most of the landscape carrying young fuels (< 4 years old) but with some parts, such as rock outcrops and riparian zones, going unburned for longer periods.
Zone 3: Biodiversity management zone, where it is feasible to maintain a diversity of fire regimes and fuel ages (particularly through mosaic or patch burning) in situations where there is negligible wildfire risk to public safety and amenities. In this zone, about one-third of the area would carry young fuels (<4 years old), one-third intermediate-age fuels (4–7 years old), and the remaining third older fuels.
Figure 7 illustrates how zoning can be used to manage risk and to prioritize the use of opportunities and resources for prescribed burning. Under this scenario, a total of ~200 000 ha would need to be burned per year to achieve the fuel age objectives in each zone: ~50 000 ha in Zone 1, ~100 000 ha in Zone 2, and ~50 000 ha in Zone 3, producing the theoretical distribution of fuel age classes shown in Figure 7. If the 200 000-ha target cannot be achieved, the zoning framework identifies Zones 1 and 2 as priorities for fuel-reduction investment, affording the highest degree of community protection. Zoning is a controversial fire-management strategy but deserves proper evaluation as a risk-management approach when community protection is paramount and landscape-scale prescribed burn targets cannot be met.
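The negative-exponential fuel age distribution described for Zone 2 is what emerges when a roughly constant fraction of a zone is treated each year, since a patch's time-since-fire then follows a geometric (discrete negative-exponential) distribution. The sketch below illustrates this; the 25% annual treatment fraction is an assumption chosen for illustration (a notional 4-year rotation), not a management prescription.

```python
def fuel_age_distribution(annual_fraction, max_age):
    """Steady-state proportion of a zone in each time-since-fire
    class when a constant fraction is burned each year. Entry `a`
    is the share of the zone with fuels aged `a` years; the final
    entry is the residual share older than max_age years."""
    p = annual_fraction
    dist = [p * (1 - p) ** age for age in range(max_age)]
    dist.append((1 - p) ** max_age)  # long-unburned residual
    return dist

# Assumed 25% of the zone treated annually:
dist = fuel_age_distribution(0.25, max_age=7)
young = sum(dist[:4])  # proportion carrying fuels < 4 years old
print(f"proportion < 4 years old: {young:.2f}")  # ~0.68
```

The result matches the qualitative picture in Zone 2: most of the landscape carries young fuels, while a small residual (here, the fraction never reached by treatment) remains unburned for longer periods, as in rock outcrops and riparian zones.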
Looking to the future, fire management in a changing world will require a greater understanding of fuel dynamics, fire behavior, and ecosystem responses to fire in a warmer, drier climate. A commitment to science, practical experience, adaptive management, and flexible institutional responses provides the fundamentals for the ongoing development of prescribed fire management in the southwestern forests of Australia.
We thank J Russell-Smith, R Thornton, R Sneeuwjagt, R Armstrong, and T Howard for their valuable comments on a draft manuscript. We also thank G Daniel and M Porter for mapping.