The modelled effects of differing fire management strategies on the conifer Callitris verrucosa within semi-arid mallee vegetation in Australia


R. Bradstock, Policy and Science Division, New South Wales Department of Environment and Conservation, PO Box 1967, Hurstville, New South Wales 2220, Australia (fax +61 29585 6606; e-mail


  • 1. Callitris verrucosa is an obligate-seeder co-dominant within flammable semi-arid mallee shrublands in southern Australia. We sought to determine whether there is an optimal management solution that both significantly reduces wildfire size and maintains a viable population of C. verrucosa. Using a spatial model, we simulated the effects of alternative fire management strategies (pattern and level of prescribed burning) on populations of C. verrucosa.
  • 2. Plant dynamics and fire propagation were simulated in a gridded landscape model (10⁴ cells, each c. 1 ha) incorporating topographic variation typical of dune landscapes. Fire propagation was governed by fuel age, topography and weather, incorporating the potential for spot fires and the influence of wind direction. Plant dynamics were determined by known demographic parameters for C. verrucosa.
  • 3. The size of unplanned fires declined significantly as a function of increasing prescribed burning level. Mean fire interval was longer when 0% or 1% of the landscape was treated with prescribed fire each year than at higher levels of prescribed burning. Mean fire intervals were shorter under the highest probability of unplanned ignition, and were shorter in the flat landscape and on dune slopes than elsewhere.
  • 4. The responses of C. verrucosa populations reflected trends in fire intervals as affected by prescribed burning levels/patterns and topography. The lowest population sizes resulted from either high (20% per year) or zero prescribed burning. The highest population sizes occurred consistently at an intermediate level of prescribed ignition (5% per year). Population sizes were significantly larger in dune than in flat landscapes, and under random than under non-random prescribed ignition patterns.
  • 5. Synthesis and applications. The use of prescribed fire to achieve management objectives concerning wildfire control and conservation will involve trade-offs that are affected by landscape characteristics. Prescribed fire is predicted to achieve both a diminution of wildfire size and the maintenance of C. verrucosa populations in the landscape. The trade-off required to achieve these objectives concerns the appropriate level and pattern of prescribed fire (strategy).


The management of fire-prone landscapes for ecological objectives is controversial (Keeley & Fotheringham 2001; Whelan 2002; Moritz et al. 2004). While responses of some taxa and communities to fire regimes are reasonably well-known (Whelan 1995; Bond & van Wilgen 1996; Bradstock, Gill & Williams 2002), the application of such knowledge to management remains problematic. In part this is because of the difficulty in predicting the ecological outcomes of alternative fire management approaches at large spatial and temporal scales (Freckleton 2004). Insight into the effects of management actions on fire regimes, and responses of biota, remains limited.

In this study we used a spatial model to examine the effects of a range of management strategies within semi-arid shrublands typical of inland south-eastern Australia. We examined whether the prescribed burning strategies successfully achieved management objectives directed at biodiversity conservation. Specifically, we examined the effects of differing management strategies on characteristics of individual fires, fire regimes and the population dynamics of a prominent plant species, Callitris verrucosa (A. Cunn. ex Endl.) F. Muell.

In semi-arid mallee shrubland communities in southern Australia, C. verrucosa can codominate (as a shrub or small tree) with multistemmed eucalypts (Cheal, Day & Meredith 1979; Beadle 1981; Parsons 1994). It is a serotinous obligate seeder, retaining a seed bank within woody cones, and individuals are killed if their canopy is totally scorched, unlike the eucalypts, which resprout (Cheal, Day & Meredith 1979; Bradstock & Cohn 2002a,b). In general, such species are highly sensitive to variations in fire regimes because their seed bank is usually exhausted following disturbance (Lamont et al. 1991; Pausas et al. 2004). In these communities C. verrucosa is usually the only species exhibiting this life history (Bradstock & Cohn 2002a). As a potential overstorey codominant, population changes may have a strong effect on general community structure and composition. For example, a number of endangered bird species are found in mallee (Woinarski & Recher 1997) and, in particular, the malleefowl Leipoa ocellata uses relatively old, dense patches of C. verrucosa (Woinarski 1989a,b; Benshemesh 1990). As such, the status of C. verrucosa can act as an indicator of the consequences of differing management activities and resultant fire regimes.

The prevailing fire regime in semi-arid mallee landscapes typically consists of a 10–20-year cycle of often large fires (c. 10⁵ ha) resulting from lightning ignitions (Bradstock & Cohn 2002a). These fires follow years of above-average rainfall (200–500 mm mean annual rainfall; Noble & Vines 1993; Bradstock & Cohn 2002a), resulting in high fuel continuity through above-average growth of ephemeral forbs and grasses (Noble 1989; Noble & Vines 1993). For example, in central and south-western New South Wales recent major fires of this kind affecting mallee shrublands occurred in 1957–58, 1974–75 and 1984–85 (Noble & Vines 1993; Bradstock & Cohn 2002a,b), representing an average interval of 13 years. Bradstock & Cohn (2002a) concluded that while there is little evidence that the current ‘natural’ (i.e. lightning sourced) regime is detrimental to any particular group or taxon, there is ongoing concern about the effects of large fires on the long-unburnt habitat of endangered birds. They noted that there is much scope for change in mallee fire regimes by an increase in both unplanned and prescribed fires.

Changes to rates of ignition could have a variety of effects. Keane, Cary & Parsons (2003) showed that landscape fire interval was sensitive to changes in both ignition rate and fire size using simulation models of forest landscapes. Fire interval decreased with increasing ignition rate to a stable level, beyond which further increases in ignition rate had no effect. Heydon, Friar & Pianka (2000) showed that a large increase in rate of ignition (i.e. 150%) reduced the average size of fires and pixel age (time since last fire) in a spatial simulation model of arid Australian spinifex Triodia spp. landscapes. High ignition resulted in a higher proportion of younger age classes in the simulated landscape (Heydon, Friar & Pianka 2000). The consequences of these trends will depend on the life-history characteristics of resident species.

In mallee landscapes changes in land use and human activity may cause an increase in accidental or deliberate (e.g. arson) ignition rates. Thus variations in the rates of ignition in conditions likely to yield severe fire behaviour (wildfires) may reflect the outcome of such land-use changes and the subsequent consequences of any management attempts targeted at altering them.

Prescribed fire is commonly advocated as a means of controlling fuel levels, thereby limiting the incidence, spread, intensity and final size of subsequent unplanned fires (Fernandes & Botelho 2003). Increased levels of prescribed fire may therefore be beneficial to species that are perceived or known to be sensitive to the effects of high-intensity fires. Higher vertebrates that lack shelter and/or have limited dispersal ability may be disadvantaged by such fires. Species such as L. ocellata may fit into this category (Benshemesh 1990; Woinarski & Recher 1997), although data on survival of fires is limited or absent. On the other hand, an increase in ignition in general may result in a greater incidence of short-interval fires in the landscape, leading to localized decline and loss of species such as C. verrucosa that are sensitive to changes in the fire interval (Bradstock & Cohn 2002b).

Prescribed burning options in mallee landscapes are varied, with current levels being relatively low. There is scope to increase not only the overall level of prescribed burning but also to create differing spatial patterns of burning. The levels used in this study reflected this possible range. Options included the use of tracks and trails as ignition points, broad-scale multiple ignitions in either regular or random patterns, and the targeted use of prescribed fire to restrict the spread of wildfires into long-unburnt patches of vegetation.

The chief aim of this study was to see whether there are levels and spatial patterns of prescribed ignitions and rates of unplanned ignitions that produce an optimal solution for fire management in mallee landscapes. Such a solution would require a significant reduction in size of unplanned fires coincident with severe weather, with concurrent provision of a landscape-level fire interval distribution that maintained a viable population of C. verrucosa. Topographic variations, which affect the continuity of fuel elements and the resultant probability of fire propagation, were also considered (Bradstock & Cohn 2002a).

Materials and methods

A two-dimensional cellular model (CAFÉ; Bradstock et al. 1996, 1998) was developed to simulate effects of varied ignition levels/rates, spatial ignition patterns and topographic variations on populations of C. verrucosa. The model simulates the spread of fires using a flammability parameter governing the probability of propagation between neighbouring cells. This parameter can be adjusted to reflect the effects of fuel accumulation with time since fire, and of weather, on propagation. The model also simulates population processes of plants such as survival (during and between fires), fecundity and seed bank accumulation, seedling establishment and dispersal. The model is an occupancy type (Akçakaya & Sjögren-Gulve 2000) that represents changes to populations on the basis of cells occupied (percentage of landscape). Specific changes to the model used to represent the nature of fires in mallee landscapes and life-history features of C. verrucosa are described below.

A square landscape composed of 10⁴ square cells was used in all simulations. Notionally the cells represent an approximate scale of 1–10 ha. Fire and plant dynamics were simulated in alternative flat and dune landscapes. The latter consisted of elliptical patterns of parallel dunes, represented by cells in three classes (dune crest, dune slope and swale). A swale is the depression at the base of the slope. The relative width and spacing of dunes as represented by the arrangement of these differing categories of cells was similar to that found in the field in mallee landscapes (R. A. Bradstock, M. Bedward & J. S. Cohn, unpublished data).
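As an illustrative sketch only (not the implementation used in the study), the gridded landscape of topographic classes can be set up as follows. The class labels, band widths and dune period below are placeholder assumptions; the study matched dune width and spacing to field measurements.

```python
# Topographic classes; the labels are illustrative, not from the model code.
FLAT, SWALE, SLOPE, CREST = "flat", "swale", "slope", "crest"

def make_dune_landscape(n=100, dune_period=10):
    """Build an n x n grid of parallel dune bands (crest/slope/swale).
    dune_period (cells between successive crests) is a placeholder."""
    grid = []
    for row in range(n):
        phase = row % dune_period
        if phase == 0:
            cls = CREST                      # dune crest line
        elif phase in (1, dune_period - 1):
            cls = SLOPE                      # slopes flanking each crest
        else:
            cls = SWALE                      # depression between dunes
        grid.append([cls] * n)
    return grid

def make_flat_landscape(n=100):
    """Uniform flat landscape used as the topographic contrast."""
    return [[FLAT] * n for _ in range(n)]
```

A 100 × 100 grid of this kind gives the 10⁴ cells used in all simulations.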

Modelling of fire management strategies

Two non-random approaches to prescribed burning were contrasted with a random approach. The latter involved selection of ignition points at random and the subsequent spread of fire to occur up to a predetermined limit (Fig. 1a). The non-random approaches were based on the creation of two perpendicular lines of cells (one cell width) bisecting the modelled landscape, intended to represent access trails. The first approach (NRA) used the lines as an ignition point but allowed fires to spread freely from these lines (Fig. 1b). The second non-random approach (NRB) involved the creation of buffer strips containing low fuels by the concentration of prescribed burning in segments of contiguous cells within these lines (Fig. 1c).

Figure 1.

Differing spatial patterns of prescribed ignition: (a) random ignition source; (b) non-random (NRA) ignition source located on an internal grid (shaded cells) with fires allowed to spread randomly; (c) non-random (NRB) ignition source located on an internal grid with fires confined to cells within the grid.

A range of differing levels (percentage of landscape targeted for burning each year) was specified for each prescribed burning strategy (Table 1). Fires lit under both the random and the NRA strategy were allowed to spread to size limits specified for each level. In some cases restrictions imposed by flammability/terrain interactions extinguished these fires (see Modelling of fire spread).

Table 1.  Ranges of input variables used in landscape simulations of the effects of fire regimes on Callitris verrucosa: (a) chance of single ignition capable of burning 100% of landscape; (b, c) target burn area each year achieved by discrete ignitions, each 100 cells (1% of landscape); (d) area achieved by burning segments of 200 target cells contained within fixed buffer strips (see text). All burning targets were dependent on the availability of flammable cells
Fire type       Fire pattern   Annual burning rate*/level†
(a) Unplanned   –              0·05, 0·10, 0·20 ignitions*
(b) Prescribed  Random         0%, 1%, 5%, 10%, 20% landscape†
(c) Prescribed  NRA            0%, 1%, 5%, 10%, 20% landscape†
(d) Prescribed  NRB            0%, 1%, 2% landscape†
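The three prescribed ignition patterns can be sketched in code. The helper names and the conversion of an annual target area into a number of ignition points are assumptions for illustration; only the 100-cell cap per prescribed fire and the two perpendicular one-cell-wide trail lines come from the text.

```python
import random

def random_ignitions(n, target_pct, fire_cells=100):
    """Random strategy: enough random ignition points that, if each fire
    reaches its 100-cell cap, the annual target area would be met."""
    n_fires = max(1, round(target_pct / 100 * n * n / fire_cells))
    return [(random.randrange(n), random.randrange(n)) for _ in range(n_fires)]

def trail_cells(n):
    """Two perpendicular one-cell-wide lines bisecting the landscape,
    representing access trails (ignition sources for NRA and NRB)."""
    mid = n // 2
    return {(mid, c) for c in range(n)} | {(r, mid) for r in range(n)}

def nra_ignitions(n, target_pct, fire_cells=100):
    """NRA strategy: ignition points drawn from the trail cells; fires
    then spread freely away from the lines."""
    n_fires = max(1, round(target_pct / 100 * n * n / fire_cells))
    return random.sample(sorted(trail_cells(n)), n_fires)
```

Under the NRB strategy, by contrast, burning would be confined to fixed segments of contiguous cells within `trail_cells`, creating low-fuel buffer strips rather than free-spreading fires.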

Modelling of fire spread

Fire propagation was modelled in a two-stage probabilistic manner by including a provision for setting the direction and shape of fires and for generation of spot fires, as dictated by the effects of wind. This enabled fire propagation to overcome barriers created by fuel discontinuity. Initially, the chance of a cell receiving fire from an ignited donor cell was estimated as a function of distance and direction of ignited donors. The status and condition of the cell (e.g. topographic position, time since fire) then determined its chance of ignition. The first step allowed the effect of wind and spotting on fire propagation to be varied according to weather conditions. The second step allowed the internal condition of the cell, in interaction with weather, to influence its flammability.

A template of potential donor cells that could transfer fire to a neighbouring target cell was defined (Fig. 2). Each donor cell within this template was assigned a probability of transmission according to its position in the neighbourhood and the weather conditions coincident with the fire (i.e. severe weather for unplanned fires, moderate weather for prescribed fires). Spot fires resulted from a template of cells wider than that of the eight contiguous neighbours of the recipient. Thus the template for unplanned fires was composed of a larger radius of potential donor cells (i.e. allowing potential spotting; Fig. 2a) than that for prescribed fires (Fig. 2b,c).
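The two-stage scheme described above can be sketched as follows. The template offsets and probabilities are placeholders, not the calibrated values shown in Fig. 2; only the structure (a westerly-biased donor template with a wider spotting ring for unplanned fires, followed by a cell-condition check) reflects the text.

```python
import random

# Illustrative donor template for an unplanned (severe-weather) fire:
# (drow, dcol) offsets of burning donors relative to the recipient cell,
# with a probability of transmission. Westerly donors are favoured (fires
# move west to east) and the offset at distance 2 represents spotting.
UNPLANNED_TEMPLATE = {
    (0, -1): 0.9,                  # donor due west of recipient
    (-1, -1): 0.6, (1, -1): 0.6,   # diagonal westerly donors
    (-1, 0): 0.3, (1, 0): 0.3,     # flanking donors
    (0, -2): 0.2,                  # spot fire from two cells upwind
}

def cell_catches_fire(offset, flammability, rng=random.random):
    """Two-stage spread: (1) does the burning donor at `offset` transfer
    fire to the recipient? (2) does the recipient then ignite, given its
    own condition (fuel age, topography) expressed as `flammability`?"""
    p_transfer = UNPLANNED_TEMPLATE.get(offset, 0.0)
    return rng() < p_transfer and rng() < flammability
```

A prescribed-fire template would restrict donors to the eight contiguous neighbours (no spotting ring).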

Figure 2.

Flammability templates used to simulate the influence of neighbourhood effects on intercellular propagation of fires. The probability (% chance) of propagation from burning donor cells (light shading) to a neighbouring recipient cell (black shading, centre cell in each diagram) is illustrated. Probabilities for donor cells not contiguous with the recipient cell represent potential fire propagation through spotting (a). Effects of different ignition sources/patterns and weather conditions: (a) unplanned (random), (b) prescribed (random), (c) prescribed (non-random, NRA).

The direction and shape of fires were determined by the values for probability of transmission assigned within each template according to the type of ignition. For unplanned fires in particular (Fig. 2a), the template produced initial fire spread patterns that were elliptical in shape, moving from left to right (west to east) within the landscape. Templates for prescribed fire produced fires more irregular in shape and direction. For the random strategy (Fig. 2b) the template was symmetrical, producing an even probability of fire spread in any direction. Irregular fire shapes arose solely through the influence of the condition of cells on flammability. For the NRA strategy the template was varied to allow fires to spread in a diagonal manner relative to the lines of cells representing trails and ignition points (Fig. 2c). Such burn patterns potentially created low-fuel strips that cut across the potential direction of spread of unplanned fires (see above). Prescribed fires lit under NRB strategies were confined to predetermined segments of cells. Again, in certain instances, some fires of this type did not burn all cells in the target segment because of flammability constraints.

In general, cell flammability was specified to increase with time since last fire towards an asymptote (Fig. 3), consistent with known patterns of litter accumulation (Bradstock 1990). Effects of topography on fire spread were achieved by specifying a separate flammability schedule for discrete topographic classes (flat, dune crest, dune slope and swale). Differences in flammability between topographic classes were selected to represent known differences in the density of eucalypts (the primary source of litter fuel) and the grass Triodia scariosa (Bradstock 1989; Bradstock & Gill 1993; Bradstock & Cohn 2002a). Dune slopes and flats were the most flammable elements because of relatively high densities of eucalypts and T. scariosa. The flammability of swales and dune crests was represented as relatively low because of a lower density of perennial grasses and eucalypts. A separate flammability schedule was prepared for swales to represent the effects of above-average rainfall and consequent herbage growth. This raised the overall level of flammability of swales closer to that of dune slopes until burning occurred (Fig. 3). Annual probability of occurrence of such conditions was fixed at 0·05 in accordance with the frequencies postulated by Noble & Vines (1993).

Figure 3.

Schedules of flammability, reflecting the influence of weather and fuel condition (time since last fire) on the probability of intercellular propagation of fires. Values (% probability) represent the chance of a cell igniting once a fire has been transferred from a neighbouring donor cell (see Fig. 2). Effects of topography and weather associated with alternative ignition sources (a) prescribed and (b) unplanned are indicated by differing symbols: flat and dune slope (before/after rain, squares); dune crest (before/after, triangles) and swale (before, triangles); swale (after, diamonds).

Flammability schedules were also varied to account for the effects of weather on fire spread, in interaction with fuel age and site effects (Fig. 3). Schedules for moderate (prescribed fire) and severe weather (unplanned fire) were specified with lower time since fire thresholds for fire spread in the latter case. Fire size limits also varied between ignition/weather types. Unplanned (severe weather) fires were unrestricted in size, with the potential to burn the whole landscape given the availability of fuel. In contrast, prescribed (moderate weather) fires were restricted to a maximum size of 100 cells per ignition (1% of the landscape), depending on fuel availability. Unplanned fires were ignited randomly in space.

Modelling of plant dynamics

Schedules of survival, maturation and fecundity used to model C. verrucosa populations are given in Table 2 (Bradstock & Cohn 2002b). While C. verrucosa does not possess any capacity for vegetative recovery following fire, bark thickness at the base of stems and the basal height of the foliage are sufficient to allow some older plants to survive particular fires (Bradstock & Cohn 2002b). Thus a multistage schedule of fire survival was specified. Seeds of Callitris species, while winged, are relatively heavy (Bowman & Harris 1995) and dispersal is known to be over a short range (< 50 m) in species such as Callitris glaucophylla (Lacey 1973). Dispersal in the model was therefore confined to a single cell radius, as was the case for other shrub species simulated in earlier applications of the model (Bradstock et al. 1996, 1998). A strong level of serotiny is exhibited in C. verrucosa and establishment appears to be confined to the immediate post-fire period (Bradstock & Cohn 2002b), in a similar manner to that found for cohabiting mallee eucalypts (Wellington & Noble 1985). Therefore, in the model, seeds only became available for germination in the year immediately following fire, provided any given cell was occupied by a mature adult or else seed dispersal had occurred from a neighbouring source.

Table 2.  Demographic characteristics of Callitris verrucosa used to model population responses to fire regimes
Initial population size        5000 cells occupied (50% of landscape)
Initial population age         5–15 years
Survivorship probability       0–200 years, 100%; > 200 years, 0%
Fecundity probability          0–12 years, 0%; > 12 years, 100%
Establishment probability      100%
Seed dispersal probability     20%
Fire mortality (prescribed)    0–49 years, 100%; 50–99 years, reduces by 1% each year; > 99 years, 50%
Fire mortality (unplanned)     0–49 years, 100%; 50–90 years, reduces by 20% each decade; > 90 years, 10%
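A direct reading of the Table 2 schedules can be sketched as follows. Where the table is ambiguous (which decade the boundary ages fall in for the unplanned-fire schedule), one interpretation is chosen and flagged; this is an illustration, not the model's code.

```python
def fire_mortality(age, fire_type):
    """Age-dependent probability that a C. verrucosa plant is killed by
    a fire, following Table 2."""
    if fire_type == "prescribed":
        if age < 50:
            return 1.0
        if age <= 99:
            return 1.0 - 0.01 * (age - 49)   # 0.99 at age 50 down to 0.50 at 99
        return 0.50
    # unplanned (severe weather); here the 'reduces by 20% each decade'
    # rule is interpreted as stepping down at ages 50, 60, 70 and 80.
    if age < 50:
        return 1.0
    if age <= 90:
        return max(0.2, 1.0 - 0.20 * ((age - 50) // 10 + 1))
    return 0.10

def can_recruit(parent_age, years_since_fire):
    """Seed germinates only in the year immediately after fire, and only
    from a mature (> 12 years) adult in the cell or a neighbouring cell."""
    return years_since_fire == 1 and parent_age > 12
```

Dispersal would then be applied over a one-cell radius, matching the short (< 50 m) dispersal range reported for Callitris.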


Combinations of unplanned ignition rate, prescribed fire level and pattern were simulated for both flat and dune field landscapes. Simulation duration was 1000 years, with initial occupancy (population size) of C. verrucosa set at 50% of the cells in the landscape. The initial population was evenly spread across a range of age classes and the initial time since fire distribution in the landscape was similarly configured. Ten replicates of all treatments were simulated.

While data on fire size (number of contiguous cells burnt) and population size (number of cells occupied) were compiled from the whole matrix, fire interval data (years elapsed between fires) were collected from a point source (an individual cell at or near the centre of the landscape but distant from cells subjected to non-random prescribed burning; Fig. 1). All these data were compiled for each topographic position (flat, swale, dune slope, dune crest).
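The two point-source and whole-landscape response variables can be expressed as simple reductions. These helper functions are illustrative (the names are not from the study):

```python
def fire_intervals(burn_years):
    """Years elapsed between successive fires at the monitored cell near
    the centre of the landscape grid."""
    return [b - a for a, b in zip(burn_years, burn_years[1:])]

def occupancy_pct(occupied):
    """Population size expressed as the percentage of cells occupied,
    matching the model's occupancy-based representation."""
    cells = [c for row in occupied for c in row]
    return 100.0 * sum(1 for c in cells if c) / len(cells)
```

For example, a cell that burned in simulation years 100, 113 and 140 contributes the intervals 13 and 27 years.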

Data analyses

The effects of different treatments (alternative combinations of unplanned ignition rate, level and spatial pattern of prescribed ignitions) on size of unplanned fires, fire interval and size of C. verrucosa populations (occupancy of cells in landscape) were examined (Table 3).

Table 3.  Summary of statistical analyses (see the Materials and methods). Means are arranged in ascending order and italics indicate no significant difference between the means. Factors include strategy (s: random, NRA), prescribed burning level (p: 0%, 1%, 5%, 10%, 20% per year), unplanned ignition rate (up: 0·05, 0·1, 0·2 per year), topography (t); flat landscape (f), dune landscape (d). Heteroscedastic data were square-root transformed† or assessed at P < 0·01‡. Probability levels: *P < 0·05, **P < 0·01, ***P < 0·001; NS, not significant
Response variable            Test    Factors           Significant results                            F-ratio (d.f.) or t  P
1 Unplanned fire size
  (a) Random and NRA         ANOVA   [s + p + up + t]  p: 20%, 10% < 5% < 1%, 0%                      54·7 (4,533)         ***
  (b) NRB                    ANOVA   [p + up + t]      –                                              –                    NS
  (c) NRB vs. random/NRA     t-test  p = 0%            –                                              0·25 (178)           NS
  (d) NRB vs. random/NRA     t-test  p = 1%            –                                              −0·90 (178)          NS
2 Fire interval
  (a) Random and NRA         ANOVA   [s + p + up + t]  s.p.up.t: see Fig. 5                           2·1 (24,3757)        **
  (b) NRB                    ANOVA   [p + up + t]      up: 0·2 < 0·1 < 0·05                           32·4 (2,321)         ***
3 Callitris population
  (a) Random and NRA         ANOVA   [s + p + up + t]  s.t.p: see Fig. 6                              62·5 (4,300)         ***
                                                       s.p.up: see Fig. 6                             2·5 (8,300)          **
                                                       t.p.up: see Fig. 6                             14·7 (8,300)         ***
  (b) NRB                    ANOVA   [p + up + t]      up.t: 0·05d, 0·05f < 0·1f, 0·2f, 0·1d < 0·2d   33·1 (2,72)          ***

Because the levels of prescribed ignitions were not orthogonal for all strategies (Table 1), separate analyses were undertaken (Table 3). Four-factor analyses of variance (ANOVA) examined the effect of unplanned ignition rate, topography, and level and spatial pattern of prescribed ignitions (random vs. NRA) on each dependent variable (fire size, fire interval and population size). Separate three-factor ANOVAs examined the effects of the first three factors on the same dependent variables under the NRB strategy. All factors were regarded as fixed, allowing testing of significance of all higher order interactions. Tukey tests were used for post-hoc comparisons. To eliminate heterogeneity of variance, as indicated by Cochran's test, data were either square-root transformed prior to analysis or results were assessed at a more conservative probability (P < 0·01; Underwood 1981). A two-sample t-test compared unplanned fire size of the pooled data for random and NRA strategies with the NRB strategy at 0% and 1% prescribed burning levels.
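As a minimal sketch of the transform and the pooled-variance t statistic (the multi-factor ANOVAs and Tukey tests are not reproduced here), under the assumption that the study used the standard pooled-variance form:

```python
from math import sqrt
from statistics import mean, variance

def sqrt_transform(data):
    """Square-root transform applied to heteroscedastic data before analysis."""
    return [sqrt(d) for d in data]

def two_sample_t(x, y):
    """Pooled-variance two-sample t statistic, of the kind used to compare
    unplanned fire sizes under NRB vs. pooled random/NRA strategies."""
    nx, ny = len(x), len(y)
    # Pooled sample variance (statistics.variance is the n-1 estimator).
    sp2 = ((nx - 1) * variance(x) + (ny - 1) * variance(y)) / (nx + ny - 2)
    return (mean(x) - mean(y)) / sqrt(sp2 * (1 / nx + 1 / ny))
```

The t statistic is zero for identical samples and grows in magnitude as group means separate relative to their pooled spread.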

The last 10 fire intervals and fire sizes recorded in a single simulation (1000 years), representative of each treatment combination, were used for data analysis. The final population sizes recorded for 10 simulations of each treatment combination were used for data analysis.


Effects of management strategies on fire size

The mean size of unplanned fires declined significantly as a function of increasing prescribed burning level (Table 3 and Fig. 4). Intermediate to high levels of prescribed fire (10–20% per year) reduced the mean size of unplanned fires to 20–30% of that achieved under low or nil levels of prescribed fire (Fig. 4). Other factors (topography, prescribed fire strategy and unplanned fire ignition rate) had no significant effect on mean size of unplanned fires (Table 3).

Figure 4.

Simulated mean size (± SEM) of unplanned fires (coincident with severe weather) as a function of level of prescribed fire. Results are for pooled random and non-random (NRA) prescribed fire strategies (diamonds) and for the non-random (NRB) strategy (squares).

Under the NRB (buffer) strategy, there were no significant effects of unplanned ignition rate, prescribed fire level or topography on mean unplanned fire size (Table 3 and Fig. 4). Unplanned fire size under the NRB strategy was not significantly different to that under either random or NRA strategies at equivalent prescribed burning levels (0–1%; Table 3).

Effects of management strategies on fire interval

The mean fire interval was significantly affected (Table 3 and Fig. 5) by the interaction of unplanned ignition rate, topography, prescribed fire level and pattern (random vs. NRA). In general, mean fire interval tended to be longer at zero or 1% per year prescribed fire than at higher levels. Mean fire intervals were generally shorter under the highest probability of unplanned ignition, particularly at low levels of prescribed ignition. Mean intervals tended to be shorter in the flat landscape and on dune slopes than on dune crests and swales.

Figure 5.

Simulated mean interval between fires (± SEM; n= 10, see text), as a function of level of prescribed fire, within a cell near the centre of the landscape grid for each topographic position. Results are for (a) random, (b) NRA and (c) non-random NRB prescribed fire strategies and differing annual probabilities of unplanned ignition: 0·05 (diamonds), 0·1 (squares) and 0·2 (triangles).

There were complex interactive effects of topography and prescribed burning pattern on mean fire interval. For example, in the flat landscape, mean fire interval was significantly longer at low levels of prescribed and unplanned ignition under random prescribed burning compared with the NRA strategy (Fig. 5). Mean intervals at low levels of prescribed ignition tended to be higher in the dune landscape, particularly at the lowest rate of unplanned fire, than in the flat landscape under random prescribed burning. In contrast, under the NRA prescribed burning strategy such effects were not evident.

Under the NRB strategy, probability of unplanned fire significantly affected the mean interval (Table 3 and Fig. 5). Mean interval decreased with increasing probability of unplanned fire.

Effect of management strategies on populations

The response of populations of C. verrucosa was significantly affected by interactions between rate of unplanned ignitions, topography, levels and spatial pattern of prescribed burning (i.e. random vs. NRA; Table 3 and Fig. 6). In general, final population size (percentage of cells occupied) followed a complex trend in relation to the level of prescribed burning. Lowest population sizes usually resulted from either high (20% per year) or zero prescribed burning. Highest population sizes occurred consistently at an intermediate level of prescribed ignition (5% per year). Over the simulation period of 1000 years, most combinations of unplanned ignition rate, level and pattern of prescribed fire and topography resulted in a net decline in populations. The exceptions occurred across a broad range of random prescribed burning levels (1%, 5% and 10% per year) in both topographic types.

Figure 6.

Simulated responses of populations of Callitris verrucosa to variations in prescribed fire patterns/levels, probabilities of unplanned ignitions and topography. Results are final population sizes (mean % cells occupied + SEM) after 1000 years with a starting occupancy of 50% of cells in the landscape grid (dotted line).

In general, final population sizes were significantly larger in dune vs. flat landscapes (Fig. 6a,b) and under random vs. NRA prescribed ignition patterns (Fig. 6c,d). The major exception to this trend was the random–flat combination, which resulted in the highest relative final population size at the lowest level of prescribed burning (Fig. 6e). While there were significant effects of rate of unplanned ignitions, final population size was less sensitive to this factor than topography and spatial pattern of prescribed fire. Effect of unplanned ignitions was greatest at low levels of prescribed fire.

Overall, there was a substantial decline in population size in all NRB treatment combinations (Fig. 6f). Analyses of final population sizes indicated significant effects of topography and unplanned ignition rate (Table 3). In particular, final population size was significantly lower at high rates of unplanned ignition in flat sites than in dune sites (Fig. 6f).


Fire regimes and population responses

Models of woody obligate seeder populations (Burgman & Lamont 1992; Bradstock et al. 1998; McCarthy, Possingham & Gill 2001; Groeneveld et al. 2002) indicate that a decadal-scale interval between fires is needed to minimize risk of extinction, with an optimum interval that is generally intermediate in length relative to the life span. Across landscapes such species may be tolerant of some variation around this optimum, given inherent abilities to disperse and recolonize patches rendered temporarily unfavourable by adverse fire intervals (Bradstock et al. 1998; McCarthy, Possingham & Gill 2001; Groeneveld et al. 2002). A similar conclusion is evident in this study, although C. verrucosa populations were likely to be highly sensitive to prescribed burning levels/patterns and topography (Fig. 6). The complex relationship between final population size and these factors, in turn, reflects the influence of the fire interval as measured at an indicative central point in the landscape (Fig. 5).

Extremes of mean fire interval were related to the decline of C. verrucosa populations, reflecting contrasting mechanisms: senescence and elimination of juveniles because of the occurrence of long and short between-fire intervals, respectively. Extremes in fire interval occurred under extremes of unplanned ignition rate/level of prescribed fire, with ignition pattern and topography acting essentially as a secondary influence on this basic trend (Fig. 5).

Long mean intervals (> 50 years) resulting from either nil or low (1% per year) prescribed burning and low rates of unplanned ignition (0·05 per year; Fig. 5) were influenced by the occurrence of individual intervals > 100 years (Fig. 7). In particular, intervals > 200 years were absent under high rates of unplanned ignition and high levels of prescribed burning. Such intervals exceeded the specified life span of C. verrucosa, resulting in elimination of plants from affected cells. The probability of long intervals of this kind at a central point in the landscape is indicative of the wider probability of such intervals throughout the landscape grid (R. A. Bradstock, M. Bedward & J. S. Cohn, unpublished results).

Figure 7.

Frequency distributions of categories of fire intervals [0–15, 16–50, 51–100, 101–200, > 200 years (yrs)] occurring within a cell near the centre of the landscape grid for random and non-random (NRA) prescribed fire strategies and differing annual probabilities of unplanned ignition. Data for landscapes with differing topographic classes have been pooled.

Relatively short mean intervals (< 40 years; Fig. 5) were recorded under high levels of prescribed burning and high rates of unplanned ignition (Fig. 7). Such mean intervals were strongly influenced by a substantial component of short intervals (≤ 15 years), as found at a prescribed ignition level of 10% or 20% per year. Such intervals rendered C. verrucosa prone to elimination given the specified age of maturation (13 years). Stable or increasing populations (i.e. final population size ≥ 5000 cells occupied) corresponded with mean fire intervals in the 20–40-year range under random prescribed burning in the 1–10% per year range (Fig. 5). Such general scenarios of mean interval were underpinned by a dominance of intervals in the 16–100-year range (Fig. 7).
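The two interval thresholds discussed above can be sketched as a simple cell-level persistence rule. This is an illustrative sketch only, not the study's code: the 13-year maturation age is taken from the text, while the exact life span used in the model is not stated in this section, so a placeholder value of 200 years is assumed here.

```python
# Illustrative cell-level rule for an obligate seeder such as C. verrucosa.
# MATURATION_AGE comes from the text; LIFE_SPAN is an assumed placeholder.
MATURATION_AGE = 13   # years before plants set seed (from the study)
LIFE_SPAN = 200       # assumed maximum stand age (placeholder value)

def cell_survives(fire_interval: int) -> bool:
    """Return True if a population in a cell persists through a
    between-fire interval of the given length (years)."""
    if fire_interval < MATURATION_AGE:
        # Fire returns before plants mature: no seed is set, juveniles killed.
        return False
    if fire_interval > LIFE_SPAN:
        # Stand senesces and dies before the next fire triggers recruitment.
        return False
    return True
```

Under this rule, only intervals between maturation and senescence (e.g. the 16–100-year intervals that dominated favourable scenarios in Fig. 7) allow a cell to remain occupied.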

Mean fire interval in the dune landscape tended to be longer than in the flat landscape for all levels of prescribed ignition (Fig. 5). This may have been because of the greater patchiness of fires as a function of lower flammability on dune crests and swales (Fig. 3), even though there was no significant effect of topography on overall mean size of unplanned fires (Table 3 and Fig. 4). The tendency for longer intervals in dune landscapes was particularly favourable to C. verrucosa at intermediate to high levels of prescribed burning (5–20% per year) as it reduced the chance of an interval shorter than the maturation period (R. A. Bradstock, M. Bedward & J. S. Cohn, unpublished data).

Relationships between ignition rates/levels, topography and mean intervals (Fig. 5) parallel those produced by fire simulation models of forest landscapes (Keane, Cary & Parsons 2003). For example, small inputs of prescribed burning initially produced a relatively large decline in mean fire interval (Fig. 5) but the mean interval tended to stabilize under further increases in prescribed burning. Similarly, an increase in unplanned ignition rates appreciably decreased the mean fire interval only under zero or low levels of prescribed fire. The non-linear relationship between fire interval and ignition rate exhibited in both forest and shrub–grass landscape simulation models reflects complex interactions of key processes such as weather and fuel in limiting fire spread and size. Differing spatial scales of limitation are likely in differing domains of these variables. In particular, the stasis in mean interval achieved under relatively high levels of burning is probably a result of high levels of discontinuity of fuel condition (beneath the threshold sufficient for fire propagation) at relatively fine scales.
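One way to see why extra prescribed burning yields diminishing returns on mean interval is a toy point-process approximation (this is not the spatial model, which includes fuel, weather and topographic feedbacks): if a point burns in any year with combined probability p, the expected fire interval is roughly 1/p, so equal increments in prescribed ignition shrink the interval by progressively less. The baseline probability used below is an arbitrary illustrative value.

```python
# Toy approximation: fire at a point as a Bernoulli process with annual
# burn probability p = p_unplanned + p_prescribed; mean interval ~ 1/p.
def mean_interval(p_unplanned: float, p_prescribed: float) -> float:
    return 1.0 / (p_unplanned + p_prescribed)

base = 0.01  # assumed annual burn probability from unplanned fire alone
intervals = [mean_interval(base, rx) for rx in (0.0, 0.01, 0.05, 0.10, 0.20)]
# The first increment of prescribed fire halves the mean interval;
# later, equal-sized increments change it far less (interval "stasis").
```

The successive drops in `intervals` shrink rapidly, mirroring the stabilization of mean interval at high burning levels noted above.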

The NRA prescribed fire strategy resulted in lower C. verrucosa population sizes compared with the random strategy (Fig. 6). The NRA strategy generally produced shorter mean fire intervals than the random strategy, in the dune landscape in particular. Intervals ≤ 15 years were more prevalent on dune slopes at most levels of prescribed burning under the NRA strategy compared with the random strategy (Fig. 7). An explanation for this may be that the NRA strategy concentrated prescribed fires towards the centre of the landscape, increasing the chance of short intervals on dune slopes in particular (i.e. the most flammable landscape component; Fig. 3).

The NRB strategy generally resulted in lower population sizes than the other strategies at approximately equivalent levels of prescribed fire (i.e. 1% per year; Fig. 6). An exception was the dune landscape at the highest rate of unplanned fire, when population size was similar to that achieved under other strategies at comparable rates of ignition. In general the NRB strategy resulted in longer mean intervals relative to random prescribed fire (Fig. 5), possibly because of the inherent clumping of prescribed fires in buffer strips in the centre of the landscape. Thus much of the landscape grid beyond the centre may have experienced less fire relative to the random treatment, resulting in local senescence and elimination of the species from these cells.

Consequences of alternative management strategies

The model indicates that use of prescribed fire to achieve management objectives concerning ‘wildfire control’ and conservation is likely to involve trade-offs strongly affected by landscape characteristics. Clearly, given the properties of landscape flammability/structure and plant dynamics represented in these simulations, use of prescribed fire is predicted to achieve both a diminution of wildfire size and maintenance of a C. verrucosa population. Neither objective may be met in the absence of prescribed fire, hence the trade-off required to achieve these objectives concerns the appropriate level and pattern of prescribed fire (strategy).

An intermediate level of prescribed fire (5% per year, random and NRA) roughly halved the predicted mean size of unplanned fires whilst maintaining a predicted fire regime that maximized C. verrucosa population size. High levels of prescribed fire (10–20% per year) further reduced the size of unplanned fires, but the resultant regimes were generally unfavourable for C. verrucosa (Fig. 6). Prescribed fire in a range (1–10% per year) therefore offers the most scope for an effective trade-off between objectives. In particular, prescribed burning in the lower part of this range (< 5% per year) is predicted to be required in flat landscapes and/or if a non-random pattern of burning is employed. Higher levels of prescribed burning (5–10% per year) may offer the best trade-off in dune landscapes, particularly using a random pattern of ignition (Fig. 6). Therefore a key outcome of this modelling study is the prediction that management approaches need to be tailored to suit the nature of specific landscapes. In particular, no single strategy of prescribed burning will suit all circumstances as differences in combinations of levels and patterns are predicted to have varying outcomes that in part are influenced by the nature of landscapes (e.g. topography).

We emphasize that the intent of this modelling exercise was exploratory. Currently, strategic use of prescribed fire to create buffer strips adjacent to roads and tracks in order to aid wildfire suppression is being employed in some mallee areas in south-eastern Australia (Bradstock & Cohn 2002a). Such an approach has been made possible by the relatively recent development of trail systems following unplanned fires in the last 30 years. The model predicts that such an approach, represented by the NRB strategy, may be suitable for dune landscapes given current rates of wildfire incidence (twice every 20 years) but may not be suitable in flat, sand-sheet landscapes. A cautious conclusion, suggested by the model, is that there is scope for management to expand the use of prescribed fire through use of differing strategies (e.g. random or NRA types) at levels suggested above.

A key constraint of the model is landscape size, relative to maximum fire size (i.e. simulated wildfires can potentially burn 100% of the landscape). Many landscapes containing relatively natural mallee vegetation are much larger than the notional area represented by the model (10^4–10^5 ha). Such landscapes are commonly composed of a fine-scale mix of differing vegetation associations and landforms reflecting variations in the depth of surface sand masses. Are model results likely to lead to adverse consequences if applied to management of real landscapes of this kind? These large mallee landscapes are unlikely to be burnt completely by one fire or two successive fires at short intervals. None the less, fires of about 10^5 ha can occur regularly in such landscapes (Bradstock & Cohn 2002a), so the potential exists for negative effects of short fire intervals on C. verrucosa populations to occur over extensive areas should large-scale fires overlap. Predictions from the model are probably conservative under this scenario because recolonization from sources beyond the affected area may be possible, although rates may be slow if the affected area is large.

Further constraints of the model that may require investigation are the consequences of edaphic variation for C. verrucosa demography and potential feedback effects of stand age/structure on flammability. Limited evidence suggests that plant survival may vary as a function of topographic position and that old stands (c. 80 years) may be less likely to burn than younger stands of C. verrucosa (Bradstock & Cohn 2002b). Incorporation of these effects into modelling may have additional consequences for fire regimes and plant dynamics because they are likely to change flammable connectivity in the landscape grid.

Ideally, management predictions based on modelling of this kind must be implemented with appropriate monitoring of both resultant fire regimes and plant population trends. This study highlighted the importance of the distribution of fire intervals as a determinant of population status. Such distributions may form the basis of an appropriate long-term monitoring programme that will not only serve to validate implied ecological responses but also provide information on fire regimes that can be used in ongoing decision making about fire.
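A monitoring programme of this kind could summarize an observed fire history at a point into the interval classes of Fig. 7. The sketch below is a minimal illustration: the bin edges come from the figure caption, while the fire-year record in the comment is hypothetical.

```python
from bisect import bisect_left

# Interval classes from Fig. 7: 0-15, 16-50, 51-100, 101-200, > 200 years.
EDGES = [15, 50, 100, 200]
LABELS = ["0-15", "16-50", "51-100", "101-200", ">200"]

def interval_distribution(fire_years):
    """Tally between-fire intervals at a monitoring point into the
    Fig. 7 interval classes. fire_years: years in which fire occurred."""
    years = sorted(fire_years)
    counts = {label: 0 for label in LABELS}
    for earlier, later in zip(years, years[1:]):
        interval = later - earlier
        counts[LABELS[bisect_left(EDGES, interval)]] += 1
    return counts

# Hypothetical fire record at one monitoring point:
# interval_distribution([1920, 1955, 1968, 2001])
```

Comparing such tallies against the interval distributions that sustained populations in the simulations (dominance of the 16–100-year classes) would provide the validation and decision-support role described above.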


We thank Jeremy Little for assistance with computing and analyses. Mark Tozer provided valuable comments on a draft of the paper.