Consequences of late-Holocene climate for northern tree abundances
Explanations for the increased presence of Betula and decreased Pinus biomass observed and simulated at Penningholmen during the Medieval Warm Period (MWP) remain somewhat uncertain. Briffa et al.’s (1992) Pinus tree-ring data indicate that this species should have increased its biomass during the MWP, yet the increase in Pinus pollen percentages at Penningholmen (Fig. 3) is very weak, if present at all. Low Pinus biomass during the MWP is also simulated by FORSKA2 (Fig. 3), perhaps because a simulated disturbance event coincides with a time of climate warming.
Not surprisingly, Betula is unaffected by falling temperatures during the coldest part of the LIA: its populations are both simulated and observed to remain high from ad 1650 through to ad 1800 (Fig. 3). The Scandinavian tree-line contains a mix of Picea, Pinus and Betula, with mountain birch being the northernmost woodland species. Although maximum cooling during the LIA may have caused northern tree-line retreat (Kullman & Engelmark 1990), Penningholmen lies approximately 300 m (vertical distance) from the Scandinavian forest-tundra ecotone, and Betula abundance was therefore unlikely to have been significantly altered.
Penningholmen, however, lies quite close to the north-western range limits of Picea abies and Pinus sylvestris. Simulated and observed declines in Pinus and Picea during the coldest period of the LIA (Fig. 3) may therefore be related to an eastward contraction of the Pinus-Picea species limit. Kullman (1987, 1988) reports a 30 m altitudinal recession of Pinus at its northerly limit in the central Scandes mountains over the period ad 1200 to ad 1700. Inadequate conifer reproduction caused by cold LIA summer temperatures, together with episodes of severe winter cold, was cited as a possible cause of the premature dieback of conifers at high elevations (Kullman 1987).
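The contrasting climate sensitivities of Betula and the conifers can be illustrated with the kind of parabolic growing-degree-day (GDD) growth scalar used in many forest gap models of the FORSKA family. The sketch below is purely illustrative: the GDD limits are invented for the example and are not the parameter values used in the FORSKA2 runs reported here.

```python
# Minimal sketch of a gap-model temperature response, showing how a climate
# cooling can depress conifer growth while leaving a cold-tolerant species
# such as Betula comparatively unaffected. The parabolic GDD scalar is a
# generic gap-model device; all limits below are hypothetical.

def gdd_response(gdd, gdd_min, gdd_max):
    """Parabolic 0-1 growth scalar: zero at the species' GDD limits,
    maximal midway between them."""
    if gdd <= gdd_min or gdd >= gdd_max:
        return 0.0
    return 4.0 * (gdd - gdd_min) * (gdd_max - gdd) / (gdd_max - gdd_min) ** 2

# Hypothetical species limits (degree-days above 5 degrees C):
# Betula tolerates colder sites; Pinus needs a longer growing season.
betula = (150.0, 2000.0)
pinus = (450.0, 2200.0)

for label, gdd in [("pre-LIA", 900.0), ("LIA cooling", 650.0)]:
    b = gdd_response(gdd, *betula)
    p = gdd_response(gdd, *pinus)
    print(f"{label}: GDD={gdd:.0f}  Betula scalar={b:.2f}  Pinus scalar={p:.2f}")
```

With these illustrative limits, the cooling step cuts the Pinus growth scalar roughly in half while reducing the Betula scalar only modestly, consistent with the simulated persistence of Betula alongside declining conifers during the coldest part of the LIA.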
The unusual dieback of Betula observed during the first half of the LIA was postulated by Bradshaw & Zackrisson (1990) to be related to insect herbivory, because peak values of chitinous fragments in the sediments coincide with temporary collapses in Betula populations. FORSKA2, however, also simulates a dieback (Fig. 3), suggesting that the effects of LIA climate on northern woodland succession (particularly on the competitive interactions between Pinus and Betula) may have contributed to Betula’s decline.
The striking similarity between the successional trends simulated by FORSKA2 and those observed in the Penningholmen pollen record (Fig. 3) leads to the conclusion that forest succession models used in combination with pollen records can be extremely useful for separating the confounding influences of anthropogenic and non-anthropogenic (i.e. climate and disturbance) factors on vegetation dynamics, particularly when their effects are in the same direction (i.e. both increasing or both decreasing tree abundances).
Decline of Tilia and rise of Fagus in the south
Much debate focuses on the origins of beech forests in northern Europe, primarily owing to this species’ current dominance of forest stands across large areas, but also because beech forests are valued historical landscapes in many countries (Rackham 1980, 1997; Brunet 1995; Kuster 1997; Peters 1997). A widely held opinion in north-west Europe is that beech forests are ‘natural’ components of the landscape, a notion reinforced by papers citing historical documents that describe beech forests 300–400 years ago (Brunet 1995; Björkman 1997).
On the continental scale, Fagus sylvatica is arguably in equilibrium with climate (Huntley et al. 1989), and is thus an expected component of modern southern Scandinavian forests. Palaeoecological investigations of sediments in small forest hollows in southern Sweden (Björkman & Bradshaw 1996; Björse & Bradshaw 1998), however, indicate that from a palaeoecological perspective (i.e. over time-scales of millennia), Fagus should not be a dominant of contemporary nemoral forest. Furthermore, small forest hollows in present-day Fagus stands show signs of anthropogenic interference prior to the rise of Fagus dominance (Hannon et al. 2000). The controversy therefore concerns not so much whether Fagus should be present at the continental scale, but whether its overwhelming abundance at the stand level in southern Sweden and Denmark has a non-anthropogenic origin (Chambers 1993; Kuster 1997; Bradshaw & Holmqvist 1999). If modern Fagus stands are not in equilibrium with climate, then monospecific Fagus woodlands across northern Europe are a distinctly anthropogenic feature.
Our simulations do not support the hypothesis that historical changes in climate were favourable for stand-scale dominance of beech; they indicate instead that, following cooling during the LIA, Tilia populations should have rapidly re-established their dominance in nemoral woodlands (Fig. 4). Fagus and Tilia are physiologically similar in the sense that they are resource competitors in these woodlands (Peters 1997). Fagus, however, is more cold-sensitive, whereas the much greater sprouting potential of Tilia allows it to reproduce vegetatively and thus persist on sites despite a colder climate (Prentice & Helmisaari 1991). Examples of Tilia individuals persisting for up to several centuries in areas with an unfavourable climate have been noted for northern Europe (Pigott & Huntley 1981).
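The demographic contrast between seed-dependent Fagus and stump-sprouting Tilia can be sketched with a toy cohort model. All rates below are invented for the illustration and bear no relation to FORSKA2’s actual formulation: during the “cold” years seed recruitment fails for both species, but the resprouting species replaces most of its dying stems vegetatively.

```python
# Toy illustration (not FORSKA2) of why vegetative sprouting buffers a tree
# population against a prolonged cold interval. All parameter values are
# hypothetical and chosen only to make the mechanism visible.

def simulate(n0, mortality, recruit_warm, resprout_frac, cold_years, total_years):
    """Annual stem count: constant mortality, seed recruitment suppressed
    during the cold interval, a fraction of deaths replaced by resprouts."""
    n = float(n0)
    history = []
    for year in range(total_years):
        cold = year < cold_years
        deaths = n * mortality
        recruits = 0.0 if cold else recruit_warm   # seedbed recruitment
        resprouts = deaths * resprout_frac          # stems replaced vegetatively
        n = n - deaths + resprouts + recruits
        history.append(n)
    return history

# Hypothetical behaviour: Tilia resprouts readily, Fagus barely at all.
tilia = simulate(100, 0.05, 5.0, 0.9, cold_years=150, total_years=200)
fagus = simulate(100, 0.05, 5.0, 0.1, cold_years=150, total_years=200)

print(f"after 150 cold years: Tilia {tilia[149]:.0f} stems, Fagus {fagus[149]:.0f} stems")
```

Under these assumptions the resprouting population declines only slowly through the cold interval and can recover once recruitment resumes, whereas the seed-dependent population collapses, which is the qualitative behaviour attributed to Tilia and Fagus in the text.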
Seed production in Fagus, however, is strongly affected by climate (Peters 1997), with cooling-induced decreases in Fagus abundance being well represented in this and other forest modelling experiments (Campbell & McAndrews 1993). Simulations of hardwood forest succession in southern Ontario (Canada) by Campbell & McAndrews (1993), for example, indicate a decline in Fagus grandifolia beginning at ad 1400, with replacement first by Quercus and then by Pinus. This feature of the pollen record was originally attributed to native forest clearing but is now linked to LIA climate.
On the basis of our simulations, Fagus should not dominate modern-day nemoral woodlands, whereas Tilia should be present in significantly higher quantities than observed (Fig. 4). The weak presence of Tilia has been a long-standing characteristic of forests in north-west Europe (Iversen 1958; Turner 1962; Berglund 1969), and the FORSKA2 simulations support the theory that factors other than LIA cooling must be responsible for its near disappearance in southern Scandinavia.
Anthropogenic factors such as large-scale forest clearance for agriculture, the introduction of domesticated grazing animals, and selective cutting may all have affected Tilia. Although selective cutting seems unlikely in light of the species’ historical importance for fodder (Behre 1988), its use for timber, furniture construction and fibre production may have favoured its selective removal. Large-scale forest clearance for agriculture would likely have enhanced Tilia’s establishment potential because of its ability to proliferate by stump sprouting (Aaby 1983). Herbivore–plant interactions may also have played a significant role in shaping the Scandinavian landscape during the late-Holocene (Bradshaw & Mitchell 1999). Field studies show that Tilia is preferentially selected by deer, elk and domesticated livestock, while Fagus is least preferred, prompting Nilsson (1997) to conclude that grazing animals (both native and domesticated) have played a more important role than climate in altering Swedish boreo-nemoral forest succession in the late-Holocene. If most herbivore pressure in the past was exerted by domesticated rather than native herbivores, then Tilia’s decline may have been a consequence of changing human land-use patterns and/or increasing human populations. The FORSKA2 simulations (Fig. 4), however, indicate that Tilia was already starting to decline as a result of LIA cooling when human disturbance began to intensify; it is therefore possible that LIA cooling weakened existing Tilia populations, making them more susceptible to anthropogenic pressures.