Periodic growth depressions caused by LBM outbreaks were not found in the TRW and MXD chronologies from the Tatra Mountains. These findings were consistent for the past three centuries and in general agreement with previous reports by Baltensweiler et al. (1977) and Büntgen et al. (2007). The absence of cyclic larch defoliation episodes most probably results from the restriction of larch to small and isolated stands along the Carpathian arc that do not provide adequate host material for the development of outbreaks. Given the absence of outbreaks in the Carpathians, further results and discussion are herein restricted to the Alpine arc only.
Detected LBM outbreaks were classified into six intensity levels, ranging from a minimum of one to a maximum of six positive detection methods. The sum of the positive detections (i–vi), averaged over all sites and the entire Alpine area, provides evidence for outbreaks in 295 yr of the interval ad 1700–2000 (Fig. 4). Only 6 yr were found without any defoliation signal, independent of the method (i–vi) applied. Together, the six approaches revealed outbreak years during which 1.7–98.0% of the available sites were affected. By contrast, outbreaks of the highest intensity level, based exclusively on method (vi), were found in 110 yr, during which 1.5–24.0% of sites were affected (Fig. 4). There is an inherent trade-off between detection that is too sensitive (false positives) and detection that is not sensitive enough (false negatives); the number of outbreaks detected decreased systematically as the number of positive detection methods required as a threshold increased (Figs 4, 5g). Lower outbreak intensity thresholds introduce background noise, whereas higher intensity thresholds restrict outbreak identification to more distinct peak events. [See also Swetnam et al. (1985) and Swetnam & Lynch (1993) for methodological implications of different thresholds used for the exposure of insect-induced disturbance signals in tree-ring chronologies.] The highest outbreak intensity (all six methods positive), obtained from > 10% of affected series per site, was limited to only 11 yr (1726, 1753–54, 1794, 1801, 1815, 1848, 1857, 1946, 1953 and 1964). Using a threshold of only one of six positive detections, 15 events were detected in which > 80% of the available sites were affected (1706, 1711, 1718, 1723, 1726, 1759, 1794, 1801, 1811, 1821, 1834, 1846, 1859, 1881 and 1983).
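The intensity classification described above (counting, for each site and year, how many of the six detection methods flag an outbreak, then expressing the share of sites affected per year at a given threshold) can be sketched as follows; this is a minimal Python illustration using hypothetical data, not the original analysis code:

```python
import numpy as np

def outbreak_intensity(detections):
    """Classify outbreak intensity per site and year by counting how many
    of the six detection methods (i-vi) flag an outbreak.

    detections: boolean array of shape (n_years, n_sites, 6).
    Returns an integer array of shape (n_years, n_sites) with values
    from 0 (no signal) to 6 (all methods positive).
    """
    return detections.sum(axis=2)

def percent_sites_affected(intensity, threshold):
    """Percentage of sites per year whose intensity meets the threshold."""
    affected = intensity >= threshold
    return 100.0 * affected.mean(axis=1)

# Hypothetical toy data: 5 years, 4 sites, 6 methods.
rng = np.random.default_rng(0)
det = rng.random((5, 4, 6)) > 0.5
inten = outbreak_intensity(det)
pct_any = percent_sites_affected(inten, 1)  # at least one method positive
pct_max = percent_sites_affected(inten, 6)  # all six methods positive
```

By construction, the percentage of affected sites can only shrink as the threshold rises, which mirrors the systematic decrease in detected outbreaks reported above.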
Figure 4. Summary of the six (i–vi) detection methods (DM) applied. Black lines indicate the cumulative percentage of the detected outbreaks per year. The sum of all six methods (i–vi) shows outbreak evidence ranging from 1 to 98% in 295 yr, whereas 0% outbreak evidence is found in 6 yr, and the maximum of 98% outbreak evidence is reported for ad 1881. Maximum outbreak evidence, as defined by method (vi), reaches 24% in ad 1794, whereas 0% of evidence for this highest outbreak class (vi) is found for 191 yr between 1700 and 2000.
Figure 5. Time-series of reconstructed larch budmoth (LBM) outbreaks (the cumulative percentage of the detected outbreaks per year following six intensity levels) over the 1700–2000 period and split into five geographical sub-regions: (a–e) south, west, central, east > 1500 and < 1500 m asl. (f) Site replication per sub-region and (g) outbreak patterns averaged over the entire Alpine arc. The six different colours refer to the six different outbreak intensities (ranging from low to high) that are based on the six detection methods (i–vi), as detailed in Fig. 4.
The spatial characteristics of outbreaks can be inferred from patterns of outbreaks at sites grouped into five geographical sub-regions (Fig. 5a–e). The coincidence of several spikes in the time-series is indicative of spatial synchrony, and the last well-synchronized population peak occurred between 1981 and 1983, exactly one century after the most severe Alpine outbreak in 1881. This last Alpine-wide outbreak event is even more distinct when averaging evidence from the five geographical sub-regions (Fig. 5g). Other well-synchronized 20th century outbreaks occurred in c. 1936, 1945, 1954, 1963, 1972 and 1981, with generally more asynchronous dynamics before and after this period. Supporting evidence for a decrease in population oscillation amplitude in the early 20th century has previously been presented across the Alps (Baltensweiler, 1993a; Baltensweiler & Rubli, 1999). The timing of local LBM outbreaks, quantified on the basis of larval counts and tree discoloration, generally matches peak populations estimated using tree-ring chronologies from the same locations (Rolland et al., 2001; Nola et al., 2006; Esper et al., 2007). The seven most severe 20th century outbreak episodes identified here in the Southern and Western Alps are supported by Baltensweiler & Rubli (1999): 1908–09, 1935–37, 1945, 1953–54, 1962–63, 1971–72 and 1980–81. A detailed view of the three most prominent LBM outbreak cycles that spread across the Alpine arc, i.e. during the early 1960s, 1970s and 1980s, revealed temporal coherency between independent lines of evidence from tree ring-based outbreak reconstructions, survey-based larvae counts and observed forest discoloration (Fig. 6). It should be noted that such comprehensive information only exists for the mid-20th century. In addition, less significant local defoliation events during the early and mid-1990s are reported from all analyses, except Esper et al. (2007), who reconstructed outbreaks in a particular sub-alpine valley, albeit including data from different sites.
Figure 6. Comparison between three cycles of reconstructed larch budmoth (LBM) outbreaks (this study), counted larvae population density and discoloured forest area (both from Baltensweiler & Rubli, 1999). Data are averaged over the entire Alpine arc and shown over their common period 1960–90.
Wavelet analysis of the reconstructed (1700–2000) outbreak time-series, ranging from the site level to the grand Alpine mean, provided insight into the persistence of defoliation cycles and their expression in space. Figure 7 illustrates the power spectra obtained for the highest intensity outbreaks (110 events during which all outbreak detection methods were positive) and for low- to high-intensity outbreaks (295 events during which at least one of six methods was positive) averaged over the Alpine network. Evidence for significant power at a period of ∼8 yr was greatest when using the highest outbreak level and diminished after the inclusion of lower outbreak intensities. Examination of the temporal variability in periodicity indicated a robust 8-yr period from ∼1740 to 1820, at ∼1850 and again from ∼1930 to 1980 (Fig. 7a). The spectrum based on the lower intensity threshold was indicative of less distinct periodicity, with a significant shift of the global wavelet power towards a 32-yr period (Fig. 7b).
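As a simplified illustration of such periodicity detection (a plain Fourier periodogram rather than the time-resolved Morlet wavelet transform used in the study), the dominant period of an outbreak proxy can be estimated as follows; the synthetic series below is purely illustrative:

```python
import numpy as np

def periodogram_peak_period(series, dt=1.0):
    """Return the period (in years) carrying the highest spectral power.

    A plain FFT periodogram: a simplified stand-in for wavelet analysis,
    which additionally localizes spectral power in time.
    """
    x = np.asarray(series, dtype=float)
    x = x - x.mean()                          # remove the mean (zero-frequency bin)
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(x.size, d=dt)
    k = np.argmax(power[1:]) + 1              # skip the residual zero-frequency bin
    return 1.0 / freqs[k]

# Synthetic outbreak proxy with an 8-yr cycle plus noise (illustrative only).
years = np.arange(1700, 2001)
rng = np.random.default_rng(1)
proxy = np.sin(2 * np.pi * years / 8.0) + 0.3 * rng.standard_normal(years.size)
peak = periodogram_peak_period(proxy)  # expected to lie near the 8-yr cycle
```

Unlike this global estimate, the wavelet spectra in Fig. 7 also show when the ∼8-yr power is present, which is what reveals the ∼1740–1820 and ∼1930–1980 episodes.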
Figure 7. Wavelet power spectra of Alpine-wide reconstructed outbreak time-series based on: (a) 110 events during which detection method (vi) (all routines were positive) indicated an outbreak for 1.5–24.0% of affected site chronologies, and (b) 295 events during which the sum of the six detection methods indicated an outbreak for 1.7–98.0% of affected site chronologies. Contour levels are chosen so that 75, 50, 25 and 5% of the wavelet power is above each level, respectively. Black contour is the 10% significance level, using a white-noise background spectrum. Right side shows the corresponding global wavelet power spectra (black line). Broken lines denote significance, assuming the same significance level and background spectrum as indicated above.
Wavelets using site-level LBM outbreak reconstructions are not shown herein; however, they supported the ‘altitude’ hypothesis, which postulates that the most severe LBM epidemics are concentrated at elevations of ∼1800 m asl (Weber, 1997). Site-specific spectra revealed a distinct periodicity at 8–9 yr for sites at elevations between 1750 and 1900 m asl (see Fig. 2 and Table S1 for the affected chronologies). Nevertheless, complex landscape geometries are likely to obscure both the direction and speed of travelling waves (Bjørnstad et al., 2002; Johnson et al., 2004, 2006), and local weather conditions may shift outbreak foci to lower or higher elevations and modulate populations at different slope exposures (Baltensweiler et al., 2008). The assumption that optimal areas, characterized by more frequent and intense outbreaks, would shift to higher elevations in a warming world becomes particularly critical in the long term, because the upper limit of most larch forests in the European Alps lies between 1900 and 2100 m asl. Defoliation within lower elevation suboptimal zones may result from immigration from higher elevations (Baltensweiler & Rubli, 1999).
Some caution should be exercised in interpreting these findings, however, as they may be shaped by an altitudinal or spatial bias from uneven site replication throughout the five geographical sub-regions, ranging from four sites in the Southern Alps to 23 sites in the Central Alps (see Fig. 5f for details on sample size). The few high-intensity outbreaks reconstructed from the low-elevation cluster (Fig. 5a) simply result from the fact that only four of the 15 sites could realistically be compared with climate data. The remaining low-elevation chronologies that did not correlate at r ≥ 0.25 with temperature (see Fig. 3 for details) were consequently unable to yield reconstructed outbreaks of higher intensity (see the Materials and Methods section above). Methodological caution is further advised because the running minima approach may misinterpret any negative growth extreme as a defoliation event. In this regard, it should be noted that a combination of the six outbreak detection methods (i–vi) most probably represents the best tool to distinguish annual growth depressions caused by LBM from those caused by other disturbance factors, such as climate anomalies or more local disturbances caused by logging, rock fall and lightning, which are quite common at higher elevations.
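The caveat about the running minima approach can be made concrete with a short sketch. This naive Python version assumes a simple centred-window definition (a year is flagged when it is the minimum of its window); as written, it flags any pronounced negative extreme regardless of cause, which is precisely why combining it with the host/nonhost methods is necessary:

```python
import numpy as np

def running_minima_flags(chronology, window=9):
    """Flag years whose index value is the minimum of a centred window.

    Note: this flags ANY pronounced negative growth extreme, whether
    caused by defoliation or, e.g., by a cold summer.
    """
    x = np.asarray(chronology, dtype=float)
    half = window // 2
    flags = np.zeros(x.size, dtype=bool)
    for t in range(x.size):
        lo, hi = max(0, t - half), min(x.size, t + half + 1)
        flags[t] = x[t] == x[lo:hi].min()   # local minimum of the window
    return flags
```

Applied to a chronology with two sharp growth depressions, the function flags both, with no way to tell an insect-induced depression from a climatic one on its own.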
Reconstructed defoliation was not perfectly synchronized across the entire Alpine arc, which indicates a lack of significant climate forcing on the detection method(s) applied. That is, cyclic outbreak episodes obtained via negative growth anomalies do not match any particular temperature regime that occurred across the Alps. There is often a danger of circular reasoning in tree ring-based reconstructions of insect defoliation (Kress et al., 2009), as has been detailed for the western spruce budworm in the USA (Ryerson et al., 2003). For example, negative growth anomalies may result from below-average summer temperatures, but be spuriously interpreted as population outbreaks. Overlapping temperature depressions and insect defoliation are thus difficult to distinguish, although the nonhost comparison applied here probably reduced this risk. Increases in growing season temperatures, which are known to stimulate TRW and MXD at the upper timberline, would yield positive growth anomalies (Büntgen et al., 2005, 2006b; Frank & Esper, 2005a,b), subsequently compensating for defoliation-induced growth interruptions (Baltensweiler et al., 2008). Climate-driven cycles in population dynamics are unlikely, given that regular ∼8–10-yr oscillations have not been reported from climatic observations. Various methodological tests on the detection of outbreak-induced growth depressions in tree-ring data have been conducted in the northern USA using spruce budworm (Swetnam et al., 1985; Morin et al., 1993; Swetnam & Lynch, 1993; Ryerson et al., 2003) and pandora moth (Speer et al., 2001) host data. All of these studies stressed the importance of clearly separating climatic and ecological information (Arabas et al., 2008).
Moreover, a tendency for increasing error back in time might derive from: (1) a general reduction in sample size at the site level; (2) an overall decline in the number of site chronologies at the network level; (3) a less homogeneous spatial site distribution across the network; and (4) a less reliable temperature record during the 17th and 18th centuries (Frank et al., 2007a), particularly before c. 1760 when documentary evidence dominated (Casty et al., 2005).
The assessment of seasonally resolved Alpine temperature variability identified the last two decades as a period during which temperatures in all seasons were well above the 20th century mean. This warming parallels the disappearance of LBM outbreaks in a 1200-yr cyclic reconstruction from the Swiss Alps (Esper et al., 2007), and a severe dampening of synchronized Alpine-wide defoliation (Baltensweiler, 1993a). At the same time, fading population cycles in voles and grouse have been associated with global warming (Ims et al., 2008). The recognition of dynamics drifting in and out of cyclicity could, to some extent, be driven by data and methodology, and thus be prematurely linked to climatic warming (Parmesan, 2007). By contrast, the widespread and simultaneous absence of high-amplitude cyclic population densities cannot be related to local and methodological factors only. Interestingly, uncertainty in our understanding of the potential oscillation dampening since about the 1980s derives from a period of reduced data availability (Fig. 2b) (see Frank et al., 2007a for a discussion). Evidence for the weakening of the LBM oscillations derives from exhaustive survey data extending into the late 20th century (Baltensweiler & Rubli, 1999), and from a spatially constrained study in the Swiss sub-alpine Lötschental (Esper et al., 2007).
Both TRW and MXD contain valuable information for defoliation reconstruction; TRW carries a higher degree of biological memory, whereas MXD is characterized by less autocorrelation and recovers faster from any disturbance (Esper et al., 2007; Frank et al., 2007a). The mean lag-1 autocorrelation of the Alpine TRW site chronologies utilized herein is 0.53, with generally higher values deriving from higher elevation sites and overall lower values reported for MXD (see Table S1 for site-specific information). Carry-over effects, which mainly bias earlywood cell formation (Fritts, 1976), can result in a slight delay of detected extremes (Fig. 6). This is related to integrative effects of previous-year climatic and ecological conditions on TRW formation, which can be further obscured by longer term gains (losses) in activating resources from root and needle growth following favourable (severe) conditions (Frank et al., 2007a). Only the joint use of both TRW and MXD measurements allows defoliation-induced vs climate-induced persistence in cell development and enlargement to be distinguished (Esper et al., 2007).
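The lag-1 autocorrelation quoted above is straightforward to compute. The sketch below (Python, using a synthetic AR(1) series whose persistence parameter is set to the reported TRW mean of 0.53 purely for illustration) shows the calculation:

```python
import numpy as np

def lag1_autocorrelation(series):
    """Lag-1 autocorrelation of a chronology: Pearson r between the
    series and itself shifted by one year."""
    x = np.asarray(series, dtype=float)
    return np.corrcoef(x[:-1], x[1:])[0, 1]

# Illustrative AR(1) series with persistence comparable to the mean TRW
# value reported above (0.53); an MXD-like series would use a lower phi.
rng = np.random.default_rng(2)
phi = 0.53
x = np.zeros(500)
for t in range(1, x.size):
    x[t] = phi * x[t - 1] + rng.standard_normal()
r1 = lag1_autocorrelation(x)  # sample estimate, expected near 0.53
```

Higher persistence means a defoliation-induced depression bleeds into subsequent rings, which is why MXD, with its lower autocorrelation, recovers faster from disturbance.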
Additional insights into abiotic forcing may also originate from stable δ13C and δ18O isotope ratios, which have recently been reported to be less prone to insect defoliation (Kress et al., 2009). Negligible effects of LBM outbreaks on stable isotopic ratios indicate that defoliation is independent of the changes in leaf physiology known to modulate carbon and oxygen values (Leavitt & Long, 1988). Rather than reflecting changes in stomatal conductance and photosynthetic capacity, the δ18O composition of cellulose may be shifted towards the isotopic signature of source water, as little leaf water enrichment occurs during defoliation. The synchronous behaviour of both carbon and oxygen isotopes during LBM defoliation events partly supports the hypothesis that isotope composition is climate controlled (Kress et al., 2009). During outbreak years, tree-ring cellulose is formed either before the devastating feeding occurs or shortly thereafter, when host trees re-foliate roughly 1 month after defoliation (Baltensweiler et al., 2008). The timing of late summer re-foliation, which could theoretically account for a positive relationship between summer warmth and LBM outbreaks, has been assessed recently (Kress et al., 2009). Such a relationship would also reflect the previously postulated effect of cold season temperatures on LBM diapause and, subsequently, population growth (Baltensweiler, 1993b). The development of first-stage larvae in spring is limited by the energy provided at the time of oviposition in the previous summer. Optimal conditions for first instars are long and cold winters with more than 120 d below 2°C (Baltensweiler, 1993b). If summer temperatures are high, development from egg to moth is completed sooner, leading to an earlier diapause with fewer frost days, resulting in elevated egg mortality as well as increased larval mortality when larvae hatch before host needle flushing (Baltensweiler et al., 1977).
In addition, larch foliage quality for the various larval stages has been implicated as playing a central role in population oscillations (Baltensweiler, 1993b). As a result of the yearly re-growth of needles, LBM larvae depend on needle maturation and have to cope with large changes in food quality (e.g. raw fibre and protein content). Seasonal synchronization of larval development with needle maturation is a requirement for population growth and outbreak development (Asshoff & Hattenschwiler, 2006). Even a slight temporal offset between larval and foliar development may have serious consequences for LBM population growth. Warmer summer temperatures may also affect the relative timing of LBM and its parasitoids, with implicit effects on population dynamics.
Several different mechanisms have been suggested to explain LBM oscillations, such as behavioural changes in population quality (Baltensweiler, 1993a), insect–disease interactions (Anderson & May, 1980), induced host defences (Fischlin, 1982) and host–parasitoid dependences (Turchin et al., 2003). As discussed above, there are many possible mechanisms by which climate may alter LBM dynamics, but considerable uncertainty remains with regard to how populations are modulated by climate change (Esper et al., 2007). We note parenthetically that no indication of thermal effects on outbreak intensity and periodicity was evident for the 1940s, a decade during which spring and summer temperatures were comparable with those of the late 20th century (Auer et al., 2007). By contrast, early 20th century winter temperatures were much cooler than during recent times, even when considering some degree of bias inherent to early instrumental station measurements (Frank et al., 2007a). Unprecedentedly high temperatures across all seasons are a unique feature of the past two decades only.
We provide 301 maps that describe annual patterns of Alpine-wide LBM outbreak dispersal (Fig. S1a–i, see Supporting Information). A total of 29 site chronologies are available in 1700, and replication ranges from 18 sites in 2000 to 67 sites between 1911 and 1915. The number of LBM outbreaks per year ranges from 0% (in 6 yr) to 98% (in ad 1881). Figure 8 summarizes the spatial and temporal aspects of the last Alpine-wide outbreak, which occurred in the early 1980s. While four maps illustrate the distribution and intensity of the defoliated sites per year, the chronological evolution from 1981 to 1984 is further highlighted by the simple sum, or cumulative percentage, of the positive cases derived from the six detection methods (i–vi). Considering the full 1700–2000 period of evidence, distinct periodicity is reconstructed for the mid-18th century, the transition from the 18th to the 19th century and again for the mid-20th century. Less intense and/or spatially less synchronized outbreaks appeared at the transition from the 19th to the 20th century. Moving correlation analysis computed between the 70 reconstructed LBM outbreak time-series at the site level provides some indication of temporal changes in outbreak synchrony across the Alpine arc over the 1700–2000 period (Fig. 9a). The strongest agreement is found during the early 18th century, ∼1800, ∼1840 and during the mid-20th century. The lowest correlations are obtained in the early 20th century. Similar correlation analyses using the five reconstructed LBM outbreak time-series that represent the geographical sub-regions (Fig. 9b), or using the six records that reflect the different outbreak intensity levels [following detection methods (i–vi)] (Fig. 9c), show similar changes in synchrony over time. Caution, however, is advised for the start and end periods of the calculation, where site replication is low.
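A moving correlation of this kind (30-yr windows lagged by 15 yr) can be sketched as follows; this minimal Python version averages all pairwise Pearson correlations per window and assumes no record is constant within a window (which would leave the correlation undefined):

```python
import numpy as np

def moving_mean_correlation(series_matrix, window=30, step=15):
    """Mean pairwise Pearson correlation between records in moving windows.

    series_matrix: array of shape (n_years, n_records).
    Returns (window start indices, mean inter-record correlation per window).
    """
    n_years, n_rec = series_matrix.shape
    starts, means = [], []
    for s in range(0, n_years - window + 1, step):
        block = series_matrix[s:s + window]
        r = np.corrcoef(block, rowvar=False)  # n_rec x n_rec correlation matrix
        iu = np.triu_indices(n_rec, k=1)      # each record pair counted once
        starts.append(s)
        means.append(r[iu].mean())
    return np.array(starts), np.array(means)
```

High window means indicate Alpine-wide synchrony; low means indicate the more heterogeneous dynamics reported for the early 20th century.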
Moreover, the spatial data coverage of our network merely reflects sampling activities of the various data contributors rather than an even geographical and altitudinal distribution. Site location was weighted towards the western Swiss Alps, two centres in the Italian Alps and one lower elevation cluster in the Eastern Austrian Alps. To obtain a more uniform distribution of samples, more data would be needed from the south-west French and Italian Alps, along the entire northern pre-Alps, some parts of the Central Alps and from higher elevations in Austria and generally lower elevations everywhere else.
Figure 8. Annually resolved maps of the last Alpine-wide synchronized larch budmoth (LBM) outbreak event during 1981–84. Thin black triangles show the existing site chronologies per year, and colours refer to the reconstructed outbreak intensity ranging from heavy (purple) to low (grey). The six different colours are based on the six detection methods (i–vi), as detailed in Fig. 4. The corresponding numbers at the bottom right of the figures summarize each year's data availability and outbreak intensity, with the bottom graphs describing these numbers (outbreak sum per intensity level and cumulative percentage of the intensity levels) over time. Annual maps for 1700–2000 are provided in Fig. S1 (see Supporting Information).
Figure 9. Moving correlation analysis computed between: (a) 70 larch budmoth (LBM) outbreak reconstructions of the site-level (full line), (b) between five LBM outbreak reconstructions of the geographical sub-regions (black broken line), and (c) between six LBM outbreak reconstructions of different intensity levels (grey broken line). Correlations are computed over 30-yr windows and lagged by 15 yr along the individual site records (a), and continuously shifted by 1 yr along the time-series of the geographical sub-regions (b), and those of the different intensity levels (c). Site replication over the full 1700–2000 period is indicated by the bottom solid grey line.
Conclusions and perspective
We compiled TRW and MXD measurement series from 70 larch host and 73 spruce nonhost sites distributed across the European Alps and Tatra Mountains, spanning elevations from 500 to 2300 m asl. This unique network integrates living trees from six countries and historical wood from the Swiss Alps. The sample size is sufficiently robust to allow spatial patterns to be analysed for more than three centuries.
Six LBM outbreak detection methods were applied to distinguish negative growth depressions caused by insect defoliation from depressions caused by climatic, ecological or other biotic factors. The first three methods were performed on the basis of the individual measurement series, whereas the remaining methods were applied on the mean site chronologies. These include the calculation of residuals between host (larch) and nonhost (spruce and climate) surrogates, and running minima analysis. The outbreak events obtained were classified into six intensity levels. Comparison with seasonal temperature means was performed to evaluate potential climatic influences on both the detection methods applied and reconstructed LBM outbreaks. For the first time, annual maps of reconstructed LBM outbreak intensity were developed for the Alpine arc dating back to ad 1700.
These reconstructions indicated the existence of synchronized 20th century outbreak pulses at c. 1936, 1945, 1954, 1963, 1972 and 1981, with generally more spatial heterogeneity being characteristic before and after this period. Robust 8-yr cycles were detected from ∼1740 to 1820, at ∼1850 and again from ∼1930 to 1980. Wavelet spectra based on site-level reconstructed outbreak series indicate distinct periodicity at 8–9 yr for sites in the Western Alps between 1750 and 1900 m asl. The combination of six methods sufficiently distinguished between annual growth depressions caused by LBM and those caused by other disturbance factors, such as climate anomalies. Seasonally resolved Alpine temperature variations indicated long-term warming during winter, spring and autumn, but stationary summer temperatures, over the past three centuries. Unprecedented warming in all seasons characterized the post-1980 period. Distinct outbreak periodicity is mapped for the first decades of the 18th century, the transition from the 18th to the 19th century and again from the mid-20th century until the early 1980s. Less intense and/or spatially less synchronized outbreaks appeared during the mid-18th century, the transition from the 19th to the 20th century and during the recent warming. Local persistence, but spatial heterogeneity, in cyclic population dynamics is characteristic of the Alpine arc during the past three centuries. However, more data and revised methodologies are necessary to improve estimates of long-term outbreak dynamics.
Based on these results, we suggest that future research on the LBM system should investigate: (1) altitudinal dependence in outbreak intensity, timing, phase angle and climate forcing; (2) relationships between tree growth and needle length as the main food supply, with particular emphasis on the effects of host quality on cycle occurrence and spatial dispersal; (3) parameter-specific (i.e. TRW vs MXD) defoliation responses at sites of differing location and ecology; (4) outbreak effects on tree-ring isotopes; and (5) improved separation between insect- and climate-induced fingerprints via model simulations. Increasing awareness and collaboration between biologists, ecologists and climatologists will improve our understanding of the responses of population dynamics and forest communities to climate variability and change. Moreover, mutual attention in assessing the linkage between (internal) ecosystem responses and (external) climate forcing provides an impetus for launching interdisciplinary research.