Human activity and anomalously warm seasons in Europe


Abstract

Seasonal mean temperatures averaged over the European region have warmed at a rate of 0.35–0.52 K/decade since 1980. The last decade has seen record-breaking seasonal temperatures in Europe including the summer of 2003 and the spring, autumn, and winter of 2007. Previous studies have established that European summer warming since the early twentieth century can be attributed to the effects of human influence. The attribution analysis described here employs temperature data from observations and experiments with two climate models and uses optimal fingerprinting to partition the climate response between its anthropogenic and natural components. These responses are subsequently combined with estimates of unforced climate variability to construct distributions of the annual values of seasonal mean temperatures with and without the effect of human activity. We find that in all seasons, anthropogenic forcings have shifted the temperature distributions towards higher values. We compute the associated change in the likelihood of having seasons whose temperatures exceed a pre-specified threshold. We first set the threshold equal to the seasonal temperature observed in a particular year to assess the effect of anthropogenic influences in past seasons. We find that in the last decade (1999–2008) it is extremely likely (probability greater than 95%) that the probability has more than doubled under the influence of human activity in spring and autumn, while for summer it is extremely likely that the probability has at least quadrupled. One of the two models employed in the analysis indicates it is extremely likely the probability has more than doubled in winter too. We also compute the change in probability over a range of temperature thresholds which enables us to provide updates on the likely change in probability attributable to human influence as soon as observations become available. Such near-real time information could be very useful for adaptation planning. © Crown Copyright 2010. Published by John Wiley & Sons, Ltd.

1. Introduction

Observational evidence suggests that Europe has experienced some unusually warm seasons in recent years. The spring of 2007 was the warmest on record since 1880 (Rutishauser et al., 2008), the autumn and winter of 2007 were the warmest in more than 500 years and possibly also the warmest in the last millennium (Luterbacher et al., 2007), while in 2003, the mean summer temperature soared more than 5 standard deviations above the 1961–1990 mean (Schär et al., 2004; Beniston 2004). An observed rise in seasonal European extreme temperatures has also been reported (Scherrer et al., 2005; Xoplaki et al., 2005; Moberg et al., 2003; Luterbacher et al., 2007). Special emphasis has been placed on summer heat waves after the catastrophic event of 2003 which cost the lives of tens of thousands of people (Robine et al., 2008 estimate more than 70 000 excess deaths). Della-Marta et al. (2007) suggest that heat waves in Europe and the western Mediterranean have increased in length by a factor of two since 1880, while the frequency of hot days has increased in the same period by a factor of three. Kuglitsch et al. (2010) recently reported an increase by at least a factor of 6 in the mean heat wave intensity, heat wave length, and heat wave number across the eastern Mediterranean over the past 5 decades.

There is a host of socio-economic and environmental impacts associated with the warming of the European climate. One example is the lengthening of the growing season (Linderholm, 2006) which influences not only agriculture (Viner et al., 2006), but also plant phenology (Sparks and Menzel, 2002; Menzel et al., 2006) and animal phenology (Walther et al., 2002). There is also growing concern about a potential increase in the frequency of heat waves, because of the stress that such events exert on human health (Kovats and Jendritzky, 2006). An examination of the impacts shows that adaptation challenges may be different across the continent. For example, warmer winters in central Europe have adversely affected alpine tourism (e.g. Elsasser and Burki, 2002), while warmer summers in the Mediterranean region have led to droughts and water shortages (Iglesias et al., 2007). Although the majority of the impacts associated with a warmer climate in Europe are adverse, there are also some positive outcomes, including reduced winter mortality (Keatinge et al., 2000) and improved conditions for tourism in northern and western Europe (Hanson et al., 2006). Alcamo et al. (2007) provide a detailed review of the impacts that have emerged in Europe as a result of the recent warming together with model projections of future changes.

Detection and attribution can help investigate why Europe has been experiencing significantly warmer seasons in recent decades. The observed changes stem from the combined effect of internal climatic fluctuations and natural and anthropogenic external forcings. The 4th Assessment Report (AR4) of the Intergovernmental Panel on Climate Change (IPCC; Hegerl et al., 2007) identified human influences as the main driver of the global temperature change, while subsequent research has advanced the understanding of regional aspects of anthropogenic changes in extremes (Stott et al., 2010). So far, the anthropogenic fingerprint has been detected and found to be the primary component of the recent warming in all the continents (Stott, 2003; Zwiers and Zhang, 2003; Gillett et al., 2008). The effect of anthropogenic forcings has also been detected in certain types of daily extremes on a quasi-global scale (Christidis et al., 2005), while a two-step attribution analysis showed that anthropogenic forcings have at least doubled the likelihood of a summer season in Europe at least as hot as the one in 2003 (Stott et al., 2004). Formal attribution methodologies have recently been applied in two studies of climate change impacts in Europe. The first detected a significant human contribution to the lengthening of the growing season (Christidis et al., 2007) and the second suggested that adaptation plays a key role in temperature-related mortality decreases in England and Wales (Christidis et al., 2010a).

The attribution studies referenced above have so far considered only changes in the summer mean temperature or in longer time means. This study aims to attribute observed changes in seasonal European temperatures since the early twentieth century to possible causes and estimate how human influences may have altered the frequency of anomalously warm seasons. Here we consider temperature changes not only in summer but in each season. We include more recent years in our analysis than previous investigations (up to 2008) and employ optimal fingerprinting to investigate possible causes for the changes. We then look at seasonal extremes and, following the approach of Stott et al. (2004), quantify the change in risk of seasonal temperatures exceeding pre-specified thresholds that can be attributed to human activity. These attributable risks are pre-computed over a range of possible temperature values and therefore our results can be used to assess the impact of anthropogenic forcings on the seasonal temperatures in ‘real time’ as soon as a new observation becomes available.

The remainder of the paper is structured as follows. Section 2 gives information on the data used in this work and then describes the methodology. Results are presented in Section 3 and the main findings are discussed in Section 4.

2. Data and methodology

In our analysis we use observations of the seasonal European land 1.5 m temperatures from the CRUTEM3 gridded dataset (Brohan et al., 2006) updated to year 2008 together with temperature data from coupled (Atmosphere-Ocean) General Circulation Model (GCM) experiments. The model dependency of the results was assessed by using two different GCMs: the Hadley Centre HadGEM1 model (Martin et al., 2006) and version 3.2 of the MIROC coupled GCM (K-1 Model Developers, 2004). These two models are chosen for our analysis as they have ensembles of runs with different external forcings extended to year 2008. HadGEM1 has an equilibrium climate sensitivity of 4.4 K (Stott et al., 2006a) and MIROC of 4.0 K (Yokohata et al., 2005). The models used in the AR4 have climate sensitivity values between 2 and 4.5 K with a best estimate of about 3 K, which means that both the models we employ here have relatively high sensitivities. It should be noted, however, that the equilibrium sensitivity value of HadGEM1 reported in the AR4 was calculated with a slab-ocean model and its transient climate response of 1.9 K is well within the range of the other models (1.2–2.6 K). Whether the use of high-sensitivity models is likely to introduce a bias into our analysis will be discussed later in the paper.

Two types of experiments are considered. The first simulates the effect of anthropogenic and natural forcings that have been acting on the climate since pre-industrial times (ALL). The second considers anthropogenic forcings only (ANTHRO). Details of the experiments can be found in Johns et al. (2006), Stott et al. (2006a), and Nozawa et al. (2005). Each HadGEM1 experiment comprises four ensemble simulations and each MIROC experiment ten simulations. The ensemble runs start from well-separated points of long control simulations without any external forcings. The climate response to the forcings considered in each experiment is represented by the ensemble mean. The effect of the natural forcings (NAT), i.e. solar variations and volcanic eruptions, can be inferred from the difference between the ALL and ANTHRO responses. Internal climate variability is estimated using control simulations. In total we use 900 years of the HadGEM1 control run and 3600 years of the MIROC run, and we also complement these with 2500 years of the control simulation with the HadCM3 model (Johns et al., 2002) to get a wider basis for the variability estimate.

The analysis covers land areas over the European continent (30°W–50°E, 30°N–80°N) extracted from the observed and model temperature fields. The model data are masked by the observations to ensure all temperature fields have the same coverage. Changes in the seasonal temperatures are examined over the period 1904–2008. We take 1904 (and not 1900) as the starting year, as in the attribution analysis we divide the period into 5-year segments. To incorporate changes in the most recent years we extended the ALL and ANTHRO runs to 2008. The anthropogenic emissions after year 2000 come from the SRES A1B scenario (Nakicenovic and Swart, 2000). The volcanic aerosols after year 2000 in the ALL experiments follow an exponential decay. HadGEM1 simulates an average 11-year solar cycle post-2000, while MIROC uses satellite measurements of the solar irradiance (http://www.pmodwrc.ch/).
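To make this pre-processing concrete, the sketch below illustrates masking a model temperature field with the observational coverage, area-averaging over the European domain with cosine-latitude weights, forming anomalies relative to 1901–1930, and grouping the result into 5-year means. It is a minimal illustration using synthetic numpy arrays rather than the CRUTEM3 or GCM data; all array shapes, variable names, and the weighting choice are assumptions for the example.

```python
import numpy as np

def european_mean_series(field, obs_mask, lats):
    """Area-weighted European mean of a (year, lat, lon) field,
    masking grid boxes that are missing in the observations."""
    weights = np.cos(np.deg2rad(lats))[None, :, None] * np.ones_like(field)
    masked = np.where(obs_mask, field, np.nan)
    weights = np.where(obs_mask, weights, np.nan)
    return np.nansum(masked * weights, axis=(1, 2)) / np.nansum(weights, axis=(1, 2))

def anomalies(series, years, base=(1901, 1930)):
    """Anomalies relative to the mean of the base period."""
    sel = (years >= base[0]) & (years <= base[1])
    return series - series[sel].mean()

def five_year_means(series, years, start=1904, end=2008):
    """Non-overlapping 5-year means over start..end (21 values for 1904-2008)."""
    sel = (years >= start) & (years <= end)
    return series[sel].reshape(-1, 5).mean(axis=1)

# Illustrative synthetic example (1900-2008, coarse European grid)
years = np.arange(1900, 2009)
lats = np.linspace(30, 80, 11)           # 30N-80N
lons = np.linspace(-30, 50, 17)          # 30W-50E
rng = np.random.default_rng(0)
model_field = (rng.normal(size=(years.size, lats.size, lons.size))
               + 0.01 * (years - 1900)[:, None, None])    # weak warming trend
obs_mask = rng.random(model_field.shape) > 0.2             # fake missing-data mask

series = anomalies(european_mean_series(model_field, obs_mask, lats), years)
x_all = five_year_means(series, years)
print(x_all.shape)   # (21,)
```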

Figure 1 shows time series of the observed seasonal mean temperatures over Europe together with the ALL and NAT components of the climate response to external forcings from the two GCMs. In this work, temperatures are represented by anomalies relative to the 1901–1930 mean. In that way, we reference the temperatures relative to the earliest part of the twentieth century, when anthropogenic influences on the climate were smaller than during the more commonly used base period of 1961–1990. We also make use of a common annotation to reference the seasons using the initial letters of the months they include (DJF, MAM, JJA, and SON). The time series in Figure 1 are smoothed for clarity using 5-year running means, while unsmoothed time series are also plotted for the observations in order to illustrate the effect of year-to-year variations. The observed winter temperatures in the time series without any smoothing applied display more variability than other seasons, consistent with the stronger westerly North Atlantic influence in winter months (e.g. Hurrell et al., 2003 and references therein). In agreement with the observations, the ALL time series show a marked post-1980s warming in all seasons, which is weaker or not evident in the NAT. The MIROC response to ALL forcings, however, shows a more moderate warming in recent decades than both HadGEM1 and the observations. The weaker warming in MIROC relative to HadGEM1 is consistent with the lower climate sensitivity of this model. The observed rate of warming since 1980 is 0.49, 0.46, 0.52, and 0.35 K/decade for DJF, MAM, JJA, and SON, respectively. Using thirty independent 29-year-long segments from the control experiment, we find that the standard deviation of the trend in an unforced climate is 0.18, 0.11, 0.05, and 0.10 K/decade for DJF, MAM, JJA, and SON, respectively. The observed trends in the seasonal temperature are therefore more than 2.5 times the standard deviation (more than 10 times for JJA). The simulated ALL time series in Figure 1 also seem to suggest that the recent warming does not emerge gradually above internal variability, but is characterised by a sharp increase in the temperatures in the 1990s. This could be indicative of the stronger cooling effect of aerosols in Europe in earlier decades compared to less industrialised regions or the global mean, where the warming (not shown here) is expected to intensify in a more gradual manner.
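As a minimal illustration of the trend comparison just described, the sketch below computes a least-squares trend (in K/decade) for a post-1980 series and compares it with the spread of trends across 29-year segments of an unforced control run. The data are synthetic stand-ins; the real calculation uses the CRUTEM3 seasonal means and the model control simulations.

```python
import numpy as np

def decadal_trend(temps, years):
    """Least-squares linear trend in K/decade."""
    slope = np.polyfit(years, temps, 1)[0]
    return 10.0 * slope

# Synthetic stand-in for the observed seasonal-mean series over 1980-2008
rng = np.random.default_rng(1)
years = np.arange(1980, 2009)
obs = 0.045 * (years - 1980) + rng.normal(scale=0.3, size=years.size)
obs_trend = decadal_trend(obs, years)

# Spread of trends in an unforced climate: thirty independent
# 29-year segments drawn from a (synthetic) control run
control = rng.normal(scale=0.3, size=30 * 29).reshape(30, 29)
control_trends = np.array([decadal_trend(seg, years) for seg in control])
sd_unforced = control_trends.std(ddof=1)

print(f"observed trend: {obs_trend:.2f} K/decade")
print(f"signal-to-noise: {obs_trend / sd_unforced:.1f}")
```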

Figure 1.

Time series of the seasonal mean temperature anomalies relative to 1901–1930 in Europe. Each panel corresponds to the season noted in the header. Time series are shown for the observations (black thin lines) and for the response to all external (red) and natural only (blue) forcings computed from the ensemble mean of HadGEM1 (solid lines) and MIROC (dashed lines) experiments. The ALL and NAT time series (coloured lines) are smoothed using 5-year running means. Smoothed time series are also shown for the observations (grey thick lines)

The first part of the analysis employs optimal fingerprinting to attribute the changes in the seasonal temperatures during period 1904–2008 to possible causes. The method was adapted for climate science by Hasselmann (1979) and the number of climate-change studies on fingerprint detection started to grow in the 1990s (Santer et al., 1993; Hegerl et al., 1996; Hegerl et al., 1997). Optimal fingerprinting has gained so much popularity that it largely underpins the IPCC attribution statements linking climate change to human activity (Hegerl et al., 2007). For a detailed description of the method, the reader can refer to Allen and Tett (1999) and Allen and Stott (2003). Here we carry out separate analyses for each model, using the fingerprints of the ALL and ANTHRO forcings, represented by the ensemble mean of the corresponding model experiments (xALL and xANTHRO), and scale them to the observations (y). The vectors xALL, xANTHRO and y in our analysis are time series of the 5-year mean seasonal temperature anomalies averaged over Europe during 1904–2008, i.e. they consist of 21 values of the 5-year mean temperatures. We therefore construct the fingerprints as vectors in time only, similar to the study of Stott et al. (2004). The model fields are masked with the observations to ensure the same coverage is used before they are averaged over Europe. We also decompose the response into its anthropogenic (xANTHRO) and natural (xNAT) components, assuming that xALL is the linear combination of the two:

y = β1 (xALL − u1) + β2 (xANTHRO − u2) + u0        (1)

The scaling factors for the anthropogenic and natural components are βANTHRO = β1 + β2 and βNAT = β1, respectively. The noise term u0 represents internal climate variability and its variance-covariance matrix is derived from control simulations as follows: we extract several 105-year-long segments from the control simulations of the models used in this study, mask the annual fields of the seasonal mean temperatures with the observations to ensure they have the same coverage, group each segment into 5-year means, and compute the average over the European area, which gives us time series of 21 consecutive 5-year mean values of the seasonal mean temperature in a climate without external forcings for each segment. The time series from all control segments provide the variance-covariance matrix of u0. For the HadGEM1 analysis, we use control segments from HadGEM1 and HadCM3 to determine u0, whereas for the MIROC analysis we use control segments from MIROC and HadCM3. A standard consistency test (Allen and Tett, 1999) is used to assess whether the residual variability, u0, is consistent with the variability estimated from the control simulations. The noise terms u1 and u2 represent sampling noise due to the limited size of the ensembles that give the model fingerprints. Assuming that the dominant source of sampling noise is internal variability, the covariance structure of the sampling noise is calculated by scaling the variance-covariance matrix of u0 (Allen and Stott, 2003). This scaling is determined by the size of the ensemble, and in the case of a single model run, the sampling noise would be the same as u0. The analysis is carried out in the space defined by the leading Empirical Orthogonal Functions (EOFs) of the control variability. As in previous applications, we choose the EOF truncation to be high enough to give a good representation of the variability, while making sure that the residual test passes successfully and the scaling factors are stable in the spectral vicinity of the chosen truncation, i.e. they vary little if we choose another truncation within ± 2 EOFs. We typically retain about 14 EOFs, which in all cases (i.e. for all seasons and models) explain more than 80% of the variability in the control climate. The residual variability lies between the 16th and 54th percentiles of the range of internal variability as estimated from the control.
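The sketch below illustrates the structure of the regression in Equation (1) under simplifying assumptions: control segments are used to build a covariance matrix, the analysis is projected onto the leading EOFs and prewhitened, and the scaling factors are then estimated by ordinary least squares. The paper's algorithm uses total least squares (Allen and Stott, 2003), which additionally accounts for the sampling noise u1 and u2 in the fingerprints, so this is a simplified sketch; the data and truncation here are synthetic and illustrative only.

```python
import numpy as np

def eof_prewhitener(control_segments, n_eofs):
    """Projection onto the leading EOFs of control variability, scaled so
    that the retained directions have unit variance (prewhitening)."""
    C = np.cov(control_segments, rowvar=False)        # (21, 21) covariance
    evals, evecs = np.linalg.eigh(C)
    order = np.argsort(evals)[::-1][:n_eofs]          # leading EOFs
    return evecs[:, order] / np.sqrt(evals[order])    # (21, n_eofs)

def scaling_factors(y, x_all, x_anthro, P):
    """OLS estimate of (beta1, beta2) in y = b1*x_ALL + b2*x_ANTHRO + noise,
    after prewhitening with P.  beta_NAT = b1, beta_ANTHRO = b1 + b2."""
    Y = P.T @ y
    X = np.column_stack([P.T @ x_all, P.T @ x_anthro])
    b1, b2 = np.linalg.lstsq(X, Y, rcond=None)[0]
    return {"beta_NAT": b1, "beta_ANTHRO": b1 + b2}

# Synthetic illustration: 21 five-year means, fake fingerprints and control
rng = np.random.default_rng(2)
t = np.linspace(0, 1, 21)
x_anthro = 1.2 * t**2                    # gradually emerging anthropogenic signal
x_nat = 0.2 * np.sin(6 * t)              # small natural signal
x_all = x_anthro + x_nat
control = rng.normal(scale=0.15, size=(200, 21))   # 200 control segments
y = x_all + rng.normal(scale=0.15, size=21)        # pseudo-observations

P = eof_prewhitener(control, n_eofs=14)
print(scaling_factors(y, x_all, x_anthro, P))
```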

Optimal fingerprinting outputs estimates of the scaling factors, βANTHRO and βNAT, which measure the level of consistency between the modelled and the observed response and indicate a detectable signal when their 5–95% uncertainty range does not include zero. The uncertainty range is determined with an F-test using an estimate of the variance in the scaling factors based again on segments of the control simulation (Allen and Tett, 1999). We adopt the common practice of using different control segments for the estimation of the variance-covariance matrix of u0 and the uncertainty in the scaling factors. The scaling factors can then be applied to noise-free estimates of the corresponding fingerprints (5-year mean values of the seasonal mean European temperature during 1904–2008) to give the components of the response attributed to the anthropogenic and natural forcings. Given the uncertainty in the scaling factors, the result is a set of distributions rather than single estimates of the attributable components. The scaling of the model fingerprints is a statistical way of introducing observational constraints in order to construct optimised response patterns, which are, in principle, more accurate than the raw model output. This scaling is also expected to reconcile differences between models with different climate sensitivities. For example, models that simulate more anthropogenic warming than the observations will have their anthropogenic fingerprint scaled down and vice versa. Therefore, even in the case when two models with relatively high climate sensitivities are employed, the results of the analysis are not expected to be biased.

In the second part of the analysis we construct the distributions of the seasonal mean European temperatures for a climate with and without the influence of anthropogenic forcings. The aim is to construct the probability density function (PDF) that describes the range of seasonal mean European temperature for each season and each decade during the reference period expected in a climate forced with all external forcings, and also in a climate forced with natural forcings only. This enables a comparison between the ‘actual’ world and a ‘natural’ world where the impact of human activity is absent. The two distributions can then be used to infer the fraction of attributable risk (FAR, Allen, 2003; Stott et al., 2004; Stone and Allen, 2005; Christidis et al., 2010b). The FAR measures the change in the likelihood that the temperature exceeds a certain threshold in the actual world relative to the natural world. If P0 and P1 denote the probability of exceeding the threshold without and with the effect of anthropogenic forcings, then FAR is computed as:

FAR = 1 − P0/P1        (2)

When considering seasonal extreme events (i.e. cases where the threshold lies in the tail of the distribution), the probabilities in Equation (2) need to be computed using extreme value theory. Here we construct the actual and natural world seasonal temperature distributions and also distributions of the FAR following the methodology of Stott et al. (2004). The methodology is summarised in the five steps listed below:

  1) An optimal fingerprinting analysis is carried out for each model separately, combining observations and model fingerprints from the ALL and ANTHRO experiments constructed, as already mentioned, as time series of 5-year mean seasonal temperature anomalies averaged over Europe during 1904–2008. The analysis separates the anthropogenic and natural components of the response (Equation (1)). It should be noted that although we will later focus on decadal changes, we carry out the attribution analysis on 5-year rather than decadal means. The higher temporal resolution can potentially help the regression discriminate between forcings and provide more robust attribution results.
  2) Using the scaling factors from step 1, the ANTHRO and NAT components of the response (given by the time series of the 5-year mean seasonal temperatures averaged over Europe) are estimated by applying the factors to the simulated fingerprints to make them best match the observations. This scaling also takes into account the effect of sampling noise, as our algorithm employs total least squares regression (for details, see Allen and Stott, 2003). Our optimal fingerprinting algorithm provides 10⁵ possible values of the scaling factor for each fingerprint to account for the uncertainty. These are generated by sub-sampling each percentile of the distribution at 1000 equally spaced values, giving 10⁵ realisations of the scaled time series. The average of two consecutive points in the time series computed for each component (ANTHRO and NAT) measures the model response in a particular decade; for example, the mean of the last two points gives the response in decade 1999–2008. Therefore, if we consider changes in 1999–2008, we end up with 10⁵ values of the mean seasonal temperature for this decade over Europe, for each season, and for each component. The convolution of the two components, which is done simply by adding each of the 10⁵ values of the ANTHRO response to the corresponding NAT ones, provides 10⁵ values for the ALL response. These large sets of values for the ALL and NAT responses can be used to construct the distributions of the response components with and without the effect of anthropogenic forcings.
  3) Using annual values of the European seasonal mean temperatures from the control simulations, we compute the distributions of the seasonal temperatures in an unforced climate, which represent the effect of internal climate variability. We use 900 years from HadGEM1, 1100 years from MIROC, and 1100 years from HadCM3, i.e. 3100 annual values for each season in total. The spatial fields of the seasonal mean temperature for each year from the control are masked to have the same coverage as randomly selected years during 1904–2008 in the observations. Anomalies are computed by splitting the control simulations into centuries and subtracting the mean of the first 30 years from each year. The spatial fields are then averaged over Europe to provide the annual values of the seasonal mean temperatures. The resulting 3100 values for each season provide the distribution of the seasonal temperature without the effect of external forcings.
  4) The convolution of the ALL and NAT distributions of the decadal response from step 2 with the distribution of the year-to-year unforced variability from step 3 gives the distributions of the seasonal temperature in an actual and natural world that describe the response of individual years in the decade under investigation. The convolution of the forced (ALL or NAT) response with the unforced response is done by simply adding each of the 10⁵ values of the seasonal mean temperature (step 2) to each of the 3100 values from the control (step 3). The resulting 3.1 × 10⁸ values provide the distributions for the annual values of the seasonal mean temperature for each season, with and without the effect of anthropogenic forcings (a minimal sketch of this convolution is given after this list).
  5) We use the distributions of the annual values of the seasonal mean temperatures to calculate the FAR (Equation (2)). The FAR and its uncertainty are computed in a way similar to step 4 as follows: we add each of the 10⁵ values of the ALL and NAT attributable change from step 2 to the 3100 values of the unforced change from step 3. This leads to 10⁵ pairs of distributions for the actual and natural world, from which 10⁵ estimates of the FAR are computed. These estimates are then used to construct the PDF of the FAR.
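The following sketch illustrates the convolution in steps 2–5 with synthetic stand-ins for the scaled responses and the control values: realisations of the decadal ANTHRO and NAT responses are added to annual unforced anomalies to build actual- and natural-world samples, and an empirical exceedance ratio then gives a FAR estimate for one threshold. The sample sizes and distributions are assumptions for the example, and the empirical probabilities stand in for the normal/GPD fits described below.

```python
import numpy as np

rng = np.random.default_rng(3)

# Step 2 (stand-in): realisations of the decadal-mean ANTHRO and NAT responses (K),
# drawn from illustrative distributions rather than the real scaled fingerprints
n_real = 1000                                     # the paper uses 10^5 realisations
anthro = rng.normal(loc=1.0, scale=0.2, size=n_real)
nat = rng.normal(loc=0.1, scale=0.1, size=n_real)
all_resp = anthro + nat                           # ALL = ANTHRO + NAT

# Step 3 (stand-in): annual unforced variability from control simulations
unforced = rng.normal(loc=0.0, scale=0.6, size=3100)

# Step 4: convolve the forced responses with the unforced variability
actual = (all_resp[:, None] + unforced[None, :]).ravel()   # "actual" world
natural = (nat[:, None] + unforced[None, :]).ravel()       # "natural" world

# Step 5 (simplified): empirical exceedance probabilities and the FAR for one
# threshold; the paper fits a normal distribution or a GPD to estimate P0 and P1
threshold = 1.5
p1 = (actual > threshold).mean()
p0 = (natural > threshold).mean()
print(f"FAR = {1.0 - p0 / p1:.2f}")
```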

As in Stott et al. (2004), the probabilities in Equation (2) are computed by approximating the temperature distribution by a normal distribution for thresholds within the bulk of the distribution and by a generalised Pareto distribution (GPD) for thresholds located in the tails. The switch from the normal distribution to the GPD occurs when the threshold lies below the 5th or above the 95th percentile of the distribution.
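A sketch of this threshold-dependent choice of distribution is given below for the upper (warm) tail: a normal fit supplies the exceedance probability for moderate thresholds, while beyond the 95th percentile a GPD is fitted to the excesses over a high level (a peaks-over-threshold choice assumed here, as the fitting details are not spelled out in the text). The sample inputs are synthetic.

```python
import numpy as np
from scipy import stats

def exceedance_prob(sample, threshold, tail_switch=0.95, pot_quantile=0.90):
    """P(T > threshold) estimated from a sample of seasonal temperatures.
    A normal fit is used for moderate thresholds; beyond the tail_switch
    quantile a generalised Pareto distribution is fitted to the excesses
    over the pot_quantile level (peaks-over-threshold)."""
    mu, sigma = sample.mean(), sample.std(ddof=1)
    if threshold <= np.quantile(sample, tail_switch):
        return stats.norm.sf(threshold, loc=mu, scale=sigma)
    u = np.quantile(sample, pot_quantile)
    excesses = sample[sample > u] - u
    c, _, scale = stats.genpareto.fit(excesses, floc=0.0)
    rate = (sample > u).mean()                 # P(T > u)
    return rate * stats.genpareto.sf(threshold - u, c, scale=scale)

def far(actual, natural, threshold):
    """Fraction of attributable risk, Equation (2)."""
    p1 = exceedance_prob(actual, threshold)
    p0 = exceedance_prob(natural, threshold)
    return 1.0 - p0 / p1

# Illustrative use with synthetic actual/natural-world samples
rng = np.random.default_rng(4)
actual = rng.normal(1.1, 0.6, size=5000)
natural = rng.normal(0.1, 0.6, size=5000)
print(f"FAR at +2 K: {far(actual, natural, 2.0):.2f}")
```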

3. Results

We first present results from the optimal fingerprinting analyses which provide the scaling factors for the ANTHRO and NAT components of the temperature response for each model and each season individually. As already mentioned, the observational vector and the fingerprints used in the analyses are time series of 5-year mean values of the seasonal temperatures averaged over Europe during 1904–2008. The scaling factors and their 5–95% uncertainty range are illustrated in Figure 2. The uncertainty associated with the NAT scaling factors is larger as the signal is weaker. The ANTHRO signal is detectable in all seasons with both models, but the weaker NAT signal is only detected in summer temperatures with HadGEM1, and in autumn temperatures with both models. The influence of the natural forcings on summer temperatures was not detected in Stott et al. (2004), who employed HadCM3, considered only the twentieth-century response, and examined only the greater Mediterranean region. Repeating the present analysis for 1900–1999 with HadGEM1, however, shows that the different analysis period does not prevent the detection of the NAT signal. Consistent with our study, Jones et al. (2008) detected both the ANTHRO and NAT effect on summer temperatures in the Mediterranean region and northern Europe for an analysis with HadGEM1 over period 1907–2006. Therefore, the detection of the natural influences with HadGEM1 in JJA appears to be insensitive to small changes in the analysis period, but natural influences are not detected with MIROC. Although the MIROC scaling factors indicate detection of both ANTHRO and NAT influences on the SON temperatures, the detection of the natural signal is not robust. First, the uncertainty in the NAT scaling factor is very large (Figure 2) and, second, unlike the rest of the cases, the SON results we get with MIROC are sensitive to the exact number of EOFs used to represent the variability space.

Figure 2.

Scaling factors and their 5–95% uncertainty range from optimal fingerprinting analyses that decompose the response into its anthropogenic (solid bars) and natural components (dashed bars). Each panel corresponds to the season noted in the header and illustrates scaling factors computed with HadGEM1 (left half) and MIROC (right half). Scaling factors that are not consistent with zero indicate a signal detectable in the observations

The scaling factors from optimal fingerprinting are next applied to the ANTHRO and NAT fingerprints and the resulting components of the climate response to external forcings are illustrated together with the observations in Figure 3. An estimate of the unforced variability in the 5-year mean seasonal temperatures is also shown in Figure 3 to enable a simple signal-to-noise comparison. This variability is represented by the 5th and 95th percentiles of a normal distribution constructed from a large set of 5-year mean temperature values from the control simulations. The components of the response are well separated in most cases in recent decades, and the anthropogenic component rises above the 95th percentile of the internal variability. The large scaling applied to the natural component of MIROC for autumn (Figure 2) seems to force a good agreement with the observed warm autumns during 1920–1970, although the NAT detection, with significant scaling factors greater than 1 (unlike for the other seasons), could be an artefact of the statistical analysis rather than evidence that natural forcings are only significant in SON.

Figure 3.

Time series of the observed 5-year mean temperature anomalies relative to 1901–1930 (black lines) plotted together with the best estimate of the anthropogenic (red lines) and natural (blue lines) components of the climate response from optimal fingerprinting analyses. The shaded areas around the two components mark the 5–95% uncertainty range. Each panel corresponds to the season and model noted in the header. The horizontal dotted lines mark the 5th and 95th percentiles of the distribution of 5-year mean seasonal temperatures in a climate without the influence of any external forcings estimated from control model simulations. This figure is available in colour online at wileyonlinelibrary.com/journal/joc

We next construct distributions of the seasonal temperature with and without the influence of the anthropogenic forcings. These come from the convolution of the underpinning response to the ALL and NAT forcings in the decade for which the distributions are constructed and the unforced variability in the annual values of the seasonal temperatures. The former is computed from the combination of two consecutive points in the time series shown in Figure 3 (ALL is the convolution of the ANTHRO and NAT components) and the latter is estimated from control simulations as explained in step 3 of the methodology described in Section 2. The PDFs of the annual values of the seasonal temperatures in the last decade of the reference period (1999–2008) are shown in Figure 4. Our methodology optimises the model response by introducing observational constraints and, therefore, the plotted PDFs are deemed to be preferable to the distributions that the raw model output would give. The distributions we obtain based on the two models are very similar. The winter PDFs are the broadest because of the effect of internal variability which is most prominent in that season (Figure 1). The summer PDFs have the least overlap, indicating that the impact of human activity is more evident in this season. Nevertheless, in all cases the ALL distribution is shifted towards higher temperatures, an illustration of the warming effect of anthropogenic forcings on European seasons in recent years.

Figure 4.

Distributions of the annual values of the seasonal temperature anomalies during period 1999–2008 in Europe. The anomalies are relative to the 1901–1930 mean value. Each panel corresponds to the season noted in the header. The black lines represent the PDFs under the influence of all external forcings and the grey lines the PDFs under the influence of natural forcings only. Solid lines (filled distributions) correspond to HadGEM1 and dashed lines to MIROC. The y-axis shows the normalised likelihood. The ALL curve is used to estimate the probability P1 and the NAT curve the probability P0 in the definition of the FAR (Equation 2)

The distributions of the annual values of the seasonal temperatures become the basis for the calculation of the distribution of the FAR (step 5 of the methodology in Section 2). Recall that the FAR measures the change in the likelihood of exceeding a threshold as a result of human influences on the climate. First, we calculate 105 distributions of the FAR by setting the threshold to the observed seasonal temperature for each year in period 1904–2008 and using the PDFs of the temperature in the actual and natural world that correspond to the decade in which the year lies. The resulting time series of the FAR are plotted in Figure 5. Positive values suggest that seasonal temperatures above the ones observed are more likely under the influence of anthropogenic forcings. For example, a value of 0.5 means that the likelihood of exceeding the observed seasonal temperature has doubled under the effect of human influences. The FAR has markedly increased since the 1980s in all seasons and has now reached values close to unity, which means that seasonal temperatures as warm as those recently observed would be very unlikely without the influence of anthropogenic forcings. Although the two models are in good agreement after the 1980s, there are large discrepancies in earlier periods, and years with a negative FAR are also seen, indicating that natural forcings may have exerted a significant influence on seasonal temperatures in the past. However, the signal in earlier periods is small and generally within the range of internal variability (Figure 3). The sign of the FAR is determined by the relative strength of the attributable components which, however, are generally small in earlier years. For example, HadGEM1 produces a greater natural than anthropogenic response in JJA during 1920–1940, which is not the case for MIROC (Figure 3) and, hence, the two models yield FAR values of the opposite sign (Figure 5). We therefore conclude that more reliable and less model-dependent estimates of the FAR are obtained when there is good separation between the temperature distributions of the actual and natural world as well as a significant signal, as is the case in the last few decades. Figure 5 illustrates that the confidence in the estimated FAR increases as the dominance of internal variability in the climate response decreases. We cannot expect models to accurately measure the anthropogenic contribution to the observed temperatures in periods when the signal is largely obscured by natural variability. We can, however, expect them to provide useful indications in recent periods when the signal-to-noise ratio increases and the models provide a more consistent description of the climate response. Finally, one feature that appears only in the DJF time series of the FAR is that the lower bound of the uncertainty range remains near zero throughout the period. This can be explained by the components of the response (Figure 3, top left panel), which show a thin strip of overlapping values that spans all the years. In this overlap region the distributions for the actual and natural world are expected to yield similar values of P0 and P1, which would correspond to a near-zero FAR.
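A minimal sketch of how such a FAR time series can be assembled is given below: each year's observed anomaly is used as the threshold, and the actual- and natural-world distributions of the decade containing that year supply P1 and P0 (both approximated here by normal distributions with an assumed common spread; all numbers are synthetic stand-ins).

```python
import numpy as np
from scipy import stats

def far_normal(threshold, actual_mu, nat_mu, sigma):
    """FAR (Equation (2)) with both worlds approximated by normal
    distributions of the annual seasonal-mean temperature."""
    p1 = stats.norm.sf(threshold, loc=actual_mu, scale=sigma)
    p0 = stats.norm.sf(threshold, loc=nat_mu, scale=sigma)
    return 1.0 - p0 / p1

# Illustrative inputs: observed anomalies for 1904-2008 and, for each decade,
# central estimates of the ALL and NAT responses (all values synthetic)
rng = np.random.default_rng(5)
years = np.arange(1904, 2009)
obs = 0.012 * (years - 1904) - 0.3 + rng.normal(scale=0.5, size=years.size)
decade = (years - 1904) // 10                       # index of the decade a year falls in
all_resp = 0.12 * np.arange(decade.max() + 1)       # decadal ALL response (K)
nat_resp = 0.02 * np.arange(decade.max() + 1)       # decadal NAT response (K)
sigma = 0.5                                         # unforced year-to-year spread

far_series = far_normal(obs, all_resp[decade], nat_resp[decade], sigma)
print(far_series[-10:])   # FAR for each year of the last decade
```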

Figure 5.

Change in likelihood of seasonal mean European temperatures exceeding the observed value for each year during 1904–2008. Each panel corresponds to the season noted in the header. Results are shown from analyses with HadGEM1 (in blue) and MIROC (in green). The lines correspond to the best estimate of the FAR (50th percentile of the distribution) and the light-coloured shaded areas mark the 5th and 95th percentiles of the distribution. The FAR time series were smoothed for clarity using 5-year running means. As the warming begins to rise above internal variability in the 1980s, the FAR estimates become more reliable (to the right of the dotted vertical line marked on the panels). This figure is available in colour online at wileyonlinelibrary.com/journal/joc

We now focus on the last decade of the reference period (1999–2008) and compute the distribution of the FAR over a range of thresholds from −3 to +3 K. The calculations are again based on the PDFs of annual values of the seasonal mean temperatures estimated with and without human influences, given the climate response attributed to the ALL and NAT forcings during 1999–2008. Figure 6 shows the resulting diagrams of the FAR. These can be used to assess the effect of human influence on European seasons as new observations of the seasonal mean temperature become available (assuming that the 1999–2008 climate response is still a good measure of the current climate change). In all seasons, the FAR tends to saturate to unity at thresholds above 2 K, though the uncertainty in the HadGEM1/DJF and MIROC/SON distributions is much greater. These larger uncertainties are consistent with the greater uncertainties in the attributable components of the response (Figure 3).

Figure 6.

Diagrams of the FAR representing the change in likelihood of seasonal mean European temperatures exceeding a threshold under the influence of anthropogenic forcings plotted over a range of thresholds. The thresholds are temperature anomalies relative to 1901–1930 averaged over Europe and the FAR is calculated based on the ALL and NAT components of the climate response during 1999–2008. Each panel corresponds to the season and model noted in the header. The best estimate (50th percentile) is plotted in blue for HadGEM1 (left panels) and in green for MIROC (right panels) and the 5–95% uncertainty range is represented by the light coloured area around the best estimate. An independent estimate of the FAR derived from the temperature reconstructions of Luterbacher et al. (2004) and Xoplaki et al. (2005) is also shown (black line). This figure is available in colour online at wileyonlinelibrary.com/journal/joc

Figure 6 also shows an independent estimate of the FAR which is based on European temperature reconstructions back to year 1500 (Luterbacher et al., 2004; Xoplaki et al., 2005). The probability P0 for the natural world used to calculate the FAR (Equation (2)) is estimated from the annual values of the seasonal mean temperatures during period 1500–1800, when human influences had little impact on the climate. The 1500–1800 temperature time series are filtered to remove long-timescale variations (scales greater than 10 years) and a normal distribution is fitted to the data to compute the likelihood of threshold exceedance. It should be noted that in the calculation of the probability P0, we also include a pre-instrumental period (up to the seventeenth century) for which the variance may be underestimated. However, we did not find systematic shifts in the FAR when we tried using different parts of the dataset to calculate P0. The probability P1 for the actual world is estimated from the ten annual values of the seasonal mean temperatures during the most recent decade in the record using t-statistics. The use of only ten values in the computation of P1, and the fact that no extreme statistics are employed to examine thresholds at the tails of the distributions, will affect the accuracy of the estimated FAR. However, this simple approach still gives a useful independent estimate against which to compare our results. Despite these caveats, we find that the FAR from the temperature reconstructions is generally in good agreement with the FAR from the analyses with the two GCMs. The FAR from the reconstructions rises slightly above the 95th percentile of the two-model estimates for MAM, and only at higher thresholds for which the simple statistics used to estimate the FAR from the reconstructions may not be adequate. The same is the case for JJA and SON, but only relative to HadGEM1.
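The sketch below illustrates this reconstruction-based estimate under stated assumptions: a simple running-mean high-pass filter stands in for the filtering of the 1500–1800 series, P0 comes from a normal fit to the filtered values, and P1 from a Student-t predictive distribution fitted to the ten most recent values. The filter choice and the exact form of the t-statistics are assumptions, and the reconstruction and recent values are synthetic.

```python
import numpy as np
from scipy import stats

def highpass(series, window=11):
    """Remove variations on timescales longer than ~10 years by subtracting
    a centred running mean (a simple stand-in for the filtering used)."""
    kernel = np.ones(window) / window
    smooth = np.convolve(series, kernel, mode="same")
    return series - smooth

def far_from_reconstruction(recon_1500_1800, last_decade, threshold):
    """FAR (Equation (2)): P0 from a normal fit to the filtered pre-industrial
    reconstruction, P1 from a t-distribution fitted to the ten recent values."""
    pre = highpass(recon_1500_1800)
    p0 = stats.norm.sf(threshold, loc=pre.mean(), scale=pre.std(ddof=1))
    n = last_decade.size
    t_stat = (threshold - last_decade.mean()) / (last_decade.std(ddof=1) * np.sqrt(1 + 1.0 / n))
    p1 = stats.t.sf(t_stat, df=n - 1)
    return 1.0 - p0 / p1

# Illustrative synthetic inputs (anomalies in K)
rng = np.random.default_rng(6)
recon = rng.normal(0.0, 0.5, size=301)       # stand-in for the 1500-1800 values
recent = rng.normal(1.2, 0.5, size=10)       # stand-in for the last ten years
print(f"FAR at +1 K: {far_from_reconstruction(recon, recent, 1.0):.2f}")
```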

We also compute simple estimates of the FAR over a range of thresholds from a number of GCM runs, in a similar way to the temperature reconstructions. We use an ensemble of opportunity from the World Climate Research Programme's (WCRP's) Coupled Model Intercomparison Project phase 3 (CMIP3) multi-model dataset that was used extensively in the IPCC AR4. We employ only those simulations of the twentieth century (20C3M) that are also extended into the twenty-first century with the A1B scenario. There is further selection by only including: (1) simulations with both anthropogenic and natural forcing agents; and (2) simulations from individual model ensembles that have initial conditions at least 25 years apart. This provides, in total, 25 simulations created by 11 models from 8 institutions. The post-1980s trend estimated from these simulations (mean ± standard deviation) is 0.43 ± 0.31, 0.36 ± 0.24, 0.37 ± 0.18, and 0.40 ± 0.19 K/decade for DJF, MAM, JJA, and SON, respectively. The (ensemble mean) trends from HadGEM1 for the 4 seasons are 0.48, 0.47, 0.43, and 0.48 K/decade, and the trends from MIROC are 0.31, 0.27, 0.33, and 0.28 K/decade. Figure 7 shows the FAR estimated from the individual runs and for the ensemble of all the runs together. The probability P0 is computed from the seasonal temperature values for each year in the first decade of the twentieth century, while P1 comes from years in the decade ending in 2008. We use the first decade in the time series to estimate P0 assuming that the human influences are not as prominent as in the most recent years. We use t-statistics to get estimates of both P0 and P1 for the calculation of the FAR for individual runs (thin grey lines in Figure 7). The FAR for the ensemble of all the runs (thick dotted line in Figure 7) is estimated by combining the years from all the models for each of the two decades to compute P0 and P1. In this case, we use a normal distribution rather than t-statistics as we have a larger sample of seasonal temperatures (25 runs × 10 years). The results for the whole ensemble agree well with those for the temperature reconstructions (thick solid line in Figure 7) and also with the HadGEM1 and MIROC results illustrated in Figure 6. The range of results shown in Figure 7 for different model runs comes from differences in the relative strength of the ANTHRO and NAT signals simulated by models with different climate sensitivities. Were these responses to be constrained by the observations (e.g. using optimal fingerprinting), the range in the FAR estimates would be expected to decrease. Some individual runs show a decrease in the FAR at higher thresholds. These are runs that produce less anthropogenic warming in recent decades, which means that the temperature distributions for the actual and natural world have a greater overlap. In these cases, the probabilities P0 and P1 have similar values which are also small at higher thresholds, and simple t-statistics are inadequate to distinguish between the two distributions near the tails and to provide reliable estimates of the probabilities. The JJA results show FAR values greater than 0.8 and an uncertainty range that is smaller than that of the FAR estimate of Stott et al. (2004) for the 2003 heat wave. There are a number of differences that can account for our higher FAR estimates, including the different models employed in the two studies, and also the different area and analysis period considered. The scaling factors of Stott et al. (2004) have a larger uncertainty than ours, which is expected to lead to a greater uncertainty in the FAR. Their relatively larger uncertainty could stem from model differences, but may also be explained by the inclusion in our analysis of more recent years characterised by prominent anthropogenic warming.

Figure 7.

Diagrams of the FAR representing the change in likelihood of seasonal mean European temperatures exceeding a threshold under the influence of anthropogenic forcings plotted over a range of thresholds. The thresholds are temperature anomalies relative to 1901–1930 averaged over Europe. The FAR is calculated based on the ALL and NAT components of the climate response during 1999–2008 (actual world in recent years) and during 1900–1909 (natural world). The thin grey lines correspond to 25 GCM runs that simulate the effect of all forcings in the twentieth century and follow the A1B emissions scenario after year 2000. The lines that bend down at high thresholds correspond to GCM runs for which the statistics are inadequate to provide accurate estimates of the probabilities needed for the estimation of the FAR. The thick dotted line is the FAR estimate from all the runs together. The thick solid line represents an independent estimate of the FAR derived from the temperature reconstructions of Luterbacher et al. (2004) and Xoplaki et al. (2005)

Finally, we use the diagrams shown in Figure 6 to estimate the temperature threshold at which anthropogenic forcings double, treble, or quadruple the likelihood of a season being at least as warm. These thresholds correspond to FAR values of 0.5, 0.67, and 0.75, respectively (Equation (2)). Table I gives the thresholds computed from the analyses with HadGEM1 and MIROC as well as the estimates from the temperature reconstructions and the ensemble of the 25 AR4 runs. The thresholds are estimated for period 1999–2008. The results from the AR4 ensemble in Table I agree with the temperature reconstructions within 0.35 K. For the model analyses with HadGEM1 and MIROC we examine the most conservative estimates and report the thresholds that correspond to the 95th percentile of the uncertainty range for the specific FAR value, i.e. the right-hand border of the light-coloured area in Figure 6 (the 5th percentile values are also reported for comparison, but in this discussion we keep the focus on the 95th percentile only). The use of the 95th percentile means that the factor increase in the likelihood occurs at higher temperatures than those estimated using the observations. Apart from winter, both models agree that the likelihood of a seasonal anomaly greater than about 1 K has at least doubled as a result of human influences on the climate, whereas for anomalies greater than about 1.5 K the likelihood has at least trebled. Internal climate variability has a more pronounced effect in winter, and in the HadGEM1 analysis it is anomalies of 3–4 K that correspond to a doubling in the likelihood. MIROC gives smaller thresholds for winter that are more similar to those of the other seasons. The use of more models would help provide a better representation of the uncertainty in these threshold estimates and needs to be considered in future work.
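Reading such thresholds off a FAR-versus-threshold curve amounts to inverting the curve at the target FAR values, as in the small sketch below; the curve here is a toy monotonic function standing in for the diagrams of Figure 6.

```python
import numpy as np

def threshold_for_far(thresholds, far_curve, target):
    """Smallest threshold at which a monotonically increasing FAR curve
    reaches the target value (linear interpolation between grid points)."""
    return float(np.interp(target, far_curve, thresholds))

# Illustrative FAR curve over thresholds of -3..+3 K (synthetic, monotonic)
thresholds = np.linspace(-3.0, 3.0, 121)
far_curve = 1.0 / (1.0 + np.exp(-2.0 * (thresholds + 0.5)))   # toy increasing curve

# FAR = 0.5, 2/3, 0.75 correspond to a doubling, trebling, quadrupling of P1/P0
for target, label in [(0.5, "doubled"), (2 / 3, "trebled"), (0.75, "quadrupled")]:
    print(f"likelihood {label} above {threshold_for_far(thresholds, far_curve, target):+.2f} K")
```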

Table I. Estimates of seasonal temperature anomalies (K) relative to 1901–1930 that correspond to FAR values of 0.5, 0.67, and 0.75. The FAR values refer to the 1999–2008 period. Results are calculated from analyses with HadGEM1 and MIROC, from the temperature reconstructions of Luterbacher et al. (2004) and Xoplaki et al. (2005), and from the ensemble of 25 AR4 simulations. The model estimates correspond to the 95th percentile of the distribution of thresholds for the specified FAR value (the 5th percentile values are also given in brackets)
              Reconstructions   AR4 ensemble   HadGEM1        MIROC
DJF
  FAR = 0.5   0.16              0.08           3.38 (−0.58)   0.69 (−0.46)
  FAR = 0.67  0.77              0.66           4.33 (−0.11)   1.46 (0.02)
  FAR = 0.75  1.18              1.00           4.79 (0.15)    1.86 (0.29)
MAM
  FAR = 0.5   0.01              0.06           0.36 (−0.05)   0.55 (0.03)
  FAR = 0.67  0.21              0.43           0.73 (0.23)    1.00 (0.32)
  FAR = 0.75  0.33              0.65           0.96 (0.40)    1.28 (0.49)
JJA
  FAR = 0.5   0.05              −0.06          0.63 (0.03)    0.32 (−0.16)
  FAR = 0.67  0.20              0.17           0.85 (0.21)    0.52 (0.02)
  FAR = 0.75  0.29              0.30           0.98 (0.32)    0.63 (0.11)
SON
  FAR = 0.5   0.08              −0.02          0.85 (0.04)    1.13 (−0.67)
  FAR = 0.67  0.26              0.26           1.27 (0.29)    1.61 (−0.45)
  FAR = 0.75  0.37              0.42           1.54 (0.41)    1.91 (−0.32)

4. Conclusion

This paper provides important new insight into the attribution of regional climate change. Using optimal fingerprinting, we have shown that the recent warming in European seasons is detectable: it cannot be explained solely in terms of internal climate variability, but is mainly driven by anthropogenic forcings. While the effect of human influence on European summer temperatures was also established in previous work (Stott et al., 2004; Jones et al., 2008), this is the first formal attribution study of changes in the other three seasons. The effect of natural forcings on winter and spring temperatures cannot be detected with the two models considered here. Detection of the natural signal in summer and autumn is less robust and more model-dependent than the detection of the anthropogenic signal.

When seasons with exceptionally warm temperature anomalies occur, the question arises whether this is linked to anthropogenic climate change. Our work provides a simple way to assess the human contribution based on pre-computed tables or diagrams of the FAR that span a wide range of temperature thresholds. At the end of each season, the observed temperature can be looked up in these diagrams to obtain the best estimate of the FAR together with its associated uncertainty, providing a timely measure of the anthropogenic impact for policy makers and the public. Of course, this assessment can only be used for a few years after the end of the reference period (in this case, 2008), and the diagrams will need to be updated regularly in the future, as climate change continues to modify the distribution of European temperatures. Our methodology may also be extended to examine not only current, but also future distributions of the seasonal temperatures with and without the effect of anthropogenic forcings. This can be done by applying constraints from past observations to the simulated climate response of future periods, as already demonstrated in previous work (e.g. Stott et al., 2006b).

This paper contributes to the ongoing research on regional attribution of climate change and extremes. This can be valuable for adaptation planning, enabling better informed decisions on current weather-related risks and how they are changing. Following up on this work and considering issues that still remain unclear, we identify some areas where more work needs to be done. First, the uncertainty due to model dependency needs to be further investigated. Our analysis is limited to the use of only two GCMs, as no other models have provided both ALL and ANTHRO experiments with ensembles of runs extended to the present day. Although the prominent role of the anthropogenic forcings in the warming of the European seasons is clear in the analyses with both models and seems to be well established, validating our results with more GCMs would provide a better representation of the uncertainty in the FAR. It is also important to consider more regions in attribution studies, especially those regions with low adaptive capacity to climate change, for which little observational data are usually available (e.g. Africa). The methodology presented here can also easily be extended to study anomalously cold seasons, which are also very important in terms of their associated impacts. In the case of cold seasons, we would need to measure the change in the likelihood of falling below a cold temperature threshold with and without the influence of anthropogenic forcings. Finally, as local extreme events are often accompanied by the most catastrophic impacts, it is important that attribution studies put more emphasis on sub-continental and local scales. The international science community has recently begun to coordinate efforts to develop a new methodology for the attribution of individual high-impact extreme events based on large ensembles of GCM runs with prescribed sea surface temperatures (Stott and Trenberth, 2009). Apart from event attribution, this will also enable comparison between FAR estimates derived with the new approach and with coupled-model analyses like ours.

Acknowledgements

The authors are grateful to two anonymous reviewers for their constructive comments. NC, PAS, and GSJ were supported by the Joint DECC and Defra Integrated Climate Programme - DECC/Defra (GA01101). JL acknowledges support from the EU/FP7 project ACQWA (Assessing Climate Impacts on the Quantity and Quality of Water; http://www.acqwa.ch/; grant 212250), from the DFG Project PRIME (PRecipitation In the past Millennium in Europe) within the Priority Program ‘INTERDYNAMIK’ and MedClivar (http://www.medclivar.eu/).
