Observed changes in false springs over the contiguous United States

Abstract

Climate warming fosters an earlier spring green-up that may benefit agricultural systems. However, advances in green-up timing may leave early stage vegetation growth vulnerable to cold damage when hard freezes follow green-up, resulting in a false spring. Spatiotemporal patterns of green-up dates, last spring freezes, and false springs were examined across the contiguous United States from 1920 to 2013. Results indicate widespread earlier green-up and last spring freeze dates over the period. Observed changes in these dates were asymmetric, with the last spring freeze date advancing earlier in the year relative to the green-up date. Although regionally variable, these changes resulted in a reduction in false springs, notably over the past 20 years, except across the intermountain western United States, where the advance in green-up timing outpaced that of the last spring freeze. A sensitivity experiment shows that observed decreases in false springs are consistent with a warming climate.

1 Introduction

Many plant and animal species use warming temperatures and lengthening daylight during the spring season as cues to begin bioproductive development, or green-up (GU), observable through budburst, leafing, and flowering of vegetation. Plant GU marks an important transition from winter dormancy to photosynthesis. However, as plants undergo GU they develop and expose sensitive tissue to variable atmospheric conditions, including “hard” freeze events that can lead to cold damage or mortality. False springs, identified by hard freezes following GU, have widespread, albeit heterogeneous, impacts on terrestrial ecosystems and agricultural systems at community, regional, and continental scales [Hufkens et al., 2012]. Premature GU associated with record warmth across the eastern United States during March 2012, followed by a series of freeze events, resulted in more than half a billion U.S. dollars of damage to agricultural and horticultural plants in Michigan alone [Knudson, 2012]. A similar incident in the southeastern U.S. in 2007 resulted in economic losses of 2 billion U.S. dollars and numerous short- and long-term ecological impacts, including altered plant community structure, surface energy budgets, and hydrologic cycles, and overall reduced net ecosystem productivity and carbon sequestration [Gu et al., 2008].

Though isolated studies have analyzed climatic and physiological aspects of false springs, a geographic analysis of their occurrence and observed changes over the historic record is lacking. Numerous studies show plant GU advancing coincident with observed increases in Northern Hemisphere spring temperatures over the past half century [e.g., Schwartz et al., 2006], albeit with regional deviations from this pattern (e.g., southeastern U.S.) [Marino et al., 2011]. Earlier GU enables a potentially longer growing season and increased carbon sequestration through photosynthesis, contingent on water availability; such changes may be particularly advantageous to certain species and to agricultural productivity, especially in energy-limited regions. Likewise, studies have shown an advancement of last spring freeze (LSF) date [e.g., Schwartz et al., 2012]; however, warming temperatures accompanied by changes in daily temperature variance [e.g., Rigby and Porporato, 2008] are hypothesized to change the timing of GU and LSF dates, potentially by different magnitudes [Scheifinger et al., 2003]. An asymmetric change in the timing of GU and LSF dates may increase false spring occurrences as plants acclimate to earlier warmth, exposing sensitive growth tissue to high-amplitude fluctuations in temperature. To address this hypothesis, this study asks whether false springs have changed in the continental United States from 1920 to 2013 and the degree to which observed changes are consistent with a warming climate.

2 Methods

Plant GU and LSF dates are estimated using daily maximum and minimum temperature observations for 1218 United States Historical Climatology Network stations from 1920 to 2013 obtained from the Global Historical Climatology Network database [Menne et al., 2012]. We restrict our analysis to stations with at least 85% data completeness over the period of record, at least 85% of daily observations from January to June for any given year, and climatological LSF dates (1971–2000) before 1 June, leaving a total of 882 stations. Daily vapor pressure deficit is approximated following Jensen et al. [1990] from observed daily maximum and minimum temperatures, with daily mean dew point temperature estimated from the station daily minimum temperature and monthly dew point depression interpolated from Parameter-elevation Regressions on Independent Slopes Model (PRISM) [Daly et al., 2008] data.
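For concreteness, a minimal sketch of this vapor pressure deficit approximation is given below, assuming a Tetens-type saturation vapor pressure curve and a dew point estimated as the daily minimum temperature minus the PRISM-derived dew point depression; the function names and array inputs are illustrative, not the authors' exact implementation.

```python
import numpy as np

def sat_vp(t_c):
    """Saturation vapor pressure (kPa) from air temperature (deg C), Tetens-type curve."""
    return 0.6108 * np.exp(17.27 * t_c / (t_c + 237.3))

def daily_vpd(tmax, tmin, dpd):
    """Approximate daily vapor pressure deficit (kPa).

    tmax, tmin : daily maximum/minimum temperature (deg C)
    dpd        : monthly mean dew point depression (deg C) interpolated from
                 PRISM and broadcast to daily resolution (hypothetical input)
    """
    es = 0.5 * (sat_vp(tmax) + sat_vp(tmin))  # mean daily saturation vapor pressure
    tdew = tmin - dpd                         # assumed dew point estimate
    ea = sat_vp(tdew)                         # actual vapor pressure
    return np.maximum(es - ea, 0.0)           # clamp: VPD is nonnegative
```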

Numerous empirical phenological models exist that use widely available meteorological data and photoperiod [Migliavacca et al., 2012]. We use the meteorologically based Growing Season Index (GSI) [Jolly et al., 2005] to estimate plant GU. The GSI is a highly generalizable model that is not parameterized for a specific species; it integrates the weighted effects of daily minimum temperature, photoperiod, and vapor pressure deficit to produce an index of foliar canopy development or continuance as constrained by climatic limits. Jolly et al. [2005] found that GSI corresponded with the Normalized Difference Vegetation Index across diverse ecosystems globally, making it well suited to large spatiotemporal analyses. GSI is normalized between 0 and 1 by dividing local GSI by its 95th percentile (M. Jolly, personal communication, 2011), and GU is defined as the first calendar day of the year on which the 21 day moving average of GSI exceeds 0.5. We also considered the first leaf date from the Extended Spring Index (SI-x) model [Schwartz et al., 2012]; however, noting qualitatively similar results of observed and modeled changes, the GU analysis hereafter uses GSI, with results for SI-x provided as supporting information.
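The sketch below illustrates the GU calculation under these definitions. The ramp bounds for each climatic indicator (daily minimum temperature between −2 and 5°C, VPD between 0.9 and 4.1 kPa, photoperiod between 10 and 11 h) follow Jolly et al. [2005]; everything else, including taking the 95th percentile within a single year rather than over the full station record, is a simplifying assumption.

```python
import numpy as np

def ramp(x, lo, hi):
    """Linear ramp indicator: 0 at or below lo, 1 at or above hi."""
    return np.clip((x - lo) / (hi - lo), 0.0, 1.0)

def green_up_day(tmin, vpd, photoperiod):
    """First day of year on which the 21 day mean of normalized GSI exceeds 0.5.

    tmin        : daily minimum temperature (deg C) for one calendar year
    vpd         : daily vapor pressure deficit (kPa)
    photoperiod : daily daylength (hours)
    """
    # Daily GSI is the product of three indicator functions; ramp bounds
    # follow Jolly et al. [2005]. High VPD suppresses canopy development,
    # hence the reversed ramp.
    gsi = (ramp(tmin, -2.0, 5.0)
           * (1.0 - ramp(vpd, 0.9, 4.1))
           * ramp(photoperiod, 10.0, 11.0))

    # 21 day moving average, then normalization by the 95th percentile.
    # (In the full analysis the percentile would be taken over the whole
    # station record; a single year is used here for simplicity.)
    smooth = np.convolve(gsi, np.ones(21) / 21.0, mode="same")
    norm = smooth / np.percentile(smooth, 95)

    above = np.nonzero(norm > 0.5)[0]
    return int(above[0]) + 1 if above.size else None  # 1-based day of year
```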

The last calendar day prior to 1 July with minimum temperature below −2.2°C (28°F) is used to define the LSF date [Schwartz et al., 2006]. Though temperature thresholds that induce cold damage vary by species and phenostage, this criterion represents a hard freeze known to cause potential mortality or severe tissue damage and provides a single threshold applicable across species.

We define a False Spring Exposure Index (FSEI) as the percent of years in which LSF occurred post-GU. FSEI is an oversimplification, as physiological vulnerability to cold temperatures is a function of phenostage and varies geographically, across species, and between individuals of a species. As reported by Augspurger [2013], vegetation is most susceptible to cold damage during later phenostages, though precise definitions of vulnerable and frost-resistant phenostages are lacking. To address this biological ambiguity, we employed 0, 7, 10, and 15 day lags between GU and LSF as a basis for calculating FSEI. We found the results insensitive to the choice of time window beyond 7 days; for clarity, we relegate the 0, 10, and 15 day lag FSEI figures to the supporting information (Figure S1 and Table S1) and focus solely on the 7 day lag FSEI. The FSEI defines false springs as a binary event, indicating exposure to potentially damaging cold temperatures but not the actual likelihood or vulnerability of damage or mortality, thereby making the index broadly applicable across species and large spatiotemporal scales. The event-driven nature of our definition of FSEI distinguishes it from previous false spring models, such as the damage index value presented in Schwartz et al. [2006], which reports the number of days between GU and LSF.
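A minimal sketch of the LSF detection and the lagged FSEI under these definitions follows, assuming day-of-year inputs with years lacking either event filtered out beforehand; function and variable names are ours.

```python
import numpy as np

def last_spring_freeze(tmin):
    """Day of year of the last hard freeze (tmin < -2.2 deg C) before 1 July.

    tmin : daily minimum temperature (deg C) for one calendar year,
           index 0 corresponding to 1 January.
    """
    spring = np.asarray(tmin)[:181]          # days 1-181, i.e., through 30 June
    freeze_days = np.nonzero(spring < -2.2)[0]
    return int(freeze_days[-1]) + 1 if freeze_days.size else None

def fsei(gu_days, lsf_days, lag=7):
    """False Spring Exposure Index: percent of years in which the last
    spring freeze occurred at least `lag` days after green-up."""
    gu = np.asarray(gu_days, dtype=float)
    lsf = np.asarray(lsf_days, dtype=float)
    return 100.0 * np.mean(lsf >= gu + lag)
```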

Linear trends in GU and LSF dates are estimated using linear least squares regression, and statistical significance (p < 0.05) is determined by computing the standard error of the trend adjusted for serial autocorrelation. Owing to the binary nature of the FSEI, we use a bootstrap resampling approach with replacement (n = 1000) to assess the statistical significance of FSEI changes between the first half of the record (1920–1966) and the latter half (1967–2013). Differences between the two epochs are deemed statistically significant when at least 90% of the resampled differences share the same sign. To better quantify regional responses, decadal anomalies in GU date, LSF date, and FSEI from the 1920–2013 base period are aggregated across all stations in six geographic regions, the northwest (NW), northeast (NE), Great Plains (GP), Midwest (MW), southwest (SW), and southeast (SE), following the National Climate Assessment regions (http://scenarios.globalchange.gov/regions, last accessed 22 March 2013).
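A sketch of one plausible form of this bootstrap test is given below, assuming per-year binary false spring indicators and independent resampling within each epoch (the resampling unit is not specified in the text, so that is an assumption); the 90% same-sign criterion follows the text.

```python
import numpy as np

def fsei_change_significant(early, late, n_boot=1000, frac=0.90, seed=0):
    """Bootstrap test of the change in FSEI between two epochs.

    early, late : boolean arrays of per-year false spring occurrence
                  for 1920-1966 and 1967-2013, respectively
    Returns (observed change in percent, significance flag): the change is
    deemed significant when at least `frac` of resampled differences
    share one sign.
    """
    rng = np.random.default_rng(seed)
    obs = 100.0 * (late.mean() - early.mean())

    diffs = np.empty(n_boot)
    for i in range(n_boot):
        # Resample years with replacement within each epoch.
        e = rng.choice(early, size=early.size, replace=True)
        l = rng.choice(late, size=late.size, replace=True)
        diffs[i] = 100.0 * (l.mean() - e.mean())

    same_sign = max((diffs > 0).mean(), (diffs < 0).mean())
    return obs, same_sign >= frac
```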

Finally, a sensitivity analysis is used to better understand sources of the observed trends. We consider whether the magnitude and geographic structure of observed trends are attributable to overall warming, changes in the submonthly variability of temperature, or some combination of the two. An observed increase in mean annual surface temperature across the contiguous U.S. between 1920 and 2013 of approximately 0.6°C is well documented, albeit geographically variable. By contrast, changes in higher-order statistical moments are less well documented; Shen et al. [2011] found slight decreases in daily temperature variance across the United States of up to 10%, most notably associated with decreases in cold extremes. Using 1920–1966 as the control period, we performed five experiments: (A) +0.5°C delta increase to daily temperatures; (B1) +0.5°C delta increase to daily temperatures and 10% increase in intramonthly variability; (B2) 10% increase in intramonthly variability; (C1) +0.5°C delta increase to daily temperatures and 10% decrease in intramonthly variability; and (C2) 10% decrease in intramonthly variability. Temperature variability was modified by extracting submonthly variability in minimum and maximum temperatures using a 31 day high-pass filter, scaling this signal by ±10%, and adding it back to the 31 day low-pass filtered original data. Experiments are qualitatively compared to observed differences between the first and second halves of the 1920–2013 record.
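A minimal sketch of the temperature perturbation used in these experiments, assuming a simple 31 day running mean as the low-pass filter (the filter type is not specified in the text); the experiment labels follow the text.

```python
import numpy as np

def perturb_temps(t, delta=0.0, var_scale=1.0, window=31):
    """Apply the sensitivity-experiment perturbations to a daily series.

    t         : daily temperature series (deg C)
    delta     : uniform warming offset (e.g., +0.5 for Experiment A)
    var_scale : multiplier on submonthly variability
                (1.10 for the B experiments, 0.90 for the C experiments)
    window    : filter length in days (31 day low-pass)
    """
    # 31 day low-pass via running mean; the residual is the high-pass
    # (submonthly) component.
    kernel = np.ones(window) / window
    low = np.convolve(t, kernel, mode="same")
    high = t - low

    # Rescale submonthly variability, recombine, and add the warming delta.
    return low + var_scale * high + delta

# Experiment B1, for example: +0.5 deg C warming and +10% submonthly variability
# tmin_b1 = perturb_temps(tmin, delta=0.5, var_scale=1.10)
```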

3 Results

Climatological GU and LSF dates (1920–2013) exhibit strong latitudinal gradients, with elevation and maritime proximity as secondary controls, supporting previous findings [e.g., Schwartz et al., 2006] (Figures 1a and 1b). There is a notable mismatch in timing between the two event climatologies, with GU typically falling earlier in the year than LSF, most notably in the GP, MW, and NE, potentially owing to susceptibility to cold air outbreaks and large spring temperature fluctuations (Figure 1c); by contrast, false springs were rarer in regions with a significant maritime influence and more subdued intraseasonal temperature variability.

Figure 1.

(a) Day of year of green-up defined using the Growing Season Index of Jolly et al. [2005] averaged over 1920 to 2013. (b) Last day of year when minimum temperature falls below −2.2°C, i.e., the last spring freeze, averaged over 1920 to 2013. (c) False Spring Exposure Index, or the percent of years over the 94 year period in which the last spring freeze occurred at least 7 days after the green-up date.

The contiguous U.S. saw widespread earlier GU and LSF dates from 1920 to 2013 (Figures 2a and 2b). However, the trends differ: the LSF date advanced on average by 5.8 days over the record, compared with an average advance of 1.8 days for the GU date. Linear trends over the 94 year period mask some of the larger changes observed since 1950, owing to relatively mild springs in the 1930s–1940s compared with the 1950s–1970s (Figure 3); truncating the period to 1950–2013 yields average advances in GU and LSF dates of 4.2 and 7.9 days, respectively. This asynchronous advance in LSF dates relative to GU is hypothesized to decrease the potential for cold damage or mortality [Schwartz et al., 2006]. LSF advancement was also more geographically extensive than GU advancement, with a significant advance in LSF date for 29% of stations, compared to 15% of stations for earlier GU. By contrast, 4% and 5% of stations showed a significant delay in LSF and GU dates, respectively. These asymmetric changes in LSF and GU dates resulted in a reduction of 2.7 percentage points in false springs (from 44.2% to 41.5%) between 1920–1966 and 1967–2013. A similar reduction in FSEI in the latter half of the record is found when only data from 1951 to 2013 are considered; notable, however, is the increased false spring exposure in the intermountain west during this period (Figure 2c).

Figure 2.

Linear least squares trend in the timing of (a) green-up and (b) last spring freeze for the period 1920–2013. Trends are reported in number of days per 94 years, with significant trends represented by larger circles. (c) Change in False Spring Exposure Index between 1920–1966 and 1967–2013. Statistically significant changes are represented by larger circles.

Figure 3.

Regional decadal anomalies in green-up, last spring freeze, and False Spring Exposure Index from the 1920–2013 base period, with 2000–2013 considered for the 2000s. NW is northwest, NE is northeast, GP is Great Plains, MW is Midwest, SW is southwest, and SE is southeast. Plus signs represent green-up anomalies and diamonds represent last spring freeze anomalies, both reported in days. Bars represent percent departure in False Spring Exposure Index.

Distinct geographic patterns are readily apparent in Figure 2. Notably, the SE region experienced little change in GU timing (+0.7 days) (Figure 3), consistent with the region's departure from broader warming trends. Conversely, LSF dates there advanced 2.4 days over the period, resulting in a 2.6% regional decrease in FSEI. The NW region saw LSF and GU dates shift earlier by an average of 7.8 and 2.4 days, respectively, resulting in the largest regional reduction in FSEI (−6.4%). The NE likewise experienced a decrease in FSEI (−1.7%), supporting Schwartz's [1993] suggestion of decreased false springs over the latter part of the twentieth century in this region. Notably, decreases in false springs were observed across all geographic regions, with LSF advancing at a greater magnitude than GU. Though all regions exhibited significant decadal variability in FSEI and in GU and LSF timing, the last two decades showed countrywide synchrony, with earlier LSF dates and reduced FSEI. Such synchrony does not appear to be consistent with modes of large-scale climate variability and associated quasi-stationary Rossby waves, which often produce dipole patterns across the continental United States and have noted impacts on the timing of spring [McCabe et al., 2011].

Sensitivity experiments show advances in the timing of GU and LSF dates under +0.5°C delta warming (Experiment A), with changes in LSF dates typically 50% greater than changes in GU date (Figure 4). This asymmetric change reduces FSEI by approximately 3.3%, similar to observed changes. Changes in intramonthly temperature variance had the most pronounced impact on LSF dates, with increased variability delaying the timing of the LSF date and increasing FSEI, whereas decreased variability markedly advanced the timing of the LSF date and decreased FSEI. Comparisons to observed change are most qualitatively consistent with simple warming; however, observed warming varied by region, making a direct comparison problematic. Patterns of change in GU date in Experiment A are largely homogeneous, in contrast to changes in LSF date (Figures S3a and S3b). The relative advancement of LSF over GU date in this experiment was particularly acute in the maritime climates along the Pacific coast as well as much of the southeastern U.S., corresponding to regions with relatively low intraseasonal temperature variability during spring (Figure S3d). Continental regions across the GP and MW exhibit a subdued response owing to higher daily temperature variance during spring, consistent with continental climates. These changes were magnified when applying a +2°C delta warming, suggesting that they may scale with the amount of warming applied, though perhaps nonlinearly beyond some threshold.

Figure 4.

Regional breakdown of sensitivity experiment results. Hollow bars represent the difference between observed and modeled green-up, and shaded bars the difference between observed and modeled last spring freeze, both corresponding to the bottom axes. Inverted triangles represent the difference in False Spring Exposure Index between control and experiment and correspond to the top axes.

4 Conclusion

Recent studies focusing on false springs have highlighted regional events following exceptional early season warmth [e.g., Gu et al., 2008; Ault et al., 2013] in addition to increased spring cold vulnerability at individual locations [e.g., Augspurger, 2013]. To address larger spatiotemporal changes, we examined false springs across the continental U.S. over the 1920–2013 period. Results indicate that false spring exposure decreased across the U.S. over the 94 year record, consistent with the asymmetric advance in LSF date relative to GU date. Furthermore, the results from the simple warming sensitivity experiment are consistent with observed changes. However, we recognize the limitations of this analysis; namely, the phenological models and cold damage thresholds we used are generalized and may fail to reflect more complex ecophysiological factors. Likewise, the FSEI is a simplification of false spring exposure, as it is difficult to climatologically define cold damage exposure across individuals, communities, and regions. We found that decreases in FSEI were more pronounced when defined using a 15 day period separating GU and LSF (Table S1), suggesting nonlinearities in the index itself. Advances in quantifying cold damage from false springs at landscape scales are needed to better represent cold damage vulnerability and its impacts on phenology in land surface modeling and the terrestrial carbon cycle [Jeong et al., 2012].

Plant communities and agriculture will likely take advantage of earlier warm temperatures, and decreased false spring exposure may enhance ecosystem carbon sequestration across the continental U.S. [Hufkens et al., 2012]. Whereas our sensitivity analysis suggests continued decreases in false springs with further warming, projected changes in climate may include higher-order changes, with more pronounced warming of cold extremes that could amplify projected decreases in false spring occurrence. It is unlikely, however, that false springs will disappear entirely, and their impacts have been shown to be ecologically and economically damaging on local and regional scales [e.g., Ault et al., 2013]. Continued climate warming necessitates further understanding not only of projected changes in GU and LSF timing but also of the collective impact that false springs have on the terrestrial environment.

Acknowledgments

We acknowledge the constructive feedback from two anonymous reviewers that helped improve our analysis and the quality of our paper. This research was supported by the National Institute of Food and Agriculture competitive grant award 2011-68002-30191, the NSF Idaho EPSCoR Program, and the National Science Foundation under award EPS-0814387.

The Editor thanks two anonymous reviewers for assistance in evaluating this paper.
