Changes in North American extremes derived from daily weather data

Abstract

[1] Detailed homogeneity assessments of daily weather observing station data from Canada, the United States, and Mexico enabled analysis of changes in North American extremes starting in 1950. The approach used a number of indices derived from the daily data, primarily based on the number of days per year that temperature or precipitation observations were above or below percentile thresholds. Station level indices were gridded to produce North American area-averaged time series. The results indicated that the increase in the number of days exceeding the 90th percentile is about the same magnitude as the decrease in the number of days below the 10th percentile. Analysis of extremes farther out on the tails of the distribution (e.g., the 95th and 97.5th percentiles) reveals changes very similar to those at the 90th and 10th percentiles. Annual extreme lowest temperatures are warming faster than annual extreme highest temperatures when the index assessed is the actual temperature, but cold and hot extremes are changing at about the same rate when examined on a normalized basis. On the basis of several measures, heavy precipitation has been increasing over the last half century, and the average amount of precipitation falling on days with precipitation has also been increasing. These observed changes since the late 1960s, namely decreases in cold extremes, increases in warm extremes, and increases in heavy precipitation, are consistent with a warming planet.

1. Introduction

[2] Most societal infrastructures, as well as natural and agricultural plant and animal communities, have adapted to historical extremes. But changes in the tails of the distribution of daily data may not be easily accommodated because weather extremes often have very direct and nonlinear impacts. For example, extremely hot weather can cause railroad tracks to buckle [Peterson et al., 2008] and can cause adverse human health effects involving thousands of fatalities even in developed countries [Karl and Knight, 1997; Schär and Jendritzky, 2004; Milligan, 2005]. Overwintering insects must avoid death from the freezing of tissues, which occurs at different temperatures for different insects [Turnock and Fields, 2005]. The lack of temperatures at or below −40°C in recent years in British Columbia has contributed to an explosion of mountain pine beetle populations across millions of hectares of forest [British Columbia Ministry of Forests and Range, 2006]. Highway underpasses can safely accommodate moderate precipitation, but heavy precipitation may cause flooding. Heavy rain also causes much more severe erosion than moderate rainfall and can lead to increased cases of diarrhea even in developed countries like the United States due to flood-induced overflow of sewage [Cazelles and Hales, 2006]. In sum, as these examples indicate, extreme weather has profound effects on human and natural systems.

[3] The climate changes on many different spatial and temporal scales. To understand the impact of large-scale forcings, large areas must be examined in order to dampen the very small scale noise of individual observation locations and to reveal the large-scale climate change signal. A North American continental-scale analysis falls between the global and local spatial scales. North America is an area with good data coverage, averaging over the continent dampens the effect of circulation changes that might make one side of the continent warm and the other cold, and the domain makes clear physical sense. To follow national boundaries, North America is defined here as including Hawaii, Puerto Rico, and the U.S. Virgin Islands.

[4] Subregions of North America have previously been analyzed using a variety of indicators or definitions of extremes, over varying periods of record, and with differing approaches to data homogeneity. The results vary by region, by period of time examined, by the measure of extreme assessed, and, quite likely, by homogeneity approach or lack thereof.

[5] Across Canada, over the period 1950 to 2003 there were trends toward fewer cold days and cold nights and more warm days and warm nights [Vincent and Mekis, 2006]. For southern Canada over the longer period of 1900 to 1998, both the lower and higher percentiles of daily maximum and minimum temperatures experienced significant warming [Bonsal et al., 2001]. Both Canada and the United States saw high temperatures during the drought years of the 1930s [Bonsal et al., 2001; DeGaetano and Allen, 2002]. For the contiguous United States (CONUS), using percentiles defined monthly, the number of days exceeding the 95th percentile had increased since 1960 and the number of days below the 5th percentile had decreased, but the signs of the trends were the opposite when the analysis started in 1930 [DeGaetano and Allen, 2002]. This is especially true of the 95th percentile of maximum temperature, which had very high values during the drought years of the 1930s and 1950s. Examining trends from 1948 to 2000 for all the different percentiles of temperature calculated on a monthly basis for Alaska, Canada, and the CONUS, more warming was found in winter than in summer, with all the different percentiles having similar trends [Robeson, 2004].

[6] Since 1950, heavy and very heavy precipitation in Alaska south of 62°N has been increasing [Groisman et al., 2005]. Across Canada, decadal variability is the dominant feature in both the frequency and intensity of extreme precipitation events over the period 1900 to 1998 [Zhang et al., 2001]. There is no consistent change in extreme precipitation indices in Canada over the periods 1900–2003 and 1950–2003, with the exception of increases in days with precipitation equal to or greater than 10 mm, which could be considered heavy precipitation in some regions and moderate in others [Vincent and Mekis, 2006]. However, increases in heavy precipitation for the CONUS were found using data starting in 1910 [Karl and Knight, 1998]. Multiday precipitation events with recurrence intervals of at least 1 year showed an increase over the CONUS, but the time series exhibited strong low-frequency variability, with low values during the 1930s and 1950s and an above-average number of events in the 1940s, early 1980s, and 1990s [Kunkel et al., 1999]. In Mexico's northern Baja California, there have been increases in heavy wintertime precipitation since 1977 [Cavazos and Rivas, 2004]. While heavy summer precipitation has not changed significantly over the last 30 years in central Mexico, very heavy (99th percentile) precipitation has increased [Groisman et al., 2005].

2. Historical Daily Data

[7] Daily maximum temperature (Tmax), minimum temperature (Tmin) and precipitation were analyzed for the stations indicated in the map in Figure 1 (top). Figure 1 (bottom) shows how the number of stations with data complete enough to calculate indices changed over the analysis period. As the number of stations available drops off at both ends of the time period, analyses using a subset of stations with data available for the whole period were compared with analyses using all the stations. For temperature indices, the North American area-averaged time series from the subset were very similar to results of the full data set. For precipitation indices, while the individual annual data points did differ, the overall appearance of the annual time series and their long-term changes were similar. These results indicate that the change in data availability did not artificially bias the climate change results.

Figure 1.

(top) Locations of meteorological stations whose observations were used in the analyses. (bottom) Number of temperature observing (solid) and precipitation observing (dashed) stations with data complete enough to produce indices as a function of year.

2.1. Canadian Data

[8] Canadian temperature data consist of homogeneity adjusted daily minimum and maximum values for 210 high quality (i.e., few missing values, minimal urban effects), relatively evenly distributed stations across the country. For these data, homogeneity problems caused by station relocation and changes in instrumentation and observing practices were addressed using a regression technique and surrounding stations [Vincent, 1998]. Adjustment factors for monthly temperatures were computed for identified inhomogeneities, and further interpolated into daily adjustment factors that were used to produce adjusted daily temperatures [Vincent et al., 2002]. This data set has been used in previous studies on changes in Canadian temperature extremes [Vincent and Mekis, 2006; Bonsal et al., 2001].

[9] Canadian precipitation data include adjusted daily rainfall and snowfall amounts observed at 495 stations across the country [Mekis and Hogg, 1999] (updated). All known inhomogeneities in the station data caused by changes in the measurement programs were carefully minimized. Wind undercatch, wetting loss, evaporation, trace events, and varying snow densities were also considered in the adjustment procedure. Inhomogeneity due to station relocation was not, however, addressed. A subset of this data set was used to investigate changes in heavy precipitation events in Canada [Zhang et al., 2001] and trends in precipitation intensity in Canada [Vincent and Mekis, 2006].

2.2. U.S. Data

[10] For the United States and Mexico, homogeneity adjusted daily data sets are not yet available. Instead, great care was taken to identify and remove from the analysis any station time series with discontinuities. The U.S. daily data were extracted from the Global Historical Climatology Network (GHCN) Daily data set (http://www1.ncdc.noaa.gov/pub/data/ghcn/daily/). The U.S. subset of GHCN-Daily comes from the National Weather Service Cooperative and First-Order weather observing stations, whose data have undergone quality control at NOAA's National Climatic Data Center. GHCN-Daily provides a few additional quality control checks. Unfortunately, data from many of these stations are inhomogeneous because of changes in observing location, time of day the observations were made, etc.

[11] Several steps were taken to limit the impact of inhomogeneities on the analyses. First, no data prior to 1950 were used, as longer time series have a greater chance of containing artificial discontinuities. Stations with fewer than 20 years of data were removed from the analysis as well. Then the data for the CONUS were subjected to statistical tests [Menne and Williams, 2005] that compared each station's temperature time series with those of neighboring stations. Stations with statistically significant change points in their temperature time series were removed from the analysis. This reduced the 12,581 possible CONUS stations in GHCN-Daily to 2606.
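As an illustration of the neighbor-comparison screening idea (not the actual algorithm of Menne and Williams [2005]), the following minimal Python sketch forms a candidate-minus-neighbor difference series of annual mean temperatures and flags the candidate station if the best single breakpoint produces a significant shift in the mean; the five-year minimum segment length, the rough multiple-testing correction, and the 5% level are illustrative assumptions.

```python
import numpy as np
from scipy import stats

def flag_step_change(candidate, neighbor, alpha=0.05):
    """Crude homogeneity screen: test every possible breakpoint in the
    candidate-minus-neighbor annual difference series with a two-sample
    t test and flag the station if the best split is significant.
    Illustrative only; not the Menne and Williams [2005] procedure."""
    diff = np.asarray(candidate, float) - np.asarray(neighbor, float)
    best_p = 1.0
    for k in range(5, len(diff) - 5):              # require >= 5 years per segment
        _, p = stats.ttest_ind(diff[:k], diff[k:], equal_var=False)
        best_p = min(best_p, p)
    n_tests = max(len(diff) - 10, 1)               # rough multiple-testing correction
    return best_p * n_tests < alpha                # True -> candidate is suspect

# Example: a 1 degree C jump in 1980 relative to a stable neighbor
years = np.arange(1950, 2005)
rng = np.random.default_rng(0)
neighbor_series = rng.normal(10.0, 0.5, years.size)
candidate_series = neighbor_series + rng.normal(0.0, 0.3, years.size) + (years >= 1980)
print(flag_step_change(candidate_series, neighbor_series))   # expected: True
```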

[12] Stations in Alaska, Hawaii, Puerto Rico, and the U.S. Virgin Islands are often too far apart for the neighboring stations statistical analysis [Menne and Williams, 2005] to work reliably. Each station time series from these regions was therefore individually appraised. After first removing stations with fewer than 20 years of data, plots of each time series were evaluated and stations with visible problems, such as large step changes in the time series or major periods with missing data, were removed. Then each station was subjected to a statistical homogeneity test (specifically, the RHTest, a homogeneity test written in the statistical language “R” for use at regional climate change workshops [Wang, 2003; X. L. Wang and Y. Feng, RHTest user manual, 2004, available at http://cccma.seos.uvic.ca/ETCCDI/RHtest/RHtestV2_UserManual.doc]) which evaluated changes in station time series. While this test is good at detecting changes in the characteristics of a time series and providing statistical significance for the detected changes, it does not provide guidance as to whether a change is due to changes in climate or changes in observing practices. Nevertheless, results of the RHTest were useful as a guide to an additional evaluation of time series, which was conducted using graphs and assessment of station history information. Stations that this assessment deemed inhomogeneous were removed from the analysis. The final step for Hawaii, Puerto Rico, and the U.S. Virgin Islands data was to consult with the appropriate State Climatologists. On the basis of their advice, a few additional stations were removed from the analysis. This process of removing the most inhomogeneous stations from the analysis reduced the total Alaska, Hawaii, Puerto Rico, and the U.S. Virgin Islands station count from 745 to 55, some of which only had precipitation data.

2.3. Mexican Data

[13] A subset of the 163 longest and most continuously operating temperature and precipitation stations was selected from data provided by the Servicio Meteorológico Nacional (SMN) of the Comisión Nacional del Agua (CNA). These stations cover Mexico north of 20°N, and their data mainly extend over the second half of the 20th century. Additional quality control and homogeneity assessments of the daily data were undertaken using the regional climate change workshop software RHTest and RClimDex (X. Zhang and F. Yang, RClimDex (1.0) user manual, 2004, available at http://cccma.seos.uvic.ca/ETCCDI/software.shtml). The QC consisted of preliminary checks to identify logical errors (e.g., Tmax less than Tmin, precipitation less than 0), flagging of potentially erroneous data defined as values exceeding a certain threshold, and visual inspection of the plotted Tmax, Tmin, and precipitation time series. Additionally, data points that were outliers from a time series perspective, exceeding 4 standard deviations (σ) for temperature or 6σ for precipitation, were also flagged as potential errors.
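A minimal sketch of these QC checks, assuming daily Tmax, Tmin, and precipitation arrays for one station, is given below. The 4σ and 6σ outlier rules follow the text; the hard precipitation ceiling and the wet-day restriction for the precipitation statistics are illustrative assumptions, and flagged values would still require the manual validation described in the next paragraph.

```python
import numpy as np

def qc_flags(tmax, tmin, prcp, prcp_ceiling=300.0):
    """Return boolean masks of daily values flagged as potential errors.
    Flagged values are candidates for manual validation, not automatic deletion."""
    tmax, tmin, prcp = (np.asarray(a, dtype=float) for a in (tmax, tmin, prcp))
    flags = {}
    # Logical errors: Tmax below Tmin, negative precipitation
    flags["tmax_lt_tmin"] = tmax < tmin
    flags["neg_prcp"] = prcp < 0.0
    # Values above a hard ceiling (placeholder threshold, in mm)
    flags["prcp_ceiling"] = prcp > prcp_ceiling
    # Outliers from a time series perspective: > 4 sigma for temperature
    for name, series in (("tmax_outlier", tmax), ("tmin_outlier", tmin)):
        mu, sd = np.nanmean(series), np.nanstd(series)
        flags[name] = np.abs(series - mu) > 4.0 * sd
    # > 6 sigma for precipitation, with statistics over wet days (assumption)
    wet = prcp[prcp >= 1.0]
    flags["prcp_outlier"] = prcp > np.nanmean(wet) + 6.0 * np.nanstd(wet)
    return flags
```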

[14] The potential errors were then validated or rejected (i.e., set to missing) by consulting (1) the original records; (2) independent sources, such as the Mexican Cold Front Index [Magaña and Vázquez, 2000] for temperature and precipitation records or the hurricane track records available from the National Hurricane Center for heavy precipitation events; (3) the values of adjacent days at the same station; (4) data from the same date at nearby stations; and (5) outgoing long-wave radiation anomalies and synoptic patterns based on reanalysis data [Kalnay et al., 1996] as well as known impacts of ENSO on precipitation [Magaña et al., 2003]. Only those values that were confirmed to be erroneous were set to missing and excluded from further analysis. From the 163 temperature and precipitation records, the 31 highest quality and most homogeneous daily maximum and minimum temperature time series and 56 daily precipitation time series were selected.

3. Indices and Analysis Techniques

[15] The set of indices used in this analysis is based on the 27 indices derived from daily data that were formulated and internationally coordinated by the joint World Meteorological Organization (WMO) Commission for Climatology (CCl)/World Climate Research Programme (WCRP) project on Climate Variability and Predictability (CLIVAR)/Joint WMO–Intergovernmental Oceanographic Commission of the United Nations Educational, Scientific and Cultural Organization (UNESCO) Technical Commission for Oceanography and Marine Meteorology (JCOMM) Expert Team on Climate Change Detection and Indices (ETCCDI). This suite of indices, available from http://cccma.seos.uvic.ca/ETCCDI/, has changed since it was first used [Frich et al., 2002; Peterson et al., 2002]. Some indices have been added, but more importantly the approach used to determine station level thresholds for percentile indices has been improved.

[16] The original approach calculated, for example, the 10th percentile of daily Tmax by determining the value of the 10th percentile of the data from a 5-d window centered on each calendar day during a base period, in this case 1961–1990. This calendar day-specific value was then used for that calendar day throughout the entire time series. However, it was later determined that this approach caused a slight discontinuity in the indices at the beginning and end of the base period. The solution was a bootstrap procedure [Zhang et al., 2005a] that used the same technique to determine the appropriate threshold value for years outside the base period but, for years inside the base period, used only the other 29 years to calculate the appropriate threshold. This changes the percentile thresholds slightly from year to year, but more importantly it ensures that the percentile thresholds used for any particular year were not calculated using data from that year. Also, the estimation of the threshold, and hence the exceedance rate, for percentile based temperature indices is influenced by observation precision. Many Mexican stations have a precision of 1°C, most U.S. stations have a precision of 1°F (0.56°C), and most Canadian stations have a precision of 0.5°C. Temperature readings with such coarse precision result in exceedance rates far from the nominal value. Therefore, a small random number is added to the daily temperature data so that the precision of the temperature data is effectively 0.1°C for all stations, avoiding this problem.
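The threshold construction can be sketched as follows, assuming arrays of daily values with their years and day-of-year indices. This is a simplification of the Zhang et al. [2005a] procedure (it only excludes the target year within the base period rather than carrying out the full resampling), and the jitter magnitude tied to the reporting resolution is an assumption about how the precision adjustment is implemented.

```python
import numpy as np

def jitter(temps, resolution, rng):
    """Add small uniform noise spanning the reporting resolution so coarsely
    reported data (1 deg C, 1 deg F, 0.5 deg C) behave like ~0.1 deg C data.
    The exact scheme in the ETCCDI software may differ."""
    return np.asarray(temps, float) + rng.uniform(-resolution / 2.0,
                                                  resolution / 2.0,
                                                  size=np.shape(temps))

def calendar_day_threshold(daily, years, doys, target_doy, pct,
                           base=(1961, 1990), exclude_year=None):
    """Percentile threshold for one calendar day from a 5-day window centered
    on that day over the base period.  If exclude_year is a year inside the
    base period, its data are left out, mimicking the rule that a year's own
    data never enter the threshold used for that year."""
    window = (np.abs(doys - target_doy) <= 2) | (np.abs(doys - target_doy) >= 363)
    in_base = (years >= base[0]) & (years <= base[1])
    use = window & in_base
    if exclude_year is not None:
        use &= years != exclude_year
    return np.percentile(daily[use], pct)
```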

[17] Over the last several years, this suite of indices was used to examine changes in extremes in five areas of the world where regional climate change workshops were held [Aguilar et al., 2005; Haylock et al., 2006; Klein Tank et al., 2005; New et al., 2006; Vincent et al., 2005; Zhang et al., 2005b] and one global analysis which incorporated the indices calculated at the regional workshops [Alexander et al., 2006]. For this North American extremes analysis, a few additional indices were calculated in a manner consistent with formulations used by the ETCCDI.

[18] Indices of relevant parameters were created on a station basis and then averaged together. For North American time series, anomalies of station level indices were first averaged into 2.5° latitude by 2.5° longitude grid boxes. Where a grid box did not have any stations, the values of the indices from neighboring grid boxes were interpolated into that grid box in order to make the averaging area more spatially representative. This primarily occurred in northern areas of Canada. The grid box values were then averaged on an area-weighted basis to create North American time series. The time series figures show the annual values and a smoothed line derived from a locally weighted regression (lowess filter) [Cleveland et al., 1988]. An advantage of a lowess filter is that it is robust to isolated extreme values and therefore depicts the underlying long-term changes quite well.
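A minimal sketch of the gridding and averaging step is given below, assuming station level index anomalies have already been computed. For brevity it omits the interpolation into empty grid boxes mentioned above and uses a simple cosine-of-latitude area weight for each box.

```python
import numpy as np

def north_american_series(lat, lon, anomalies, box=2.5):
    """Average station index anomalies into lat/lon grid boxes, then combine
    the box means with cos(latitude) area weights into one annual series.
    anomalies has shape (n_stations, n_years); NaN marks missing values."""
    lat_bin = np.floor(np.asarray(lat) / box).astype(int)
    lon_bin = np.floor(np.asarray(lon) / box).astype(int)
    boxes = {}
    for i, key in enumerate(zip(lat_bin, lon_bin)):
        boxes.setdefault(key, []).append(i)
    n_years = anomalies.shape[1]
    weighted_sum = np.zeros(n_years)
    weight_sum = np.zeros(n_years)
    for (lb, _), idx in boxes.items():
        box_mean = np.nanmean(anomalies[idx, :], axis=0)   # grid box average
        w = np.cos(np.deg2rad((lb + 0.5) * box))           # area weight of the box
        ok = ~np.isnan(box_mean)
        weighted_sum[ok] += w * box_mean[ok]
        weight_sum[ok] += w
    return weighted_sum / np.where(weight_sum > 0, weight_sum, np.nan)

# A smoothed curve analogous to the lowess lines in the figures could then be
# drawn with, e.g., statsmodels.nonparametric.smoothers_lowess.lowess.
```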

[19] To determine whether the changes represented by the lowess filter are significant, two different tests were applied. The first test compared the last third of the North American time series with the first third of the time series using a multiresponse permutation procedure (MRPP) assessment of the rank order of the data points [Mielke and Berry, 2001] to determine, on a broad basis, whether any significant change has taken place making the latter part of the time series, for example, warmer than the first part. All the time series addressed in this paper have changes that are significant at the 1% level with three exceptions: the maximum 1 d and 5 d precipitation analyses are significant at the 5% level, and the analysis of the 99th percentile of precipitation did not quite reach the 5% level. The second approach used Kendall's tau (described below) to determine whether a trend in the time series was significant, which is a somewhat different question. All North American averaged time series addressed in this paper had trends that were significant at the 5% level except maximum 1 d precipitation and precipitation above the 99th percentile.
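The first test can be illustrated with the simplified rank-based permutation sketch below: it compares the mean rank of the last third of the series with that of the first third and assesses significance by random relabeling. This is a two-sample stand-in for the multiresponse permutation procedure of Mielke and Berry [2001], not their exact statistic.

```python
import numpy as np

def first_vs_last_third_pvalue(series, n_perm=10000, seed=0):
    """Permutation test on ranks: is the mean rank of the last third of the
    series different from that of the first third?  Simplified MRPP stand-in."""
    x = np.asarray(series, dtype=float)
    third = len(x) // 3
    ranks = np.argsort(np.argsort(x)) + 1              # ranks of the full series
    first, last = ranks[:third], ranks[-third:]
    observed = abs(last.mean() - first.mean())
    pooled = np.concatenate([first, last]).astype(float)
    rng = np.random.default_rng(seed)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        if abs(pooled[third:].mean() - pooled[:third].mean()) >= observed:
            count += 1
    return count / n_perm                               # small p -> significant change
```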

[20] A map of an index shows grid box linear trends computed using a Kendall's tau based slope estimator [Sen, 1968]. This estimator is robust to the effect of outliers in the series. It has been widely used to compute trends in hydrometeorological series [e.g., Wang and Swail, 2001; Zhang et al., 2000]. The significance of the trend is determined using Kendall's test because this test does not assume an underlying probability distribution of the data series. There is, however, a problem associated with the Kendall test in that the result is affected by serial correlation of the series. Specifically, a positive autocorrelation in the residual time series (which is common in climatological data) results in more false detections of a significant trend than specified by the significance level [e.g., von Storch, 1995; Zhang and Zwiers, 2004]. This would make trend detection unreliable. Because of this, we use an iterative procedure to compute the trend and to test its significance while taking into account the lag-1 autocorrelation effect [Wang and Swail, 2001; Zhang et al., 2000]. Throughout our paper, a trend is considered significant if it is statistically significant at the 5% level. Grid boxes where the trend is significant are highlighted.
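A minimal sketch of the grid box trend calculation follows: the Kendall's tau based (Sen) slope with a Mann-Kendall significance test. The lag-1 autocorrelation handling is reduced here to a single prewhitening pass, a simplification of the iterative procedure of Wang and Swail [2001] and Zhang et al. [2000].

```python
import numpy as np
from scipy import stats

def sen_slope(y):
    """Median of all pairwise slopes (Kendall's tau based slope estimator)."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    pairwise = [(y[j] - y[i]) / (j - i) for i in range(n) for j in range(i + 1, n)]
    return np.median(pairwise)

def trend_significance(y, alpha=0.05):
    """Sen slope plus a Mann-Kendall test applied after one lag-1
    prewhitening pass (a simplification of the iterative procedure)."""
    y = np.asarray(y, dtype=float)
    t = np.arange(len(y))
    slope = sen_slope(y)
    resid = y - slope * t
    r1 = np.corrcoef(resid[:-1], resid[1:])[0, 1]        # lag-1 autocorrelation
    yw = y[1:] - r1 * y[:-1] if r1 > 0 else y            # prewhiten if positive
    _, p = stats.kendalltau(np.arange(len(yw)), yw)      # Mann-Kendall p value
    return slope, p, p < alpha
```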

[21] While the analysis used data from 1950 through 2004, not all grid boxes had the same period of record. A trend for a grid box was plotted in the figure if the grid box average time series was at least 40 years long. This low threshold allows the maps to be fairly complete and show the local trends in the indices where data are available. But it also means that some grid boxes are providing trends from different years and thereby enhances the appearance of spatial variability. Also, should a grid box have multidecadal variability, which is frequently seen in the North American time series, linear trends starting in 1950 may not represent the climate change over the last three decades very well.

4. Results

4.1. Similar Results Shown by Different Measures of Temperature Extremes

[22] The ETCCDI percentile indices of Tmax and Tmin were of the 10th and 90th percentiles. As each calendar day's threshold was determined separately, exceeding the 90th percentile is just as likely in winter as in summer. Examination of Figure 2 reveals that the number of days exceeding the 90th percentile of maximum and minimum temperature has been going up since the late 1960s. Similarly, the number of days below the 10th percentile has been going down, again since the late 1960s, with little change between 1950 and the late 1960s. Figure 3 shows the geographic distribution of the changes in days exceeding the 90th percentile of minimum temperature; the warming is most statistically significant in the west and north, with only small areas of cooling in the northeastern and southern parts of North America.

Figure 2.

Percent of days below the 10th (dashed) or above the 90th (solid) percentiles of daily maximum (red) or minimum temperature (blue). The heavy smoothed line is a lowess filter applied to the annual time series.

Figure 3.

Grid box (2.5° latitude by 2.5° longitude) level trends of days above the 90th percentile of minimum temperature. Grid boxes with trends that are statistically significant at the 5% level are outlined in green and have a green dot in their centers.

[23] The detection probability of trends depends on the return period of the event and the length of the observational series. For time series with a typical length of ∼50 years, the optimal return period for detection is 10–30 d [Frei and Schär, 2001; Klein Tank and Können, 2003]. It would be very difficult to detect a trend in a measure of extremes that occurs only a few times over the course of 50 years. While a threshold of extremes that on average is exceeded every 10 d facilitates detection of trends, such values are not the extremes that have the most significant impact.

[24] It was recommended at the July 2005 meeting on North American Weather and Climate Extremes: Progress in Monitoring and Research, held in Aspen, that extremes farther out on the tail be analyzed as well. Therefore, analysis was performed on the 10th, 5th, and 2.5th percentiles, not only to provide insights into changes in rarer events (e.g., days beyond the 2.5th percentile occur on average only about nine times a year), but also to see how changes in the data points farther out on the tails of the distribution compare to changes in the 10th percentile.

[25] Time series of North American area-averaged days exceeding the 90th, 95th and 97.5th percentiles of minimum temperature are shown in Figure 4 (top). The behavior of the 95th and 97.5th percentiles is similar to the 90th percentile but with smaller magnitude changes. To determine whether the difference in the magnitude of the change is due simply to having fewer total observations exceeding the higher thresholds, each of these North American area-averaged time series, the 90th, 95th and 97.5th, were normalized by dividing by the standard deviation of that time series. Examination of the normalized time series, shown in Figure 4 (bottom), reveals that the changes in the 90th, 95th and 97.5th percentile exceedances are almost identical.
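The normalization behind Figure 4 (bottom) is simply each exceedance series divided by its own standard deviation, which puts series with different baseline frequencies on a common scale; a one-function sketch:

```python
import numpy as np

def standardized_anomalies(series):
    """Anomalies divided by the series' standard deviation, putting the 90th,
    95th, and 97.5th percentile exceedance series on a common scale."""
    x = np.asarray(series, dtype=float)
    return (x - np.nanmean(x)) / np.nanstd(x)
```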

Figure 4.

(top) Percent of days exceeding the 90th (blue), 95th (black), and 97.5th (red dashed) percentiles of minimum temperature. (bottom) Standardized anomalies of the time series in Figure 4 (top). The thick smoothed lines are from a lowess filter applied to the annual time series.

4.2. Different Changes in Warmest and Coldest Extremes

[26] The percentile indices presented in the previous section were determined throughout the calendar year and were based on the number of days exceeding a threshold. To examine how the highest and lowest temperatures of the year have been changing, an index was used that assessed the actual temperature value. On a North American average, the highest summertime maximum and minimum temperatures experienced at weather observing stations have increased about 1°C since the mid-1960s (see Figure 5). By contrast, the lowest winter maximum and minimum temperatures have increased ∼3.5°C over the same time period.

Figure 5.

The highest (solid) maximum (red) and minimum (blue) temperatures observed in a year have increased ∼1°C since the mid-1960s while the coldest (dashed) maximum (red) and minimum (blue) temperatures observed in a year have increased ∼3.5°C since the late-1960s. The thick smoothed lines are from a lowess filter applied to the annual time series.

[27] Frich et al. [2002] found that an index based on the difference between the highest summertime maximum temperature and the lowest wintertime minimum temperature provided a very robust metric despite being based on only two observations per year. Analysis of the annual highest and lowest temperature extremes, based on only one observation per year per station per index, also produces time series that do not appear to be artifacts of data quality problems. This is likely because each of the data sets has undergone extensive QC, as described or referenced in section 2, and also because these single data points are not isolated values in need of QC but are the tips of the tails of the whole distribution of data points that undergo QC.

[28] In the last few decades, the high latitudes of Northern Hemisphere continents have been warming more rapidly than the tropics. But the high latitudes also have much greater year to year variability than the tropics. As examination of Figure 5 reveals, North American winter extremes also have much more year to year variability than summer extremes. Interestingly, however, on a normalized basis, winter extremes and summer extremes are changing quite similarly.

4.3. Increase in Heavy Precipitation

[29] Several indices have been developed to track changes in precipitation intensity, particularly heavy precipitation. Figure 6 shows the Simple Daily Intensity Index, which is simply the total annual precipitation divided by the number of days with precipitation equal to or greater than 1 mm. This small threshold is designed to ensure that changes in how an observing network treats trace precipitation do not impact the index. The increases in the Simple Daily Intensity Index shown in Figure 6 indicate that, on average, precipitation falling on days with precipitation has become heavier. The spatial distribution of trends in this index is not as uniform as the changes in temperature shown earlier. The annual precipitation from days exceeding the 95th and 99th percentiles of daily precipitation has been increasing on a North American area-averaged basis, as shown in Figure 7. The highest 1 d and 5 d precipitation totals are also increasing (see Figure 8).
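The precipitation indices shown in Figures 6–8 follow directly from their definitions; the sketch below assumes a daily precipitation array for one station-year, with the heavy precipitation threshold taken as a percentile of base period wet-day (≥1 mm) amounts, as is conventional for the ETCCDI percentile precipitation indices.

```python
import numpy as np

def simple_daily_intensity(prcp, wet_threshold=1.0):
    """Total annual precipitation divided by the number of days with
    precipitation >= 1 mm (the Simple Daily Intensity Index)."""
    prcp = np.asarray(prcp, dtype=float)
    wet = prcp >= wet_threshold
    return prcp[wet].sum() / wet.sum()

def precip_above_percentile(prcp, base_wet_day_amounts, pct=95.0):
    """Annual precipitation falling on days exceeding the given percentile
    of base period wet-day amounts (e.g., the 95th or 99th percentile)."""
    threshold = np.percentile(np.asarray(base_wet_day_amounts, float), pct)
    prcp = np.asarray(prcp, dtype=float)
    return prcp[prcp > threshold].sum()

def max_nday_precip(prcp, n=5):
    """Highest n-day running precipitation total in the year (n=1 or n=5)."""
    totals = np.convolve(np.asarray(prcp, dtype=float), np.ones(n), mode="valid")
    return totals.max()
```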

Figure 6.

(top) The Simple Daily Intensity Index has been increasing on a North American area-averaged basis. This index is simply the total annual precipitation divided by the number of days with precipitation (≥1 mm), so increases in this index indicate that on days when precipitation does occur, it tends to be heavier. The thick smoothed line is a lowess filter applied to the annual time series. (bottom) Grid box level trends of the index; grid boxes with trends that are statistically significant at the 5% level are outlined in magenta and have a magenta dot in their centers.

Figure 7.

Annual precipitation from days with precipitation equal to or greater than the 95th (solid) and 99th (dashed) percentiles has been increasing. The thick smoothed lines are from a lowess filter applied to the annual time series.

Figure 8.

Maximum 1 d (solid) and 5 d (dashed) precipitation has been increasing. The thick smoothed lines are from a lowess filter applied to the annual time series.

5. Conclusions

[30] Detailed homogeneity assessments of daily weather station data from Canada, the United States, and Mexico enabled analysis of changes in North American extremes starting in 1950. The measures of extremes assessed in this study were indices, and variations of indices, developed by the joint CCl/CLIVAR/JCOMM Expert Team on Climate Change Detection and Indices. The increase in the number of days exceeding the 90th percentile is about the same magnitude as the decrease in the number of days below the 10th percentile. Analysis of extremes farther out on the tails of the distribution (e.g., the 95th and 97.5th percentiles) reveals changes very similar to those at the 90th and 10th percentiles. Annual extreme cold temperatures are warming faster than annual extreme hot temperatures when the parameter measured is the actual temperature, but cold and hot extremes are changing at about the same rate when examined on a normalized basis. On the basis of several measures, heavy precipitation has been increasing over the last half century, and the average amount of precipitation falling on days with precipitation has also been increasing.

[31] The large domain dampens some of the decadal-scale variability found in some of the earlier studies of regions of North America, allowing a robust climate change signal to be revealed. These observed changes since the late 1960s, indicating decreases in cold extremes, increases in warm extremes, and increases in heavy precipitation, are consistent with a warming planet. These changes in extremes are likely to impact natural ecosystems as well as agricultural and societal infrastructure.

Acknowledgments

[32] We thank Pao-Shin Chu and Cheri Laughran from the office of the Hawaiian State Climatologist and Robin Williams of the Office of the Puerto Rico State Climatologist for their advice regarding the reliability of data from specific stations in their areas. We greatly appreciate the staff of the Servicio Meteorológico Nacional (SMN) of the Comisión Nacional del Agua (CNA) for providing the Mexican data. Éva Mekis and Lucie Vincent, of Environment Canada, deserve acknowledgment not only for providing Canadian data but also for adjusting them to account for inhomogeneities.
