Geophysical Research Letters

Are climatic or land cover changes the dominant cause of runoff trends in the Upper Mississippi River Basin?

Authors


Corresponding author: C. D. Frans, Department of Civil and Environmental Engineering, University of Washington, 261 Wilcox Hall, Box 352700, Seattle, WA, 98195, USA. (chrisf2@u.washington.edu)

Abstract

[1] The Upper Mississippi River Basin (UMRB) has experienced a remarkable agricultural extensification since the mid-1800s. Hydroclimatological monitoring in the 20th century also reveals positive annual precipitation and runoff trends in the UMRB. While several studies have proposed land use/land cover (LULC) change as the primary cause of the runoff increase, little is known about the dominant controls of hydrologic change in the UMRB. We used a macroscale hydrology model to assess the hydrologic implications of climate and LULC changes between 1918 and 2007. Modeling results, corroborated by hydroclimatologic data analysis, identify climate change as the dominant driver of runoff change in the UMRB. At local scales, modeled annual runoff decreased (increased) by up to 9% (5%) where grasslands (forests) were replaced by croplands. Artificial field drainage amplified annual runoff by as much as 13%. These findings are critical for water and nitrogen management in the UMRB under changing climate and land use.

1 Introduction

[2] The Upper Mississippi River Basin (UMRB) is arguably one of the leading examples of extensive land use/land cover (LULC) change in the United States (Figure 1). Conversion of natural vegetation in the region began as early as the 1850s with Euro-American settlement [Steyaert and Knox, 2008]. The natural (climax) vegetation of the UMRB consisted of grassland (17%), wooded grassland (51%), mixed forest (23%), evergreen needleleaf forest (7%), and deciduous broadleaf forest (2%) (Figure 1a) [Ramankutty and Foley, 1999]. Since the mid-1800s, approximately half of all the land within the 443,000 km2 basin has been converted largely to annual row crops of maize and soybean, at the expense of grasslands, wooded grasslands, and forests in the northwest (NW), southeast (SE), and central parts of the basin, respectively (Figures 1a and 1b). During the period of 1918–2007 for which hydroclimatic observations of reasonable quality are available, the basin's cropland fraction grew from 43% to as high as 58% in 1980 and has gradually decreased since then (to ~49% in 2007, Figure 1b) [Ramankutty and Foley, 1999].

Figure 1.

Maps of (a) climax vegetation of the UMRB and the two streamflow gages used in the study (outlet is the USGS gage 05587450 Mississippi River (MR) at Grafton, IL, and the inner gage is the USGS gage 05474500 at Keokuk, IA, with a drainage area of 71% of the UMRB), (b) net change in cropland fraction between 1918 and 2007 interpolated from Ramankutty and Foley [1999], (c) statistically significant (p < 0.05) local annual precipitation trends (mm/century) between 1918 and 2007, and (d, e) relative contribution of climate change in Figure 1d and LULC change in Figure 1e to modeled runoff trends (areas with no significant trends or where the relative contribution is zero are shown in white).

[3] Parallel with LULC change, the UMRB has also experienced climate change. Over the course of the 20th century, the climate of the region has become wetter [Milly and Dunne, 2001; Villarini et al., 2011], and the diurnal temperature range has narrowed [Bonan, 2001]. In this study we developed gridded precipitation and temperature data at a 1/8 degree spatial resolution (~12 km) using over 1000 observation stations, with records acquired from the National Climatic Data Center (see Text S1 in the Supporting Information). The mean annual precipitation for the 1917–2007 period is ~800 mm over the entire UMRB. The data reveal positive trends in basin-averaged annual precipitation (+112 mm/century, p = 0.01), with local trends in the +65 to +240 mm/century (p < 0.05) range in the NW, central, and SE portions of the basin (Figure 1c). These annual trends are largely due to precipitation increases in July and August (+27 and +22 mm/century, p = 0.005 and 0.046, respectively, Figure S1), with little or no change in winter and spring. Here we define the observed changes in precipitation as climate change, while acknowledging that this change in climate may in fact be induced by land use change and irrigation in an adjacent region [Kustu et al., 2011; DeAngelis et al., 2010]. Consistent with the trends in precipitation, positive trends in annual runoff and in daily median and minimum streamflow have also been widely reported during the 20th century in the UMRB [Lins and Slack, 1999; Groisman et al., 2004; Douglas et al., 2000]. Observed annual runoff trends are +82 mm/century at the outlet (Mississippi River (MR) at Grafton, IL) and +99 mm/century at MR near Keokuk, IA (see Figure 1a for gage locations).
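The trend magnitudes and significance levels quoted above can, in principle, be reproduced with a simple least squares fit of an annual series against year. The sketch below is illustrative only; the exact statistical test used in the study is not specified here, so the use of ordinary least squares via scipy.stats.linregress and the synthetic example series are assumptions.

```python
import numpy as np
from scipy.stats import linregress

def annual_trend_per_century(years, values):
    """Return (trend per century, two-sided p value) for a yearly series."""
    result = linregress(years, values)
    return result.slope * 100.0, result.pvalue

# Hypothetical basin-averaged annual precipitation, 1918-2007 (mm/yr)
years = np.arange(1918, 2008)
rng = np.random.default_rng(0)
precip = 800.0 + 1.1 * (years - years.mean()) + rng.normal(0.0, 60.0, years.size)

trend, p = annual_trend_per_century(years, precip)
print(f"trend = {trend:+.0f} mm/century, p = {p:.3f}")
```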

[4] In contrast to precipitation, the calculated annual short-grass reference evapotranspiration (ref-ET) [Allen et al., 1998], a surrogate for the atmospheric water demand for ET, exhibits a negative trend (−43 mm/century, p = 0.01) over the 1918–2007 period. The negative ref-ET trend is consistent with the narrowing of the diurnal temperature range from May through September (−2.6, −2.4, −3.4, −4.7, and −3.1°C/century, respectively, with p = 0.005, p = 0.003, p < 0.0001, p < 0.0001, and p = 0.003), which might be attributed to stronger evaporative [Bonan, 2001] and radiative cooling associated with enhanced seasonal precipitation [Milly and Dunne, 2001; Pan et al., 2004].
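The link between a narrowing diurnal temperature range and declining ref-ET can be made concrete with the temperature-based Hargreaves equation given in Allen et al. [1998]; whether the study used this or the full Penman-Monteith formulation is not stated here, so the sketch below is purely illustrative.

```python
import math

def hargreaves_ref_et(tmax_c, tmin_c, ra_mj_m2_day):
    """Daily short-grass reference ET (mm/day), Hargreaves form of Allen et al. [1998].

    tmax_c, tmin_c : daily maximum/minimum air temperature (deg C)
    ra_mj_m2_day   : extraterrestrial radiation (MJ m-2 day-1)
    """
    tmean = 0.5 * (tmax_c + tmin_c)
    ra_mm = 0.408 * ra_mj_m2_day  # radiation expressed as equivalent evaporation
    return 0.0023 * (tmean + 17.8) * math.sqrt(max(tmax_c - tmin_c, 0.0)) * ra_mm

# Same mean temperature, 3 deg C narrower diurnal range -> lower ref-ET
print(hargreaves_ref_et(30.0, 18.0, 40.0))  # wider diurnal range
print(hargreaves_ref_et(28.5, 19.5, 40.0))  # narrower diurnal range
```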

[5] Little is known about the relative influence of climate and LULC changes in shaping the hydrology of the UMRB at varying spatial scales. Most previous studies have examined either the influence of climate change [Qian et al., 2007; Milly and Dunne, 2001; Pan et al., 2004] or that of LULC change [Zhang and Schilling, 2006; Schilling et al., 2010] on UMRB hydrology, while only a few have considered their joint influence [Tomer and Schilling, 2009; Mishra et al., 2010]. Mishra et al. [2010] examined the hydrologic impacts of deforestation in Wisconsin, while Tomer and Schilling [2009] used a water-energy balance approach for watersheds in Iowa and Illinois. In this study we examine the relative contributions of 20th century LULC and climate changes to the observed runoff trends of the UMRB using a numerical model of land surface hydrology and hydroclimatologic data analysis.

2 Hydrology Model and Data Sets

[6] We used the Variable Infiltration Capacity (VIC) macroscale hydrologic model [Liang et al., 1994; Cherkauer et al., 2003], together with an offline routing model [Lohmann et al., 1996], to simulate land surface hydrologic fluxes and streamflow under three different LULC scenarios in the UMRB. The model was implemented at a 1/8 degree spatial resolution (~12 km). VIC represents three layers in the soil column, with the lowest layer acting as a baseflow reservoir. Plant roots extract moisture from all three layers, controlled by plant physiologic characteristics and the evapotranspiration demand of the atmosphere. The model was forced with gridded meteorological data (precipitation, minimum and maximum temperature, and wind speed) covering the period 1915–2007. The data were produced following the methods of Hamlet and Lettenmaier [2005], which reduce biases in the gridded meteorological forcing arising from temporal and spatial inconsistencies in the observed station records (Text S1). The model was calibrated for the 1942–1950 period using multiobjective parameter search algorithms [Yapo et al., 1998] to identify optimal first-order parameters that control runoff generation. For both the calibration and simulation periods, the calculated Nash-Sutcliffe efficiency [Nash and Sutcliffe, 1970] for annual and monthly streamflow exceeded 0.80 (Text S2 and Figure S2).
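For reference, the Nash-Sutcliffe efficiency used to evaluate the calibration compares squared simulation errors with the variance of the observations; a minimal sketch follows, with illustrative numbers rather than the study's data.

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

obs_monthly = np.array([22.0, 35.0, 61.0, 70.0, 55.0, 40.0])  # runoff, mm/month
sim_monthly = np.array([25.0, 33.0, 58.0, 72.0, 50.0, 42.0])
print(f"NSE = {nash_sutcliffe(obs_monthly, sim_monthly):.2f}")
```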

[7] The LULC change of the UMRB was represented using annual vegetation class maps, developed by combining data on annual fractional cropland extent with the potential "climax" vegetation of the region [Ramankutty and Foley, 1999] (Text S3). Annual row crops of maize and soybean constitute the majority of the croplands in the UMRB (>80%) [Monfreda et al., 2008]. Both crops have high aboveground biomass and are often planted in rotation [Donner et al., 2004]. Because of the uncertainties with crop rotation and the similarity of the major physiological parameters of maize and soybean, namely leaf area index (LAI) [Twine et al., 2004; Lokupitiya et al., 2009], we used physiological parameters for maize to represent row crops in our model, following previous model applications in the Midwest [Mao and Cherkauer, 2009; Mishra et al., 2010]. Twine et al. [2004] also noted little difference in water balance when maize and soybean are simulated explicitly in a land surface model. Irrigation is neglected in the model, as the irrigated fraction of the basin is estimated to be ~1.3% [Siebert et al., 2007]. Seasonality of natural vegetation and stages of crop growth are represented by varying the biophysical parameters of each plant type in the model, based on Land Data Assimilation System data adjusted to reflect UMRB conditions following Mao and Cherkauer [2009].
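One way such annual vegetation maps can be assembled is to scale each grid cell's climax cover into the area not occupied by cropland in a given year. The exact blending rule used in the study is described in Text S3 and not reproduced here, so the function below is a hypothetical sketch.

```python
def vegetation_fractions(crop_fraction, climax_fractions):
    """Combine a year's cropland fraction with climax vegetation cover.

    crop_fraction    : fraction of the grid cell under row crops that year
    climax_fractions : dict mapping climax vegetation class -> fraction (sums to 1)
    """
    cover = {veg: frac * (1.0 - crop_fraction) for veg, frac in climax_fractions.items()}
    cover["cropland"] = crop_fraction
    return cover

# Example: a cell that was 70% wooded grassland and 30% grassland before settlement
print(vegetation_fractions(0.55, {"wooded_grassland": 0.70, "grassland": 0.30}))
```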

[8] Three LULC scenarios were developed to represent changes to the land surface. In a base case (first scenario), the cropland extent and vegetation types for 1918 were fixed throughout the 1918–2007 simulation period. For the second scenario, changes in LULC were represented by updating the fraction of each grid cell that is covered with cropland each year in accordance with Ramankutty and Foley [1999]. Biophysical parameters of the model were updated monthly as vegetation responds to seasonal climate. The third scenario was intended to represent the effects of widespread installation of artificial drainage in the west central and southeastern parts of the basin (~14% of the total UMRB area [Sugg, 2007]), by changing the parameters of the model's baseflow component through time to allow enhanced soil moisture loss from the root zone. The parameters used to represent the tile drainage process were identified from numerical experiments using the hydrology of the Raccoon River watershed in central Iowa (Text S4 and Figures S3 and S4), where artificial drainage is widespread [Sugg, 2007; Schilling et al., 2008]. This third scenario also used the dynamic representation of land cover from scenario 2.

[9] In large river basins, reservoirs and water management activities can influence streamflow and therefore may need to be accounted for in hydrologic modeling [e.g., Haddeland et al., 2006]. In the UMRB, however, most water management structures serve navigation purposes (e.g., locks, low-head dams, and dikes) and are placed to maintain channel depth [Chen and Simons, 1986]. Vogel et al. [1999] reported the reservoir storage ratio (ratio of storage capacity to mean annual flow) of this region to be close to 0, among the lowest in the United States, which makes the basin suitable for exploring climate and LULC change impacts. In our model we therefore assume that flow regulation and storage in the UMRB have no influence on monthly and annual mean discharge. Additionally, at the monthly time scale of our analysis, we assume that any nonconsumptive human use of water from the UMRB returns to the system.

3 Results

[10] Modeled time series of annual and mean monthly runoff (streamflow volume divided by drainage area) at the UMRB outlet at Grafton, IL, and at Keokuk, IA, are shown in Figure 2. The model scenarios all show very similar annual runoff responses, with positive (~55 mm/century) but smaller-than-observed (~80 mm/century) runoff trends, in part because the model underpredicted high-flow years such as 1973 and 1993. This underprediction may be related both to model limitations in representing saturation patterns under extreme precipitation and to measurement uncertainties in precipitation (e.g., rain undercatch and frozen precipitation) and flood discharge. At these gauges in the lower basin, LULC change led to only very subtle differences in predicted mean monthly runoff, on the order of a few millimeters.
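As a unit note, the runoff depths compared here follow from dividing streamflow volume by drainage area. The conversion is sketched below; the basin area is taken from the text, while the example discharge is illustrative rather than an observed value.

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600.0

def runoff_depth_mm_per_year(mean_discharge_m3s, drainage_area_km2):
    volume_m3 = mean_discharge_m3s * SECONDS_PER_YEAR   # mean annual flow volume
    depth_m = volume_m3 / (drainage_area_km2 * 1.0e6)   # spread over the basin area
    return depth_m * 1000.0                             # metres -> millimetres

# UMRB at Grafton, IL (~443,000 km2) with an illustrative discharge of 2800 m3/s
print(f"{runoff_depth_mm_per_year(2800.0, 443000.0):.0f} mm/yr")
```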

Figure 2.

Observed and simulated annual and mean monthly runoff for the Mississippi River at (a, b) Grafton, IL (UMRB outlet) and (c, d) Keokuk, IA, for all three LULC simulations.

[11] The model results illustrate that, at the UMRB scale, the net ~10% increase in basin-wide cropland extent (relative to the UMRB area) during the 1918–2007 period had no significant influence on modeled annual runoff, while wetter climatic conditions in the basin dominated the modeled (and, by inference, observed) runoff trends. We suggest two alternative explanations for this finding: either the model is not as sensitive to vegetation change as the real system, or directional differences in the local hydrologic response to climate and LULC changes compensate for each other over the scale of the whole basin, leading to a relatively muted basin-wide response to LULC change, while the observed streamflow carries the signature of the wetter climate as a positive trend.

[12] To evaluate the model's sensitivity to LULC change, we conducted hypothetical simulations for a single grid cell located in central Iowa. Assuming 100% coverage of each vegetation type in the cell, the model predicts 31% higher (11% lower) mean annual runoff under croplands than under forests (grasslands and wooded grasslands) (Figure S4). These results are consistent with earlier modeling work [Twine et al., 2004; Mao and Cherkauer, 2009] that reported a 3% to 60% increase (8% to 45% decrease) in runoff with forest-to-crop (grass-to-crop) conversion in this region, depending on local climate, soils, and the type of crop. These differences are primarily a manifestation of differences in ET among vegetation types and their effect on streamflow.

[13] Given that the model is clearly sensitive to LULC change, we examined our alternative hypothesis. We first examined how LULC influences mean annual water fluxes by plotting the percent differences in mean annual runoff (surface runoff + baseflow), baseflow, and evapotranspiration of the dynamic LULC scenarios 2 and 3 with respect to the static 1918 LULC simulation (Figure 3). In scenario 2, relative to the static simulation, local mean annual runoff is predicted to be lower by ~5% on average (by as much as 8.5%), and baseflow lower by up to 10%, in the NW to SE corridor of the basin where grasslands and wooded grasslands were converted to croplands (Figures 3a and 3b). In contrast, in the central and northern basin where some forest-to-cropland conversion occurred, modeled runoff was larger by as much as 5% during the simulation; those changes were short-lived, however, as croplands contracted, leading to only a slight (~2%) increase in mean annual runoff over the 1918–2007 simulation period. In scenario 3, total runoff is predicted to be as much as 12% larger in artificially drained areas (~14% of the basin) (Figure 3d), with an even larger difference in baseflow, up to 20%, relative to 1918 LULC conditions (Figure 3e). Across the UMRB, changes in runoff were compensated by changes in ET: except in the regions with artificial drainage and forest-to-crop conversion, ET increased slightly across the basin (Figures 3c and 3f).

Figure 3.

Modeled percent differences in (a, d) mean annual total runoff, (b, e) subsurface flow, and (c, f) evapotranspiration between dynamic land cover (scenario 2) and 1918 LULC case (scenario 1) (upper row) and between dynamic land cover with artificial drainage (scenario 3) and 1918 LULC case (scenario 1) (lower row). Areas where drainage algorithms were applied are clearly seen where the percent change in baseflow exceeds ~12% in Figure 3e.

[14] The space-time averaged differences in modeled annual runoff between the static 1918 LULC scenario and the dynamic LULC scenarios 2 and 3 are −3.8 mm and −0.09 mm (with artificial drainage), respectively. These subtle differences indicate that, despite considerable changes in local water fluxes between the static and dynamic LULC scenarios, changes in mean annual runoff at the UMRB outlet are muted by cancelling effects across the basin (Figure 3).

4 Discussion and Conclusions

[15] While a few studies have attributed the observed runoff trends to a progressively wetter climate, both in small agricultural watersheds [Tomer and Schilling, 2009] and at the scale of the greater Mississippi River Basin [Qian et al., 2007], LULC change was identified as the major cause in most empirical studies that relate annual runoff to changes in crop area [Raymond et al., 2008; Zhang and Schilling, 2006; Schilling et al., 2010]. In our model experiments, all three LULC scenarios led to positive trends in modeled annual runoff, with very little difference in mean annual runoff at the UMRB outlet among the scenarios. These results support the hypothesis that increasing regional precipitation is the dominant driver of positive runoff trends in the UMRB [Tomer and Schilling, 2009]. Given the negative trend in ref-ET, however, it is critical to examine whether a trend in actual ET amplifies the observed runoff trend in the basin.

[16] We mapped grid cells with statistically significant (p < 0.05) local trends in modeled annual runoff (surface + baseflow) and ET for all three scenarios for the 1918–2007 period (Figure 4). The similarity of the spatial patterns of modeled runoff trends (Figures 4a and 4c) and ET trends (Figures 4b and 4d) between the static and dynamic LULC simulations suggests that most regions where LULC change took place after 1918 (Figure 1b) do not show a concentration of significant positive trends. Instead, the spatial patterns of positive runoff and ET trends in both the static and dynamic LULC simulations much more closely resemble the spatial pattern of positive precipitation trends (Figure 1c). When artificial drainage is modeled, however, local runoff trends have a larger extent and magnitude in the west central and southern basin, while ET trends become negative and mostly statistically insignificant (the tile-drained region is indicated by a dashed line in Figures 4e and 4f). Positive trends in annual runoff and baseflow and negative trends in ET (estimated from water balance) have been reported in tile-drained catchments in the UMRB [Schilling et al., 2008]. Despite these modeled local trends, however, we could not detect trends in the modeled or inferred annual ET (from basin water balance) at the scale of the UMRB, consistent with the reported lack of ET trends in the central Mississippi Valley [Qian et al., 2007]. The climatic signature visually prevails in these spatial trends; nevertheless, it is important to evaluate the relative influences of climate and LULC quantitatively.

Figure 4.

Statistically significant modeled trends in local runoff and evapotranspiration for the (a, b) static 1918 LULC scenario, (c, d) dynamic LULC scenario, and (e, f) dynamic LULC scenario with artificial drainage. The locations where the drainage algorithm was applied are enclosed with a dashed line in Figures 4e and 4f.

[17] To demonstrate more clearly the spatial control of climate and LULC on the runoff trends, we mapped the relative contributions of climate and LULC changes to the trends of modeled annual runoff for each model element across the UMRB. The relative contribution of climate is mapped as the trend (p < 0.05) of modeled runoff under the static 1918 LULC scenario (Figure 4a) divided by the modeled runoff trend under dynamic LULC with artificial drainage (scenario 3, Figure 4e) in each grid cell (Figure 1d). The LULC change contribution is mapped by subtracting the static 1918 LULC runoff trend from that of scenario 3 (Figures 4a and 4e) and dividing the difference by the trend of scenario 3 (Figure 1e). The figure clearly shows the dominant control of climate in nearly all regions (>90%) where annual precipitation increased, despite the "muting" effects of cropland conversion in the grasslands of the northwest basin. Tile drainage dominates modeled runoff trends only in small pockets of land in the southeast basin and contributes up to 40% to runoff trends in the central parts of the western basin.
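A minimal sketch of this per-cell attribution follows; the function and the example trend values are illustrative, not taken from the study.

```python
def relative_contributions(trend_static_1918, trend_scenario3):
    """Return (climate, LULC) shares of the scenario 3 runoff trend in one grid cell."""
    climate = trend_static_1918 / trend_scenario3
    lulc = (trend_scenario3 - trend_static_1918) / trend_scenario3
    return climate, lulc

# Example cell: +60 mm/century under static 1918 LULC, +66 mm/century under scenario 3
climate_share, lulc_share = relative_contributions(60.0, 66.0)
print(f"climate: {climate_share:.2f}, LULC (incl. drainage): {lulc_share:.2f}")
```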

[18] The model results provide evidence that, at the UMRB scale, the 20th century streamflow trends are largely due to a progressively wetter climate (with higher precipitation and lower evaporative demand) rather than to LULC change. For an empirical corroboration of this model result, we employed the statistical approach of Thompson et al. [2000], partitioning observed runoff trends into components linearly congruent with, and independent of, annual precipitation for several major subbasins of the UMRB. We found that the fractions of the observed annual runoff trends that are linearly congruent with the annual precipitation trends exceed 0.5 and reach as high as 0.89 in subbasins across the UMRB, with the highest fraction in the artificially drained Raccoon River. This empirical evidence underscores the precipitation signal in observed annual runoff in the UMRB (Text S5 and Table S1).
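A sketch of how such a partition can be computed is given below, assuming the standard regression form of the Thompson et al. [2000] approach: runoff is regressed on precipitation, the congruent part of the runoff trend is the regression slope times the precipitation trend, and the congruent fraction is that part divided by the total runoff trend. The synthetic series are illustrative only.

```python
import numpy as np
from scipy.stats import linregress

def congruent_fraction(years, runoff, precip):
    slope_qp = linregress(precip, runoff).slope     # runoff sensitivity to precipitation
    precip_trend = linregress(years, precip).slope  # precipitation trend (mm/yr per yr)
    runoff_trend = linregress(years, runoff).slope  # runoff trend (mm/yr per yr)
    congruent = slope_qp * precip_trend             # congruent part of the runoff trend
    return congruent / runoff_trend

# Hypothetical annual series for a subbasin, 1918-2007
years = np.arange(1918, 2008)
rng = np.random.default_rng(1)
precip = 820.0 + 1.0 * (years - 1962) + rng.normal(0.0, 70.0, years.size)
runoff = 0.3 * precip - 60.0 + rng.normal(0.0, 25.0, years.size)
print(f"congruent fraction = {congruent_fraction(years, runoff, precip):.2f}")
```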

[19] Our model results also show that the large increase of croplands in areas that were formerly grass and wooded grasslands has slightly enhanced ET, muting the local runoff response to precipitation increase compared to the 1918 LULC conditions (Figure 4). Artificial drainage, in contrast, amplified both the magnitude and the trend of local runoff production due to enhanced baseflow contribution, consistent with local studies of water balance of tile-drained catchments in the region [Schilling and Libra, 2003; Tomer and Schilling, 2009].

[20] The UMRB is the source of the majority of the nitrogen flux into the Gulf of Mexico (GOM). If climate change is in fact the major driver of streamflow trends in the UMRB, amplified by agricultural practices such as tile drainage, managing nitrogen fluxes to the GOM will become even more challenging as biofuel crop production grows to meet the nation's 2020 biofuel target and as precipitation extremes increase under the projected 21st century climate [Mehaffey et al., 2012].

Acknowledgment

[21] Paolo D'Odorico thanks two anonymous reviewers for their assistance in evaluating this paper. The authors would like to thank the two anonymous reviewers whose comments improved the manuscript.
