Corresponding author: C. Tague, Bren School for Environmental Science and Management, University of California, Santa Barbara, CA 93106, USA. (email@example.com)
 Forest responses to warming, in the absence of changes in vegetation structure, reflect the balance between the increased atmospheric demand for water and changes in water availability. This study uses a coupled hydroecologic model applied to a snow-dominated mountain watershed to demonstrate how complex topography and interannual variation in climate drivers combine to alter the balance between moisture availability and energy demand. We focus specifically on how often and under what conditions changes in the timing of soil water recharge as precipitation or snowmelt are a significant control on forest actual evapotranspiration (AET) in the Central California Sierra. We show that while interannual variation in precipitation is the dominant control on interannual variation in AET, how much of that recharge accumulates as a seasonal snowpack can act as a second-order control. This sensitivity of AET to snow accumulation and melt occurs across a substantial elevation range (1800–2700 m) and at both aggregate watershed and 90 m patch scales. Model results suggest that the variation in AET due to recharge timing is greatest for patches and years with moderate levels of precipitation or patches that receive substantial lateral moisture inputs. For a 3°C warming scenario, the annual AET increases in some years due to warmer temperatures but decreases by as much as 40% in other years due to an earlier timing of snowmelt. These results help to clarify the conditions under which water availability for forests decreases and highlight scenarios that may lead to increased drought stress under a warming climate in snow-dominated mountain regions.
 Projected and observed hydrologic consequences of climate change reflect the intensification of the hydrologic cycle—more intense and more frequent droughts, increases in extreme precipitation events, and associated flooding and greater evaporative forcing [Wentz et al., 2007; Willett et al., 2007; Seager et al., 2009]. Because vegetation and soil biogeochemical cycling are sensitive to the timing and magnitude of water availability, terrestrial ecological processes are tightly coupled to these hydrologic changes [Lohse et al., 2009], particularly in annually or seasonally water-limited regions [Hanson and Weltzin, 2000]. Vegetation and soil responses may in turn influence evaporative fluxes and moisture and energy feedbacks to the atmosphere [Pielke et al., 1998; Small and Kurc, 2003]. Within the western U.S., water availability is an important control on ecosystem productivity [Case and Peterson, 2005; Boisvenue and Running, 2006] and vulnerability to disturbances, including fire [Westerling et al., 2006; McKenzie et al., 2008], insects, and drought-related mortality [Westerling et al., 2006; van Mantgem et al., 2009; Allen et al., 2010; Bentz et al., 2010]. In the more semiarid areas of the western U.S., most years show evidence of summer water limitation [Hanson and Weltzin, 2000; Ryel et al., 2008; Goulden et al., 2012]. In these regions, actual evapotranspiration (AET) can be more than 80% of the annual budget, and downslope effects of forest water use on streamflow can be large, although the impacts of changes in forest productivity on streamflow vary with climate, species, understory dynamics, and the partitioning between evaporation and transpiration [Brown et al., 2005; Mast and Clow, 2008; Tague and Dugger, 2010; Guardiola-Claramonte et al., 2011]. Under a warming climate, higher temperatures are likely to lead to greater water demand and potentially earlier summer water stress. Elevated CO2 may ameliorate some climate-driven increases in water stress. 
Many field-based studies, however, do not show evidence of this effect, and the primary effect of elevated CO2 appears to be increased growth rather than reduced water use [Soulé and Knapp, 2006]. Warmer temperature can also lengthen the growing season, leading to greater spring vegetation productivity, although this may be balanced by higher respiration and reductions in productivity due to water limitation later in the summer [Huxman et al., 2003; Goulden et al., 2012]. Earlier warm spring temperatures may combine with higher vapor pressure deficits, leading to increased water loss in the spring and more rapid soil water depletion later in the summer [Huxman et al., 2003]. Higher temperatures and higher vapor pressure deficits during the summer can also result in stomatal closure and reduced evapotranspiration and productivity [Goldstein et al., 2000].
 Much of the forested landscape in the western U.S. is located in snow-dominated mountain watersheds that are expected to show strong sensitivity to climate warming [Barnett et al., 2005]. These regions are unique as locations where warming can alter the timing of water inputs directly by changing snowpack dynamics. In the mountains of the western U.S., earlier snowmelt and reduced snow accumulation due to warming temperatures have been linked with earlier snowmelt runoff, increased peaks, and reductions in summer streamflow [Barnett et al., 2005; Rood et al., 2008]. For example, Stewart et al.  showed statistically significant shifts up to 4 weeks in the timing of center of mass of streamflow (since 1948) in a broad regional analysis of watersheds within the western U.S. For forests, the seasonal synchronicity between the timing of water input and the timing of energy demands can also be an important control on the water availability for transpiration. In much of the western U.S., the timing of peak precipitation input occurs during the winter and is offset from peak energy availability for plant water use and evaporation. Storage of this water as snow extends the timing of water input (as snowmelt) to later in the season. Because the dominant water input occurs during the winter or spring snowmelt period, AET typically declines during the late summer as soils are depleted of available water, and AET becomes less than potential evapotranspiration (PET). Shifting from snow to rain typically leads to earlier water input. The implications of this change in the timing of recharge, however, will likely depend on a range of factors, including rates of snowmelt relative to forest activity during the spring snowmelt period, soil storage capacity, and summer moisture and energy inputs. On the one hand, earlier or more rapid snowmelt can potentially increase wet season runoff, thus decreasing soil recharge and exacerbating summer water limitations. 
On the other hand, for a given annual precipitation input, temperature increases can increase PET, increasing growing season length during which forests are actively transpiring. Thus, whether annual AET increases with warmer temperatures and earlier snowmelt will depend on whether increases in summer water limitations balance the increase in spring PET and AET.
 Previous empirical and model-based analyses demonstrate a relationship between vegetation productivity and climate drivers, including snow accumulation and melt. An elevation-based transition between temperature and water-limited environments is well documented in the western U.S. Tree ring-based studies of interannual variation in growth show clear spatial patterns with increased growth with warmer temperatures at high elevations, transitioning to increased growth with higher precipitation at lower elevations [Bunn et al., 2005; Case and Peterson, 2005; Nakawatase and Peterson, 2006]. Recent empirical work suggests that the amount of snow may be an important control on forest productivity. Remote sensing of interannual variation in greenness for the California Sierra shows that vegetation greenness in middle elevations between 2000 m and 2600 m has the greatest positive correlations with prior winter snowpack accumulation [Trujillo et al., 2012]. Since snowpack accumulation is highly correlated with total precipitation inputs, greater vegetation greenness may be due to greater total precipitation in high snow years or to later recharge timing associated with more snow versus rain and later melt. Thus, the implications for vegetation water availability of a temperature-driven change in recharge timing due to less snow accumulation and earlier melt remain unclear. Model-based studies vary in their estimates of evapotranspiration changes with warming across the west. For example, Hamlet et al.  applied a hydrologic model to historic climate (1947–2003) for the western U.S. and showed a shift in the timing of AET to earlier in the year associated with warming trends in the past few decades as well as declines in the total AET in regions, like California Sierra, where summer precipitation is small. However, Leung et al.  
projected increases in AET for the Sacramento-San Joaquin basin in the California Sierra for future climate scenarios derived from the National Center for Atmospheric Research/Department of Energy Parallel Climate Model, even though precipitation for the basin was projected to decrease. Similarly, Null et al. used the hydrologic model WEAP21 to estimate increases in evapotranspiration for most large basins in the California Sierra under 2°C, 4°C, and 6°C temperature warming scenarios.
 Understanding whether forests are water or temperature limited and, consequently, the implications for climate sensitivity in mountain environments requires disentangling the multiple controls on energy and moisture availability. Elevation and aspect controls on PET, as well as topographic gradients in precipitation, are commonly incorporated into models of forest productivity and used to infer situations under which forests are likely to be water limited. To account for the impact of timing (and shifts from precipitation as rain or snow), assessments must go beyond annual time scale metrics such as Budyko curve approaches [e.g., Donohue et al., 2011; Jones et al., 2012]. Furthermore, other hydrologic processes that control local moisture availability must be considered. In particular, lateral moisture redistribution can lead to nonlocal moisture inputs and substantially change the balance between energy and moisture controls in areas of high lateral moisture convergence such as riparian areas and wet meadows [Thompson et al., 2011; Hwang et al., 2012].
 In this paper, we use a process-based hydrologic model to assess whether subannual time scale processes, specifically the accumulation and melt of snow, are a significant control on interannual variation in AET. We address this question at the local patch scale (90 m patch) and the lumped watershed (third-order stream) scale, under both historic interannual climate variation and a 3°C warming scenario. We apply our model to a case study watershed in the Central California Sierra that has a strong elevation gradient and substantial spatial variation in the timing of snowmelt. We use this case study watershed to quantify how complex topography and interannual variation in climate combine to influence the balance between moisture and energy controls and the sensitivity of this balance to the timing of recharge (as precipitation or snowmelt). Our modeling approach allows us to examine both the role played by topographically mediated spatial variation in energy and moisture forcing and topographically driven soil moisture redistribution. For this paper, we focus only on initial, short-term forest responses. Subsequent work will examine additional changes in AET fluxes associated with climate-driven changes in forest structure and composition, which may substantially alter long-term AET rates in the western U.S. [Tague and Dugger, 2010].
2.1 Modeling Approach
 The regional hydroecologic simulation system (RHESSys) [Tague and Band, 2004] is a physically based, spatially explicit model of coupled hydrology and biogeochemical cycling processes. Previous applications of RHESSys have shown that it can successfully model both hydrologic and carbon cycling behavior in the mountains of western North America and Europe [e.g., Mackay et al., 2003; Zierl et al., 2007; Tague and Grant, 2009]. As a spatial model, RHESSys partitions the watershed into patches, defined based on topography, vegetation, and soils. Within a watershed, hillslopes, draining to either side of stream tributaries, are defined based on topography and account for major aspect differences in radiation forcing. Patches within hillslopes allow lateral moisture redistribution. Tague and Band  presented a complete description of RHESSys algorithms; however, a brief summary is given here. Energy, wind, and water are attenuated through the aboveground canopy using standard approaches (e.g., Beer's law for radiation extinction as a function of vegetation leaf area index (LAI)). Snowmelt is estimated using a combination of an energy budget approach for radiation-driven melt with a temperature index-based approach for latent heat-driven melt processes. Evaporation and transpiration fluxes are computed for several vertical layers in RHESSys. For each vertical layer, we compute energy, wind, and water fluxes, their balance, and transfers. In this paper, we use a simple canopy model, combining overstory and understory canopy transpiration. However, we compute evaporation from intercepted canopy water and from litter and soil separately. All water fluxes, including evaporation of intercepted water in canopy, litter, and soil as well as transpiration, are computed using a Penman-Monteith approach [Monteith, 1965]. Aerodynamic conductance is varied as a function of wind speed following Heddeland and Lettenmaier . 
For soil and litter evaporation, surface conductances vary as a function of moisture content following Williams and Flanagan . For vegetation, stomatal conductance is computed using a Jarvis multiplicative model [Jarvis, 1976], accounting for radiation, vapor pressure deficit, rooting zone soil moisture, CO2, and temperature controls. We compute transpiration separately for sunlit and shaded leaves and scale these by respective sunlit and shaded leaf area based on Chen et al. .
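As a sketch of how a Jarvis-type multiplicative model combines environmental controls, the following fragment scales a maximum stomatal conductance by independent reduction factors. The functional forms and parameter values here are illustrative assumptions, not the ones used in RHESSys.

```python
def jarvis_conductance(gs_max, f_rad, f_vpd_val, f_soil, f_temp, f_co2=1.0):
    """Jarvis [1976] multiplicative model: each environmental factor
    independently scales the maximum stomatal conductance by a value in [0, 1]."""
    return gs_max * f_rad * f_vpd_val * f_soil * f_temp * f_co2

def f_vpd(vpd_pa, vpd_open=1000.0, vpd_close=4000.0):
    """Illustrative vapor pressure deficit reduction factor: a linear ramp,
    fully open below vpd_open (Pa) and fully closed above vpd_close.
    (The ramp form and thresholds are assumptions, not RHESSys's.)"""
    return min(1.0, max(0.0, (vpd_close - vpd_pa) / (vpd_close - vpd_open)))

# Example: moderate light, high VPD, moderately dry soil, no temperature or
# CO2 limitation (gs_max in m/s is a placeholder value).
gs = jarvis_conductance(gs_max=0.008, f_rad=0.9, f_vpd_val=f_vpd(2500.0),
                        f_soil=0.7, f_temp=1.0)
```

Because the factors multiply, any single strong limitation (a factor near zero) dominates the resulting conductance, which is the behavior this model family is chosen for.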
 RHESSys uses a four-layer model to simulate vertical soil moisture fluxes between four stores: surface detention, above and below rooting zone unsaturated and saturated stores, and a seasonal snowpack. Vertical fluxes include snowmelt, infiltration, drainage from the unsaturated (rooting zone and below rooting zone) layers, capillary rise, transpiration, and soil and litter layer evaporation. Lateral fluxes of saturated zone and surface water are functions of topography and soil hydraulic conductivity. Subsurface hydrologic flux representation in RHESSys depends on several key hydrologic parameters, including pore size index (po) and air entry pressure (pa), which, together with rooting depth, control soil moisture storage capacity. Vertical and lateral drainage rates are varied as a function of soil saturated hydraulic conductivity parameter (Ksat) and its decay with depth (m). Note these parameters will influence the timing and magnitude of lateral moisture redistribution. We also include a parameter to account for bypass to deeper groundwater stores (gw1) and a separate control on deeper groundwater drainage rates (gw2). Because effective soil drainage and storage parameters at watershed scales cannot be directly estimated from field measurements, we use calibration against observed streamflow to estimate these parameters.
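The role of the decay parameter m can be sketched as an exponential conductivity profile, a common TOPMODEL-style assumption; the exact functional form used in RHESSys may differ, so this is an illustration rather than the model's implementation.

```python
import math

def ksat_at_depth(ksat_surface, z, m):
    """Saturated hydraulic conductivity at depth z (m below the surface),
    decaying exponentially with e-folding length m. An illustrative
    TOPMODEL-style profile, not necessarily RHESSys's exact form."""
    return ksat_surface * math.exp(-z / m)

# At the surface the parameter value is recovered; at depth z = m,
# conductivity has fallen to 1/e of its surface value.
```

Under this form, a small m concentrates drainage near the surface (fast shallow lateral flow), while a large m keeps conductivity high at depth, which is why m and Ksat jointly shape both the timing and magnitude of lateral redistribution.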
 For our study site, we show that following calibration, the model captures key components of the streamflow hydrograph, including interannual and seasonal patterns. Details of the study site and calibration are provided below. We acknowledge that reasonable model estimates of streamflow do not fully constrain model estimates of spatial patterns of forest water use. Given this, we consider model estimates of forest water use patterns as our “best available” estimates given a set of physically realistic assumptions. Model estimates of forest water use-climate interactions illustrate the implications of current assumptions about (a) spatial patterns of energy and moisture availability in complex topography and (b) forest water use responses to available moisture and energy (using a combination of Penman-Monteith estimates and a Jarvis model of stomatal conductance). Model estimates conserve both moisture and energy at a daily time step and at both patch and aggregate watershed spatial scales.
 In hydrologic science, shifts between water and energy limitations on AET are often characterized using the Budyko curve [Budyko, 1984], which plots the ratio of actual to potential evapotranspiration (AET/PET) against the ratio of actual evapotranspiration to precipitation (AET/P) [Chen et al., 1999]. Budyko curves and other similar annual-scale metrics typically focus on long-term average total annual energy (PET) and moisture (P) drivers [Zhang et al., 2008; Donohue et al., 2010; Greenwood et al., 2011]. Year-to-year variation in the Budyko relationship, however, can provide insight into the degree to which water or energy tends to dominate over time [Yang et al., 2007; Potter and Zhang, 2009]. To explore the implications of changes in snowmelt timing for AET in the context of energy versus water availability, we consider both interannual variation in Budyko relationships and interannual variation in AET/P. To estimate AET and PET, we use RHESSys at two scales: a patch (90 m) and an aggregate watershed scale. We compute the annual AET and PET for 40 years of historic climate variation for our study site. We examine interannual as well as spatial variation in AET (at the patch scale) given historic climate and relate ET differences to energy (PET) and moisture (precipitation and SWE (snow-water equivalent)) controls. We also investigate the role played by lateral moisture redistribution (the routing of water between patches) by running the model with and without this redistribution. To investigate how warmer temperatures might alter these relationships, we repeat our analysis using a 3°C temperature increase applied uniformly across the historic daily climate record for water years 1960 to 2000.
While future climate is likely to include changes in both precipitation and temperature, we focus here on the impact of temperature change in isolation and use historic interannual variation in precipitation to examine the impact of changes in temperature along a continuum of dry to wet years. Recent global climate model estimates for the western U.S. have shown temperature increases of between 1.5°C and 4.5°C over the next 50 years [Cayan et al., 2008]. Thus, we use 3°C as a moderate representation of the expected temperature increase for our analysis. Temperature increases can affect RHESSys ecohydrologic estimates both directly, through temperature controls on snow accumulation, snowmelt, and stomatal conductance, and indirectly, through increases in vapor pressure deficit, which also influences both stomatal conductance and AET flux estimates. Interactions among spring snowmelt, evapotranspiration, and soil moisture recharge will in turn have consequences for late summer water availability and AET.
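The annual indices used in this framework can be computed directly from modeled fluxes; a minimal sketch, with the classification threshold an illustrative assumption rather than a value from the study:

```python
def budyko_indices(aet, pet, precip):
    """Annual Budyko-style indices as used in the text: AET/PET (how closely
    actual ET approaches its energy-driven potential) and AET/P (the fraction
    of precipitation returned to the atmosphere)."""
    return aet / pet, aet / precip

def limitation(aet, pet, threshold=0.95):
    """Crude annual classification (the 0.95 threshold is an illustrative
    assumption): if AET is near PET the year is energy limited, otherwise
    it is water limited."""
    return "energy-limited" if aet / pet >= threshold else "water-limited"

# Example with round numbers (mm/yr): AET = 600, PET = 800, P = 900.
```

Computing these indices per water year, rather than from long-term means, is what allows a single watershed to plot at different positions on the Budyko diagram in different years.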
2.2 Study Site
 Sagehen Creek is a 26 km² watershed located approximately 30 km north of Lake Tahoe in the central Sierra Nevada (Figure 1). Elevations range from 1800 to 2650 m. Mean annual precipitation is 880 mm, with most of the precipitation falling as snow between November and April. Vegetation is dominated by mixed conifer forest (predominantly Lodgepole pine and Jeffrey pine at lower elevations and in riparian zones; White fir and Jeffrey pine at middle elevations; and Red fir, White pine, and Mountain hemlock at higher elevations). Numerous meadows are located along riparian areas along the lower reaches of Sagehen Creek as well as in various spring-fed locations at higher elevations. Watershed soils are predominantly Andic and Ultic Haploxeralfs and are underlain by Miocene andesite, rhyolitic ash, and mudflow deposits, which in turn overlie the granites and diorites of the Sierra Nevada batholith.
 Required climate inputs to RHESSys are daily minimum/maximum temperatures and precipitation. For this study, we use daily data from the Sagehen East Meadow weather station (elevation 1932 m), available through the U.S. National Climate Data Center. Both temperature and precipitation are distributed spatially in RHESSys using elevation-based scalars. For temperature scaling, we derive minimum and maximum temperature lapse rates by comparing temperatures at the East Meadow Station with those at the Independence Lake meteorology station located near the upper boundary of the watershed (2545 m). For daily minimum and maximum temperatures, we use lapse rates of 3°C/km and 6°C/km, respectively. For precipitation, we scale Sagehen East Meadow weather station data as follows:
Pi = Pb [1 + k (ei − eb)], where Pi and ei are precipitation (mm/d) and elevation (m) at location i in the watershed, respectively, and Pb and eb are precipitation (mm/d) and elevation (m) at Sagehen East Meadow. k is manually calibrated to 0.0001 using streamflow and observed precipitation at Independence Lake.
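A sketch of how a base station record could be distributed with these lapse rates and the elevation-based precipitation scaling; the linear precipitation form and the example elevations are assumptions for illustration only.

```python
def distribute_climate(tmin_b, tmax_b, p_b, e_i, e_b=1932.0,
                       lapse_tmin=3.0, lapse_tmax=6.0, k=0.0001):
    """Distribute base station daily climate to elevation e_i (m) using the
    lapse rates and elevation-based precipitation scaling described in the
    text. Temperatures cool by the lapse rate (°C/km) with elevation gain;
    precipitation is scaled linearly with the elevation difference (the
    linear form is an assumption consistent with the calibrated k)."""
    dz_km = (e_i - e_b) / 1000.0
    tmin_i = tmin_b - lapse_tmin * dz_km
    tmax_i = tmax_b - lapse_tmax * dz_km
    p_i = p_b * (1.0 + k * (e_i - e_b))
    return tmin_i, tmax_i, p_i

# Example: a point 600 m above the East Meadow station (illustrative values).
tmin_i, tmax_i, p_i = distribute_climate(-5.0, 8.0, 10.0, e_i=2532.0)
```

With k = 0.0001, a 600 m elevation gain increases precipitation by 6%, while daily minimum and maximum temperatures drop by 1.8°C and 3.6°C, respectively.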
 Vegetation in RHESSys is initialized using a combination of a U.S. Forest Service dominant vegetation species map and LAI, derived from Thematic Mapper remote sensing data for August 1995, following the approach described in White et al. . Standard allometric relationships are used to derive vegetation carbon stores, including root carbon and root depth. The species-type map is used to assign ecophysiological parameters, as described by Tague and Band . The species-specific parameter values are based on a detailed literature survey done by White et al. .
2.3 Model Calibration and Performance
 Streamflow data for calibration are taken from the U.S. Geological Survey-maintained gauge (#10343500) at the outlet of the watershed. Calibration varies the RHESSys soil drainage parameters, following approaches used in Tague et al. In addition to varying the six soil parameters (m, Ksat, pa, po, gw1, and gw2) described above, we also include a multiplier on rooting depth and vary its value between 0.8 and 1.3 to account for uncertainty in rooting depth, which is likely to be a key control on forest water use. The initial estimates of rooting depth are based on RHESSys dynamic simulation of forest carbon stores. We use water years 1965 to 1985 for calibration and 1960 to 2000 for subsequent simulations. Table 1 summarizes the performance of the model for water years 1960–2000 for all parameter sets that were considered acceptable. We use multiple performance measures to evaluate model streamflow performance, including measures that demonstrate reasonable performance at interannual, seasonal, and daily time scales. We argue that model performance is adequate for the analysis presented here and similar to or better than streamflow performance in other hydrologic modeling studies in the Sierra [e.g., Hamlet et al., 2005; Null et al., 2010]. We note that if the model is run using the 3°C warming scenario, performance, measured as the Nash-Sutcliffe efficiency (NSE) of log-transformed daily flow, drops to 0.2 from 0.7, suggesting that the effect of climate variation is large relative to model errors. Similarly, Figure 2 shows that model estimates of average streamflow by day of year over the simulation period follow a similar pattern to observed flows, and errors are small relative to the change in this seasonal hydrograph associated with the 3°C warming scenario.
Calibration against streamflow, however, does not completely resolve error in soil parameter estimates [Beven and Freer, 2001], and adjustment of soil parameters may, in some cases, compensate for errors in other parameters, including precipitation and temperature forcing data [Garcia et al., 2013]. To examine the implications of this parameter uncertainty, we repeat the analysis of relationships between AET and other variables, including precipitation, snowpack, and temperature warming (discussed in more detail below), for a range of acceptable soil and rooting depth parameter sets. Acceptable parameters are those that achieve an NSE above 0.65 for both daily and log-transformed daily flows over the full calibration period and a bias in mean annual flow of less than 15%. Our spatial analysis requires estimates of ecohydrologic fluxes for multiple points in time and space. Evaporation and transpiration are computed for approximately 70,000 time-space points for several different scenarios. Given this large number of model outputs, computational limitations preclude the use of approaches that repeat simulations across soil parameter uncertainty. For spatial analysis, however, we use a single parameter set. Table 1 summarizes statistics of acceptable parameter sets and the set chosen for spatial analysis. We select the set that gives basin-scale estimates of mean annual AET and AET relationships with forcing variables that are closest to the average results across all acceptable parameters. We use Euclidean distance to define closeness.
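The performance measures reported in Table 1 can be computed from paired observed and simulated flow series; a minimal sketch, assuming strictly positive flows for the log transform:

```python
import math

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 minus model error variance over observed
    variance; 1.0 is a perfect fit, 0.0 matches the observed mean."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    sst = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / sst

def log_nse(obs, sim):
    """NSE on log-transformed flows, emphasizing low-flow performance."""
    return nse([math.log(o) for o in obs], [math.log(s) for s in sim])

def bias(obs, sim):
    """(mean(model) - mean(observed)) / mean(observed), as defined in Table 1."""
    mean_obs, mean_sim = sum(obs) / len(obs), sum(sim) / len(sim)
    return (mean_sim - mean_obs) / mean_obs

def day_of_center_of_mass(daily_flow):
    """Flow-weighted mean day (1-indexed) of a water year of daily flows
    (the DofCM timing metric in Table 1)."""
    total = sum(daily_flow)
    return sum((d + 1) * q for d, q in enumerate(daily_flow)) / total
```

The complementary emphasis of these metrics is the point: daily NSE rewards peak-flow fit, log NSE rewards recession and low-flow fit, and DofCM isolates timing independent of magnitude.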
Table 1. Performance Metrics for RHESSys Streamflow Estimates for Water Years 1960–2000a
Metrics: NSE (log) daily; Pearson R2 annual flow; bias annual flow; Pearson R2 DofCM; Pearson R2 Min7d. Values are reported for the selected parameter set and for the set of acceptable parameters.
aNSE is the Nash-Sutcliffe efficiency, and NSE (log) is the NSE for log-transformed flows. R2 is the Pearson correlation R2. Bias is computed as (mean(model) − mean(observed))/mean(observed). Metrics are computed for daily, monthly, and yearly temporal aggregation. DofCM is the day of center of mass of flow and reflects the ability of the model to capture year-to-year variation in flow timing. Min7d is the annual minimum 7 day flow. The first value shows the measures for the parameter set selected for spatial analysis, and the second value shows the uncertainty bounds for the set of acceptable parameters.
3.1 Watershed-Scale Interannual Variation in ET
 Plotting the mean annual AET, PET, and precipitation (P) fluxes for the Sagehen Creek watershed on the standard Budyko curve suggests that this watershed falls near the transition between water- and energy-limited watersheds (Figure 3a). Interannual variation in these fluxes, however, leads to a wide range of water- to energy-limited conditions (Figure 3a). Estimated AET versus precipitation shows a similar pattern and demonstrates that high interannual variation in precipitation explains much of the variation in forest water use (Figure 3b). We note that transpiration is typically more than 80% of annual AET. Thus, while for analysis in this paper we report AET, results largely reflect forest water use. Watershed-scale AET increases relatively linearly with precipitation until a threshold precipitation of approximately 1200 mm/yr is reached (Figure 3b). While years with precipitation above 1200 mm show a less direct water limitation effect, we note that even in wet years, AET is typically only 80% of PET, suggesting some degree of water limitation.
 For a given annual precipitation input, different years can show substantially different AET estimates. For example, 6 years have annual precipitation between 750 mm and 800 mm (Figure 3b), with corresponding AET ranging from 500 to 680 mm. This difference in AET is 23% of the 40 year range in AET. We note that two of these years, 1968 and 1961, also appear as deviations in the Budyko curve formulation (Figure 3a). In this case, departures from the mean AET versus P relationship as well as the Budyko curve reflect differences in the timing of water inputs. If more precipitation falls as snow, the timing of recharge occurs later in the year, when forest water demand is higher. Figure 4 shows seasonal patterns of AET and snowpack for 1961 and 1968, the years with similar precipitation inputs, and demonstrates how earlier timing of recharge leads to greater summer moisture stress. In 1961, only 49% of precipitation fell as snow, and peak annual SWE was 135 mm, in contrast to 1968, when 65% of precipitation fell as snow and peak SWE was 245 mm. However, while greater peak SWE is generally correlated with higher AET, this is not always the case. In wetter and warmer years, relatively low peak SWE can still coincide with relatively high AET. Annual AET in 1984, for example, was high relative to other years with similar total annual precipitation (AET of 831 mm in 1984 versus 720 mm in 1993). Peak SWE in 1984, however, was lower than in other years with similar precipitation (for example, peak SWE was 454 mm in 1984 versus 714 mm in 1993). In 1984, a warmer spring with sufficient water inputs resulted in higher AET early in the season. Significant summer precipitation further increased AET in 1984.
 These results emphasize that while annual precipitation is the dominant control on interannual variation in AET, energy availability, temperature limitations, and interannual variation in the timing of water inputs also lead to substantial departures from a long-term average AET versus P curve. Some of the variation in the AET versus P relationship reflects interannual variation in the timing of recharge and the accumulation and melt of snow. However, the effect of interannual variation in snowmelt timing on annual AET is a complex one and varies with the magnitude of water inputs, spring and summer precipitation, and patterns of spring energy availability. A linear regression of annual AET against P shows that annual precipitation is a strong predictor of interannual variation in mean basin AET (Table 2). The significance of precipitation as a control on interannual variation in AET is robust across uncertainty in soil drainage and rooting depth parameters, although the strength of the relationship (or sensitivity to precipitation variation) changes with assumptions about soil parameters (Figures 5a and 5b). In general, simulations using parameters associated with lower soil water holding capacity and shallower rooting depths show greater sensitivity to interannual variation in precipitation.
Table 2. Linear Least Squares Regression of AET (and Change in AET) Against P, Peak SWE, and Timing of Recharge for Basin and Patch Scalesa
For all regressions, all variables are normalized deviations from mean values. Results are shown for simulations using the parameter set selected for spatial analysis. Figure 5 shows variation in slopes at the basin scale across soil parameters.
 Using peak SWE alone is a slightly poorer predictor, relative to precipitation, of interannual variation in AET (Table 2). Peak SWE, however, is significant as a secondary explanatory variable after the effect of annual P is accounted for. Thus, interannual variation in AET reflects, first, variation in the magnitude of water input and, second, variation in the timing of water inputs or how much of this water falls as snow and accumulates to form a seasonal snowpack. We note that combining peak SWE with P increases the R2 in estimating interannual variation in AET on average from 0.65 to 0.75, relative to regressions using P alone. The increase in R2 occurs for all acceptable soil parameters. In general, there is a statistically significant and negative interaction effect between peak SWE and P, such that the increase in AET associated with greater peak SWE diminishes for years with greater P.
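The regression structure described here, AET against P, peak SWE, and their interaction, can be sketched with ordinary least squares. The data below are synthetic and only mimic the qualitative pattern reported above (positive P and SWE effects, negative interaction); the coefficients are invented for illustration.

```python
import numpy as np

def fit_aet_regression(p, swe, aet):
    """Ordinary least squares fit of AET ~ b0 + b1*P + b2*SWE + b3*(P*SWE),
    with all variables expressed as normalized deviations from their means
    (as in Table 2). Returns the coefficients [b0, b1, b2, b3]."""
    X = np.column_stack([np.ones_like(p), p, swe, p * swe])
    coef, *_ = np.linalg.lstsq(X, aet, rcond=None)
    return coef

# Synthetic illustration: AET rises with P and with peak SWE, and the SWE
# benefit shrinks in wet years (negative interaction).
rng = np.random.default_rng(0)
p = rng.normal(size=200)
swe = 0.8 * p + 0.3 * rng.normal(size=200)  # SWE strongly correlated with P
aet = 0.7 * p + 0.2 * swe - 0.1 * p * swe + 0.05 * rng.normal(size=200)
b0, b1, b2, b3 = fit_aet_regression(p, swe, aet)
```

Because SWE is constructed to be highly correlated with P, as it is in the study watershed, the SWE coefficient captures only the timing signal left over after the precipitation magnitude is accounted for.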
3.2 Patch-Scale Interannual Variation in ET Due to Local Moisture/Energy Forcing
 At the watershed scale, substantial departures from the watershed-scale Budyko or AET-P curves occurred only in a few years (e.g., 1974). Initially, we examine patch-level simulations where there was no lateral redistribution of water. These scenarios allow us to directly examine the differences in AET due to elevational gradients in temperature, precipitation, and topographic patterns of radiative forcing. The scatter around the AET-P curve is much greater at local (90 m) patch scales (Figures 6a and 6b) relative to basin-scale analysis. For all elevations, AET versus P relationships include both primarily energy-limited and primarily water-limited years under historic climate forcing (Figure 6a). Thus, even wetter, colder high elevations have years with significant water limitation (lower AET), and the drier, warmer low elevations show years with reduced water limitation (higher AET). The greatest variation in AET for a given precipitation occurs for wetter years with precipitation greater than 1200 mm. As with the basin-scale analysis, we use peak SWE, while accounting for total precipitation, as a proxy for the effect of timing of recharge. At the patch scale, the combination of interannual and cross-patch variation in AET is significantly related to variation in precipitation. Including peak SWE in addition to total precipitation to estimate variation in AET increases the R2 of the linear regression from 0.56 to 0.64 (Table 2). There is also an interaction effect at this scale, such that the effect of greater peak SWE diminishes in wetter years.
 Figure 6a also shows that relatively high elevations (2300–2500 m) not only tend to receive greater precipitation but also transpire a greater proportion of this input. These elevations also support a higher leaf area index (LAI), sustained by greater long-term mean water inputs, leading to greater mean AET. The highest elevations (>2500 m) show some reduction in AET for a given precipitation amount, reflecting a greater degree of temperature/energy limitation. AET patterns in drier years show declines across all elevations and suggest that even high elevations are sometimes water limited. Despite a general pattern of water limitation, there are frequent examples (not shown) where lower elevations have greater AET than higher elevations given similar annual precipitation inputs. As with the basin-wide characterization, the importance of precipitation timing results in a complex relationship between elevation, precipitation, and annual AET.
3.3 Impact of 3°C Temperature Warming on ET Patterns at Patch and Watershed Scales
 A 3°C warming substantially alters snow accumulation and melt, reducing mean watershed peak SWE from 370 to 128 mm and shifting the average day of complete snowmelt approximately 1 month earlier (Figure 7). At the basin scale, using our selected soil parameter set, a 3°C warming results in a mean increase in AET of only 11 mm (<2% of the mean annual AET of 641 mm), although there are years where AET increases by as much as 82 mm (11%) or decreases by as much as 32 mm (7%). Basin-wide average changes in AET with 3°C warming are similar across soil parameter uncertainty, with a slight decrease of 22 mm (3% of mean annual AET) when averaged across all soil parameters. For specific years and soil parameters, increases in AET as high as 35% or decreases as high as 40% occur. Changes in AET aggregated to the watershed scale have implications for the total water balance and streamflow. For the 3°C warming scenario, the associated mean increase in annual watershed streamflow is less than 3%. However, there are years where higher AET rates lead to declines in streamflow of as much as 30% or lower AET rates lead to increases in streamflow of as much as 20%.
 We now consider patch-scale heterogeneity in annual AET responses given a uniform 3°C warming applied to all years. Initially, we examine results without lateral redistribution. Conceptually, we might expect that colder, higher elevations would show greater temperature limitation and thus increased AET with warming, while warmer, lower elevations would show greater sensitivity to shifts in the timing of recharge. Figure 6b shows the distribution over all patches of the change in annual AET, for each of the 40 simulation years, given the 3°C increase in temperature. Estimates of changes in AET with warming across the distribution of 90 m patches show substantial variability, with changes in AET ranging from decreases of 200 mm/yr to increases of 50 mm/yr, although most years are within ±50 mm/yr of previous values (Figure 6b). The estimated change in AET with warming does not vary in consistent ways with elevation (colors in Figure 6b) or precipitation. In general, however, wetter years and higher elevations show the greatest declines in AET and greater variation from year to year. We note that variation (year-to-year or spatial) in changes in AET with warming is greater once a threshold precipitation of approximately 1400 mm/yr is reached. This threshold is similar to the threshold precipitation at which AET/PET ratios no longer respond linearly to increasing precipitation. Thus, wetter years (>1400 mm/yr) generally show greater declines in AET than drier years.
3.4 The Impact of Lateral Moisture Redistribution on ET Fluxes at Watershed and Patch Scales
 Figures 6a and 6b assume no lateral moisture redistribution. Including lateral redistribution of moisture alters AET under historic climate and the impact of warming on AET, at both basin and patch scales (Figures 3a, 8a, and 8b). At the basin scale, including redistribution increases mean basin AET by 60 mm/yr, or about 10%. Thus, redistribution of excess water leads to more water being lost as evapotranspiration and potentially used to support vegetation productivity. Lateral redistribution also alters the pattern of AET sensitivity to precipitation (Figure 6a versus Figure 8a). Elevation effects are more pronounced when lateral redistribution is included. AET rates are generally higher across all elevations when lateral redistribution is included, but lower elevations disproportionately benefit, both because they receive greater lateral inputs and because their earlier onset of spring allows them to use more of this additional water.
 Changes in AET with the 3°C temperature scenario are similar for model runs that include lateral redistribution and those that do not (Figure 9), with most years showing changes of less than 10%. At the patch scale, however, there is greater spatial variation in responses to warming when lateral redistribution is included (Figure 9). The standard deviation, across all patches and years, of the change in AET with the +3°C scenario increases from 37 to 54 mm/yr when lateral redistribution is included. Lateral redistribution increases the net water input to many patches, and thus, we would expect increased AET for many patches. Average patch- or basin-scale AET does increase with lateral redistribution, as noted above. It might be expected that this lateral subsidy would also lead to a greater likelihood of increases in AET with warming, since greater water inputs (for receiving patches) can support the increased demand with warming. Patches receiving lateral subsidy might also be less sensitive to shifts in the timing of snowmelt because of this additional water. Greater gains in AET with climate warming do occur when lateral redistribution is included, for about 21% of all patches in all years (Figures 6b and 8b). However, water received as lateral subsidy is sensitive not only to local conditions but also to upslope conditions. Greater shifts in the timing of snowmelt can occur at higher elevations, and lateral redistribution can transfer this effect to lower elevations, which also tend to receive more lateral inputs. Results for the 3°C warming scenario thus show some instances (about 15% of all individual patches in all years) of greater declines in AET when redistribution is included (Figures 6b and 8b).
 Lateral redistribution also generally increases spatial variation in AET for a given annual precipitation, under both historic and climate warming scenarios (Figure 10). Under historic climate, the coefficient of variation (CV) of AET across patches tends to decline in wetter years for model runs both with and without lateral redistribution; the lower CV in wetter years reflects conditions under which the entire watershed is no longer water limited, so there is less spatial variation in AET. Under the 3°C warming scenario, however, simulations without lateral redistribution also show a decline in CV in drier years, reflecting conditions under which the entire watershed tends to become water stressed (again reducing spatial variation in AET). Notably, this decline in CV in drier years occurs neither under historic climate nor under the 3°C warming scenario with lateral redistribution: in both of these cases, the model maintains a relatively high spatial variation in AET in the driest years.
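The CV diagnostic used above can be sketched as follows. This is an illustrative computation on a hypothetical (years × patches) array of annual AET, with synthetic values constructed so that spatial spread shrinks in wetter years; the array dimensions and numbers are assumptions, not the study's output.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical patch-scale AET output: rows are years, columns are 90 m patches.
n_years, n_patches = 40, 500
annual_p = rng.uniform(500.0, 2000.0, n_years)  # basin precipitation (mm)

# Spatial spread in AET is constructed to shrink in wetter years,
# mimicking reduced water limitation (illustrative only).
spread = 120.0 * (2000.0 - annual_p) / 1500.0 + 20.0
patch_aet = annual_p[:, None] * 0.5 + rng.normal(0.0, spread[:, None], (n_years, n_patches))

# Coefficient of variation of AET across patches, one value per year.
cv = patch_aet.std(axis=1) / patch_aet.mean(axis=1)

# Compare spatial variability in drier versus wetter years.
dry = annual_p < np.median(annual_p)
print(f"mean CV across patches, dry years: {cv[dry].mean():.3f}")
print(f"mean CV across patches, wet years: {cv[~dry].mean():.3f}")
```

With this construction, the per-year CV series directly reproduces the qualitative pattern discussed in the text: higher spatial variation in AET in drier years, declining as the watershed as a whole escapes water limitation.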
 The annual magnitude of lateral inputs does not change substantially with warming (mean annual lateral input for patches that are net receivers of water decreases to 544 mm from 564 mm under the 3°C warming scenario), and there is no statistically significant relationship between change in the magnitude of lateral input and change in AET with warming. Thus, in many instances, greater declines (or increases) in AET with warming given lateral redistribution reflect a shift in the timing of lateral redistribution from upslope contributing patches to earlier in the year, when downslope receiving forests are less able to make use of this water.
 Short-term vegetation responses to warming, in the absence of changes in vegetation structure, reflect the balance between the increased atmospheric demand for water and either increases or decreases in water availability. This work summarizes estimates from a physically based model that captures relatively well-understood water balance and energy controls on AET. As a model, it reflects the implications of current understanding of how these controls may influence spatial-temporal patterns of vegetation water use in snow-dominated systems given climate variability and change. Actual ecosystem responses are likely to be more complex. Fine-scale heterogeneity in soil drainage characteristics, microclimates, vegetation structure, and species differences that are not included in the model may be a dominant influence in some locations. Results here also focus on short-term responses and do not account for longer-term changes in forest composition and structure. There is also substantial uncertainty in model estimates due to potential errors in representing the spatial pattern of climate inputs as well as soil and vegetation parameters. We use calibration against streamflow to reduce some of this uncertainty and ensure that model estimates of vegetation leaf area index and net photosynthesis are within literature ranges for this region. At the basin scale, we evaluate the sensitivity of our results to soil and rooting depth parameter uncertainty and find that while the magnitudes of certain results are sensitive to this uncertainty, the overall conclusions are not. Thus, we argue that these model results provide a template for examining what may be the typical range of responses in snow-dominated mountain headwater systems, given well-known energy and moisture controls on AET.
 Results presented here point to general relationships that have implications beyond the Sagehen Creek study watershed and reflect patterns that are likely to occur over a wide range of snow-dominated environments. We note that this study watershed comprises locations that span a range of temperatures and precipitation inputs and includes the elevation range identified by Trujillo et al. as showing strong vegetation responses to seasonal peak snow accumulation in the California Sierra. Our results suggest that greater vegetation greenness with greater snow accumulation reflects both greater precipitation inputs in high snow years and the effect of later recharge timing associated with larger snowpacks. Model estimates, for all patches between 1800 and 2700 m elevation, show that within-year variation in the timing of water input can be an important driver of interannual variation in AET under historic climate variability, but we emphasize that the effect of timing is clearly secondary to interannual variation in precipitation. In general, historically drier years show greater sensitivity to timing (Table 2).
 Climate warming is likely to alter the temporal synchronicity between atmospheric water demand and water availability, both by shifting the spring onset of higher AET demand earlier in the year and by shifting the timing of snowmelt or the partitioning of inputs between rain and snow. Although the magnitude of water input dominates AET variation, it is the timing of water input that may be most likely to change with warming. Climate model estimates for the western U.S. are highly uncertain in their projections of precipitation changes, with model estimates including both increases and decreases. Projected increases or decreases in precipitation, however, are typically less than 10% [Maurer, 2007; Cayan et al., 2008]. Trend analysis of the observed meteorological data from Sagehen shows a statistically significant increase in average annual temperature of approximately 0.4°C/decade since the 1970s but no change in annual precipitation. Model estimates of mean basin day of melt suggest that this change in temperature has led to a small (4.5 days per decade) shift in the timing of melt.
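A linear trend analysis of this kind can be sketched as follows. The temperature record here is synthetic, with an imposed trend of 0.04°C/yr chosen to mimic the ~0.4°C/decade reported for Sagehen; the record length, baseline, and noise level are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative annual mean temperature record (deg C), 1970-2009 (synthetic).
years = np.arange(1970, 2010)
temp = 6.0 + 0.04 * (years - 1970) + rng.normal(0.0, 0.5, years.size)

# Ordinary least squares trend: slope in deg C/yr, scaled to deg C/decade.
slope, intercept = np.polyfit(years, temp, 1)
print(f"estimated trend: {slope * 10:.2f} deg C per decade")
```

In practice, a significance test on the slope (e.g., a t test on the regression coefficient) would accompany the fit, as implied by the "statistically significant" trend reported in the text.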
 Our model results show that whether a warming scenario leads to a noticeable decline or increase in AET depends on the a priori timing of snowmelt, the amount of water input, vapor pressure deficit, and precipitation events during spring and summer, as well as the extent of lateral moisture inputs. Our results show that changes in timing can be significant and reflect the importance of temporal synchronicity between vegetation processes and moisture availability. Evaluation of changes in AET with warming based on annual time scale metrics, such as the Budyko curve, needs to be adjusted to account for the importance of this intraannual synchronicity. Other studies have used remote sensing of vegetation [Donohue et al., 2010] or estimates of mean storm size [Donohue et al., 2012] to adjust the Budyko curve to account for within-year processes in semiarid non-snow-dominated regions. To capture the impact of snowmelt timing, models must use time scales that resolve seasonal patterns of recharge. Capturing AET dynamics also requires representation of within-watershed spatial patterns of recharge, temperature and radiation forcing, and lateral moisture redistribution. Assessing how changes in snowmelt timing alter AET is complicated in mountain environments by topographically driven variation in snow accumulation and melt as well as spatial patterns of temperature and vapor pressure controls. The sensitivity of snow accumulation and melt to warming is also highly spatially variable, with areas of snow at risk typically those where winter and spring air temperatures are currently close to 0°C [Nolin and Daly, 2006]. These spatial complexities emphasize the importance of multiple-scale analysis. Lumped watershed- or coarse-scale distributed assessments of climate impacts on vegetation are common [Boisvenue and Running, 2006; Subin et al., 2011].
Given the topographic complexity and substantial year-to-year variation in precipitation input, finer spatial-scale analysis, such as the 90 m patch-scale analysis done here, can identify areas with a substantially greater potential for increased water stress.
 Our model-based analysis also highlights the importance of considering lateral moisture subsidies in these mountain environments. The model estimated approximately 10% greater AET when lateral moisture redistribution was included (note that calibrated results include lateral moisture redistribution). This effect of moisture redistribution is consistent with field-based observations suggesting that riparian vegetation can account for a disproportionate 20%–30% of total watershed ET in a semiarid region of New Mexico [Dahm et al., 2002]. A range of studies show how spatial patterns of vegetation structure and function often reflect lateral subsidies even in sporadically water-limited environments [Thompson et al., 2011; Hwang et al., 2012]. Our analysis here demonstrates that lateral redistribution influences not only average basin- and patch-scale AET but also patterns of response to climate warming. In snow-dominated systems, areas with significant lateral inputs may show differential (and sometimes greater) responses to warming due to (a) greater moisture status and (b) sensitivity to changes in snowmelt in upslope areas. We note that many hydrologic models [Day, 2009; Young et al., 2009; Maurer et al., 2010] and ecosystem models (e.g., see review by Randerson et al.) of climate change impacts do not account for within-watershed lateral redistribution of moisture.
 The greatest changes in AET with our 3°C warming scenario were associated with shifts in timing due to changes in peak SWE in relatively wet years and locations. In these scenarios, snowmelt was a significant but vulnerable water input, either as a local water input or as a water subsidy from upslope areas. The greater sensitivity of wet years/locations to 3°C warming-driven changes in peak SWE contrasts with the historic watershed-scale climate response, where the effects of interannual variation in peak SWE were greatest in dry years. This distinction reflects the greater changes in snowmelt and recharge timing with warming in years or places with substantial amounts of snow. Areas with substantial lateral inputs can show even greater sensitivity to warming for two reasons. First, because lateral subsidy responds to changes in the timing of snowmelt over a larger elevation range, areas with lateral subsidy are more likely to have some contributing areas with snow at risk in any given year. Second, additional water inputs will support increased water loss with warmer spring temperatures and higher vapor pressure deficits.
 In general, our results point to a threshold precipitation above which variation in the timing of water inputs can substantially alter annual vegetation water use. Furthermore, this threshold reflects interactions between the frequency of historic water limitation and the sensitivity of recharge (as snowmelt) to temperature change. There is also an upper precipitation and elevation threshold above which systems are temperature rather than water limited, and earlier snowmelt actually leads to greater AET. In this case, peak water availability shifts toward the period of increased evaporative demand, resulting in greater AET. Our results also parallel a commonly observed pattern of maximum heterogeneity in transpiration and AET at intermediate soil wetness conditions [Albertson and Montaldo, 2003; Tromp-van Meerveld and McDonnell, 2006]. We argue that a similar perspective is needed when examining how changes in snow-water inputs will influence vegetation water use, particularly in regions, such as the mountain western U.S., that shift between water- and temperature-limited conditions. In our Sagehen Creek example, many locations shift between water and temperature limitation from year to year. In water-limited years, timing matters; however, in very water limited years, plants are already water stressed, and small shifts in the timing of a small amount of water are unlikely to make a substantial difference. In very wet years/locations, even large shifts in timing are less important because soils are fully saturated at the beginning of the growing season, and there is sufficient water to maintain ET late into the summer. The greatest responses to warming may therefore occur under intermediate wetness conditions, where a significant amount of water is stored in snow, that snow is vulnerable to warming in a given year, and the amount of water is not sufficient to fully recharge the soil. 
We emphasize that the most sensitive locations are not necessarily the driest locations in the watershed but rather are locations that receive substantial snowmelt inputs, either directly or indirectly through lateral inputs. Monitoring of these locations and climate change assessments must account for these finer temporal and spatial scales to identify forest vulnerability.
 In many of these systems, increases in AET generally correlate with increases in productivity, and vice versa [Goulden et al., 2012]. Our results highlight that under a warming climate, increased forest water stress may occur within watersheds that as a whole are temperature limited. Reduced snow accumulation and earlier melt can reduce effective soil moisture recharge and ultimately summer water availability and annual AET, even without precipitation change. Our simulation estimates show that whether forest water availability and use increase or decrease with warming is likely to vary substantially from year to year across a wide range of elevations. Large declines in AET for isolated years and locations, associated with shifts in timing, may have important implications for drought stress vulnerability. Substantial declines in productivity or drought stress-related mortality in these years could alter ecosystem structure, with implications for ecosystem health, productivity, and disturbance risk [Swetnam and Betancourt, 2010; McDowell, 2011]. We also note that forest growth dynamics can lead to time lags and feedbacks in system responses. Wetter years may lead to increased growth and, thus, a greater capacity for increased AET. Drier years may conversely lead to declines in growth, which reduce capacity for water use in subsequently dry years. Disturbance regimes (fire and insects) can also respond to and significantly alter water demand [Westerling et al., 2006; Bentz et al., 2010]. Next steps will incorporate these ecosystem dynamics into analysis similar to that presented here.
 We gratefully acknowledge support from the U.S. Geological Survey–funded Western Mountain Initiative and the National Science Foundation Sierra Critical Zone Observatory. We also thank the Sagehen Creek Field Station of University of California at Berkeley for data and support.