Corresponding author: H. Ó. Andradóttir, Faculty of Civil and Environmental Engineering, School of Engineering and Sciences, University of Iceland, Hjardarhagi 2-6, IS107 Reykjavík, Iceland. (email@example.com)
 The average length of time land-borne compounds remain within an aquatic system is one of the key parameters controlling its biochemical processes. This study explores the magnitude and sources of daily, seasonal, and interannual variability of river water residence time in the Sau Reservoir, a prototypical example of a Mediterranean water supply. Daily estimates of residence time from 1998 to 2005 were obtained from a series of tracer experiments simulated with a one-dimensional physical model, based on actual observations and synthetic scenarios. Results highlight that multiple in situ factors, both natural and managed by humans, affect the residence time in reservoirs. Simulated residence times varied on average ∼30% on a daily basis, as a result of natural meteorological (17%) and river inflow temperature (11%) variability. The management of withdrawal depths largely controlled the seasonal variations: The practice of withdrawing water from the shallowest outlet, which was close to the intrusion level of summer inflows, promoted shorter residence times in summers than winters. Interannual variability was primarily associated with the natural variability of inflow volume (25%), and secondarily with surface meteorology (8%). Excessive withdrawals prior to and during long dry periods, however, drastically reduced the reservoir storage capacity and withdrawal options, leading to shorter residence times in dry years. The flushing time, calculated as the ratio of storage volume and flow rates, captured the trends in annual mean residence times reasonably well but not daily and seasonal residence times.
 Nutrient and anthropogenic inputs via surface and groundwater flows are often the major biochemical sources to aquatic systems. The fate and impact of these land-borne compounds in reservoirs and lakes depend, to a large extent, on the physical processes of transport and mixing. The average time a conservative tracer introduced with the inflow at a given time resides in an aquatic system, hereafter referred to as “residence time,” incorporates both the mixing and transport regime, and the time available for biochemical transformations. This time scale has been frequently suggested as a major factor affecting thermal stratification, primary production, nutrient cycling, and, in general, the water quality of lakes and reservoirs [Simek et al., 2011; Monsen et al., 2002]. In spite of the biological and chemical implications of the residence time, its calculation is not without problems [Straskraba, 1999]. A widely used first-order approximation of the average residence time is the “flushing” time Tf [Foy, 1992; Sivadier et al., 1994; Straskraba et al., 1995], estimated as the ratio of the average volume of water V stored in the reservoir to an average volumetric flow rate Q:

Tf = V / Q    (1)
In spite of being a simple ratio of reservoir volume and flow rate, the application of this equation to managed stratified reservoirs is far from straightforward [Straskraba, 1999]. These systems are, in general, highly dynamic, being subject to seasonal and short-term changes in inflow Qin and withdrawal rates Qwdr, and hence in stored volumes V. Mediterranean reservoirs, in particular, experience large seasonal and interannual water level fluctuations as a consequence of (1) large variations in runoff volumes at interannual, seasonal, and daily time scales, and (2) high water demands (withdrawals) concentrated in the summer months [Naselli-Flores, 1999]. It is therefore not clear what values of Q (and V) should be entered in equation (1). Nor is it clear at what temporal averaging scales (e.g., annual, monthly, or biweekly) equation (1) is appropriate. More importantly, seasonal stratification imposes severe restrictions on vertical mass transport. Under stratified conditions, inflows will form intrusions of limited vertical extent, outflows will typically be drawn from a narrow range of depths, and parts of the reservoir may be isolated from the inflow and outflow processes, remaining in the reservoir for periods of time much longer than predicted from equation (1). Hence, equation (1) may not be appropriate as an estimate of the residence time of inflows in these kinds of systems [e.g., Toja, 1982; Pérez-Martínez et al., 1991].
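As a minimal sketch, equation (1) can be evaluated in a few lines of Python, combining it with the through-flow definition Q = (Qin + Qwdr)/2 introduced later in section 2.2 (the function name and unit conventions here are ours, not the paper's):

```python
def flushing_time(volume_hm3, q_in_m3s, q_wdr_m3s):
    """First-order flushing time Tf = V/Q (equation (1)), in days.

    Q is taken as the average through-flow rate (Qin + Qwdr)/2, as in
    section 2.2. Volume is given in hm^3, flows in m^3/s.
    """
    q_through = 0.5 * (q_in_m3s + q_wdr_m3s)   # m^3/s
    volume_m3 = volume_hm3 * 1e6               # 1 hm^3 = 10^6 m^3
    return volume_m3 / (q_through * 86400.0)   # convert seconds to days

# Reservoir at full capacity (165 hm^3, section 2.1) with the long-term
# mean inflow (~11.4 m^3/s) and balanced withdrawals:
tf = flushing_time(165.0, 11.4, 11.4)
```

With balanced flows the through-flow equals the inflow, so the estimate reduces to V/Qin (here roughly half a year); the rest of the paper examines when this simple ratio breaks down.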
 The need for more accurate and realistic expressions to quantify residence time in lakes and reservoirs is forcing aquatic scientists to take into account a number of factors such as the timing of stratification, the depth of the thermocline, and the variability of the inflows. One example is the simple continuous function developed by George and Hurley to calculate residence time in thermally stratified lakes. The approach of George and Hurley, however, does not take into account some fundamental aspects of the physical behavior of Mediterranean reservoirs. Stratification in these systems, for example, is characterized by the existence of thick metalimnetic layers, as a result of high discharge rates from intermediate or deep outlets and high surface insolation [Casamitjana et al., 2003]. River water in these cases forms intrusions at variable depths [Rueda et al., 2006; Vidal et al., 2012] and will not flow directly into the epilimnion, as was assumed by George and Hurley. A more physically based approach to determining retention time scales in reservoirs was recently proposed by Rueda et al., in which the effects of stratification, water level fluctuations, and changes in inflow and outflow rates were taken into account. That approach was based on the use of a physically based transport model and the definition of average residence time as stated in standard treatises of chemical engineering [e.g., Rueda and Cowen, 2005]. In the results of Rueda et al., the mean residence time calculated for river inflows every 5 days over a 1 year period exhibited large and complex variations, as a consequence of the interaction among the physical processes of lake mixing, inflows, and selective withdrawals. Those changes in residence time occurring at yearly, seasonal, or subseasonal time scales could be important in determining the structure and functionality of the ecosystem, and the behavior of the reservoir as a chemical reactor.
Padisak et al., for example, demonstrated in Lake Müggelsee (Germany) that changing flow rates and, hence, flushing times could influence the composition of the phytoplankton assemblages in the water column, favoring the development of cyanobacteria to the detriment of other groups. Unfortunately, little is known about the variability of river water residence time in lakes and reservoirs from subseasonal to interannual time scales, whether that variability is controlled solely by hydrologic forcing, or whether meteorological forcing also contributes. It is also unclear whether residence time variability can be modified through decisions concerning how the dam and the outlets are operated.
 The goals of the present work are (1) to characterize the time variability (from daily to interannual scales) of the average residence time of river-borne substances in reservoirs; and (2) to analyze the natural and anthropogenic factors that control reservoir residence time both during low flows and extreme runoff events. The physically based modeling procedures proposed by Rueda et al. were followed to estimate residence times over a period of 8 years. This time span allowed a full exploration of the intrinsic variability of the Mediterranean climate at several temporal scales and the effects of management alternatives tailored to it. The study is organized as follows. First, the modeling results based on the observational data are analyzed to unravel the factors controlling residence time at relevant time scales. Second, the main sources of residence time variability are further explored using synthetic scenarios, in which key climate and management inputs are modified, with particular attention to the effects of withdrawal depth on the seasonal evolution of residence times. Third, the residence times during extreme runoff events are analyzed in detail. Finally, implications of the simulations for reservoir water quality management are discussed qualitatively.
2.1. Site Description
 The Sau Reservoir (41°58′N, 2°22′E) is a eutrophic and monomictic canyon-shaped reservoir (Figure 1a) located in the central reach of the Ter River in northeastern Spain. It is the first in a series of reservoirs supplying water to approximately 3 million people in the metropolitan area of Barcelona. When full, the reservoir stores up to 165 hm3 of water, and its average inflow rate is ∼11.4 m3 s−1. Stored volumes and inflow and outflow rates, though, undergo large changes at interannual, seasonal, and subseasonal scales due to the natural variability associated with the Mediterranean climate and hydraulic management practices. The total length of the reservoir is 20 km. A 3.6 km long and 1.3 km wide lacustrine zone, as termed by Kimmel et al., is located next to the dam, with a maximum depth of 60 m. Further upstream, the reservoir is narrower (maximum width of 100 m) and meandering (Figure 1a). The Ter River enters the reservoir near its western boundary. Outflows are regulated with three withdrawal outlets located at different elevations (Figure 1b). Withdrawal elevations in the Sau are selected so that the best water quality (in terms of particulate organic matter concentration and absence of reduced soluble substances) is released downstream into the Susqueda Reservoir, while water of poorer quality is retained.
2.2. Observational Data
 Meteorological and hydrological data were available for the Sau Reservoir during a 9 year period, from 1998 to 2006 (simulation period). Hourly wind speed, air temperature, precipitation, and incoming short-wave solar radiation were recorded near the dam, 70 m from the shoreline and 12 m above the maximum water surface elevation. Surface heat fluxes into the reservoir were calculated from meteorological data and simulated near-surface water temperature using bulk-parameter methods, following Fischer et al. Incoming solar radiation, air temperature, and surface heat fluxes varied strongly on a seasonal basis (Figures 2a, 2b, and 2d). For example, air temperatures (Figure 2b) ranged from 0–4°C in winter to >20°C in summer. In comparison, the variability between years was much smaller, as indicated by the narrow shaded area (<5°C). The warmest year was 2003, with an annual mean temperature of 13.7°C (dark dashed line in Figure 2), while 2005 was the coldest, with an annual mean of 12.3°C (light dashed line). Wind speed exhibited moderate seasonal and interannual variations, as indicated by the flat heavy line and narrow shaded area in Figure 2c. Annual mean wind speeds ranged between 1.7 m s−1 during the windiest year (2001) and 1.4 m s−1 during the calmest (2005). Wind speeds from January through May were generally higher than during other months of the year.
 Daily inflow and outflow rates and stored volumes in the reservoir were available from the Catalan Water Agency (ACA) and Aigües Ter Llobregat (ATLL), the water supply company managing outflows from the Sau Reservoir. During the simulation period, annual inflows into the reservoir (Figure 3a, black columns) averaged 340 hm3 yr−1, with significant interannual variability (standard deviation = 120 hm3 yr−1). To further characterize the hydrological variability among years, the aridity index AI was calculated as the ratio of the mean potential evapotranspiration (water demand) to mean precipitation (water supply) for the entire watershed contributing to the reservoir, following Arora. This index varied between 0.75 and 2 (dashed line in Figure 3a), which are typical values for subhumid regions. Only during wet years (2002 and 2003), when inflow rates were at their maximum values (Qin > 450 hm3 yr−1), did the annual supply of water exceed the demand (AI < 1). River inflows varied on seasonal and synoptic scales associated with episodic rainfall and snowmelt events during early winter and spring (Figure 3b). On average, the annual withdrawal volumes (Figure 3a, white columns) mimicked the inflow volumes (Figure 3a, black columns) within ±40 hm3. Note that during the two years with the lowest inflows, 1998 and 2005, the withdrawals exceeded the inflows by 65 and 50 hm3, respectively. To account for the effects of variable inflows and withdrawals (Qin ≠ Qwdr) in replenishing and flushing the reservoir, the volumetric flow rate Q in equation (1) was taken as the average through-flow rate, defined as Q = (Qin + Qwdr)/2.
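The aridity index is a simple ratio of demand to supply; a short sketch with hypothetical annual watershed totals (the actual Sau values are not reproduced in this excerpt):

```python
def aridity_index(pet_mm, precip_mm):
    """Aridity index AI = mean potential evapotranspiration (demand)
    divided by mean precipitation (supply), after Arora.
    AI < 1 only when supply exceeds demand (wet years)."""
    return pet_mm / precip_mm

# Hypothetical annual totals bracketing the 0.75-2 range reported above:
ai_wet = aridity_index(700.0, 950.0)   # wet year -> AI < 1
ai_dry = aridity_index(820.0, 450.0)   # dry year -> AI > 1
```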
 As part of a long-term water quality monitoring program funded by ATLL, inflow temperatures, light attenuation in the water column, and 1 m vertical resolution reservoir temperature profiles were collected monthly, at noon (local time), from 1998 to 2006. A total of 169 temperature profiles were available. A regression equation was developed to estimate river water temperatures (θin) on day i from average air temperatures (θa) measured on previous days as follows:
This equation is similar to that used by Rueda et al. and incorporates 1167 data points in its derivation (R2 = 0.95, p < 0.0001, root mean square error RMSE = 1.44°C).
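The regression coefficients of equation (2) are not reproduced in this excerpt, so the fitting procedure is illustrated below on synthetic data: river temperature is generated as a lagged, noisy function of air temperature and then recovered by least squares (all numbers are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic seasonal air temperature (one year, daily)
days = np.arange(365)
theta_air = 12.0 + 8.0 * np.sin(2 * np.pi * (days - 100) / 365)

# Average air temperature over the previous `lag` days
lag = 5
theta_air_lagged = np.convolve(theta_air, np.ones(lag) / lag, mode="valid")

# Hypothetical "observed" river temperature: lagged air temp + noise
theta_river = 1.5 + 0.9 * theta_air_lagged \
    + rng.normal(0.0, 0.5, theta_air_lagged.size)

# Least-squares fit of river temperature on lagged air temperature
slope, intercept = np.polyfit(theta_air_lagged, theta_river, 1)
pred = intercept + slope * theta_air_lagged
rmse = np.sqrt(np.mean((theta_river - pred) ** 2))
```

The fit recovers the generating slope (0.9) and an RMSE near the injected noise level, mirroring the goodness-of-fit statistics quoted above.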
2.3. Model Simulations
 Daily estimates of river water residence time were obtained, following Rueda et al., from a series of conservative pulse tracer release experiments simulated with the dynamic reservoir simulation model (DYRESM) [Imberger and Patterson, 1981]. A process-based one-dimensional model, DYRESM includes descriptions of mixing and transport processes associated with river inflow, natural or manmade outflows, vertical diffusion in the hypolimnion, and mixed-layer dynamics; it is used to predict the variation of water temperature and salinity with depth and time. The lake is conceptualized as a stack of horizontal layers which are free to move vertically, and to contract and expand in response to hydrologic or meteorological forcing (e.g., layers can mix together, have inflows inserted, or outflows removed). Outflows are taken from horizontal withdrawal layers of limited vertical extent centered approximately at the level of the outlet. Inflow parcels, each corresponding to a single day's inflow, remain separate from the main layer structure until they reach their level of neutral buoyancy. Once at this level, they leave the river channel and become inserted in the layer stack, mixing with the water existing in the host layer, and forming over-, under-, or interflows depending on whether the insertion level is at the surface, at the bottom, or in between, respectively. The path of each inflow parcel traveling vertically from the inflow section toward the insertion layer, and entraining ambient fluid as it progresses, is explicitly simulated in the model. Intrusions and withdrawal layers are, in general, narrow, their vertical extent depending on the flow rate and the degree of density stratification [Fischer et al., 1979].
The model has been extensively described in the literature and successfully applied to simulate the vertical thermal and salinity structure of a wide range of lakes [see Perroud et al., 2009, and references therein], and, in particular, the Sau Reservoir [Han et al., 2000]. The success of DYRESM in reproducing the vertical density structure of lakes and reservoirs implies that the level of process description, including temporal and spatial scales in the model, is fundamentally correct [Hamilton and Schladow, 1997].
 The model was set to simulate the thermodynamics and transport processes in the Sau Reservoir during a 9 year period. The model was forced with observed daily meteorological and hydrological variables (see Figures 2 and 3), and the temperature predictions were compared with the temperature profiles collected in the reservoir from 1998 to 2006. This was considered a first check to make sure that the vertical position of the inflow intrusions was predicted correctly.
 Tracer experiments consisted of injecting a pulse of one conservative tracer of mass M0 on each day i of the 8 year study period. The mass of that tracer leaving the reservoir thereafter, until the end of the 9 year simulation period (day n), was monitored. Hence, 2920 simulations were conducted, each one with the conservative tracer released on a different day. The average residence time of the river water entering the reservoir during day i, Ti, was then calculated as follows [Rueda et al., 2006]:

Ti = (1/M0) Σk=i..n (k − i) Cwdr,k Qwdr,k    (3)
Here Cwdr,k and Qwdr,k represent tracer concentration in water and flow rates withdrawn on day k. For all the experiments conducted from 1998 to 2005 (study period), 99% of the mass released M0 was recovered at the outlet at the end of the simulation.
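One plausible discretization of the mean residence time defined in equation (3), assuming daily time steps with withdrawal rates expressed as daily volumes (the helper name below is ours):

```python
import numpy as np

def mean_residence_time(days_since_release, c_wdr, q_wdr, m0):
    """Outflow-mass-weighted mean time since release: one plausible
    discretization of equation (3) with daily time steps.

    days_since_release : k - i for each day k after the release day i
    c_wdr : tracer concentration in withdrawn water (mass per m^3)
    q_wdr : withdrawal volume per day (m^3/day)
    m0    : tracer mass released on day i
    """
    mass_out = c_wdr * q_wdr   # tracer mass leaving the reservoir each day
    return np.sum(days_since_release * mass_out) / m0

# Toy check: a pulse leaving uniformly over days 1-10 after release
# should have a mean residence time at the midpoint, 5.5 days.
k = np.arange(1, 11)
q = np.full(10, 100.0)   # m^3/day withdrawn
c = np.full(10, 1.0)     # total recovered mass = 10 * 100 = 1000
t_i = mean_residence_time(k, c, q, m0=1000.0)
```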
 The elevation of the river water intrusion in the reservoir Zin was estimated as follows. The first day after the release with simulated tracer concentrations in the reservoir above zero was initially identified. The tracer profiles on that day were then inspected to determine the elevation where the concentration was maximal. This approach, though approximate, was a good estimate of the intrusion elevation, given the limited vertical extent of the intrusions under stratified conditions.
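The procedure reduces to locating the concentration maximum in the first nonzero tracer profile; a sketch with a hypothetical Gaussian-shaped intrusion:

```python
import numpy as np

def intrusion_elevation(elevations, tracer_profile):
    """Estimate the river intrusion elevation Zin as the elevation of
    maximum tracer concentration on the first day the tracer appears
    in the reservoir (section 2.3)."""
    return elevations[np.argmax(tracer_profile)]

# Hypothetical profile: a narrow intrusion centred at 420 m a.s.l.
z = np.arange(400.0, 441.0)                  # 1 m vertical resolution
c = np.exp(-0.5 * ((z - 420.0) / 2.0) ** 2)  # Gaussian intrusion
z_in = intrusion_elevation(z, c)
```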
 The stability of the water column was characterized using lake numbers LN, defined as the ratio between the stabilizing moments due to the stratification and the destabilizing moments associated with wind forcing, both of them referring to the center of volume. Low lake numbers (LN < 1) are indicative of unstable water columns. High lake numbers, in turn, are indicative of stable water columns. Lake number values were calculated from daily simulated reservoir temperatures and observed daily averaged winds following Stevens and Imberger. In these calculations, the elevation of the thermocline ZT was defined as the center of the metalimnion, taken as the layer where the squared buoyancy frequency N2 exceeded a threshold value of 10−3 s−2 [Hoyer et al., 2009].
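A sketch of this thermocline definition, assuming a density profile on a vertical grid and the N2 > 10−3 s−2 metalimnion criterion (the conversion from temperature to density is omitted, and the profile below is hypothetical):

```python
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def buoyancy_frequency_sq(z, rho, rho0=1000.0):
    """Squared buoyancy frequency N^2 = -(g/rho0) * d(rho)/dz,
    with z increasing upward (m) and rho in kg/m^3."""
    return -(G / rho0) * np.gradient(rho, z)

def thermocline_elevation(z, rho, threshold=1e-3):
    """Z_T: centre of the metalimnion, taken as the layer where N^2
    exceeds the threshold (10^-3 s^-2 here, after Hoyer et al., 2009).
    Returns None for an unstratified column."""
    n2 = buoyancy_frequency_sq(z, rho)
    meta = z[n2 > threshold]
    if meta.size == 0:
        return None
    return 0.5 * (meta.min() + meta.max())

# Hypothetical stratified profile: pycnocline centred at z = 25 m
z = np.arange(0.0, 40.5, 0.5)
rho = 1000.0 - 2.0 * np.tanh((z - 25.0) / 3.0)
z_t = thermocline_elevation(z, rho)
```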
 To explore the impact of management practices and natural meteorological and hydrological variability on daily residence time in the Sau Reservoir, a range of synthetic scenarios were simulated (see Table 1). We refer to the base case run as scenario 0, in which the simulations were forced with observations. In scenario 1, withdrawals were presumed to occur exclusively through the bottom outlet. In scenario 2, in addition, water level fluctuations were limited by setting the withdrawal rates from the bottom outlet equal to the observed inflow rates. Four additional subscenarios (2a–2d) were simulated with equal inflow and outflow rates to address the impact of varying withdrawal elevations (see Table 1). The tracers in these additional subscenarios were released every 5 days, and not daily as for the other scenarios. In scenario 3, the inflow and withdrawal rates were assumed constant. In the last scenario (scenario 4), river temperatures θin (equation (2)) were low-pass filtered (cutoff frequency of 1/60 days), leaving meteorological factors (Figure 2) as the sole source for short-term (daily and synoptic) variability.
Table 1. Description of Scenarios Modeled With DYRESM, Showing Mean (Min, Max) of Daily Input Parameters During the 8 Year Study Period (1998–2005)
[Table not reproduced here. Columns give, for each scenario, the withdrawal elevation Zwdr (m a.s.l.), the withdrawal rate Qwdr (hm3 day−1), and the inflow rate Qin (hm3 day−1). Surviving row labels include "Full natural variability and hydraulic management (base case)" for scenario 0 and "Short-term river temperature (and hence intrusion depth)". Footnotes: mean values during the 9 year simulation period (1998–2006); reservoir volume representing initial conditions (day one of base case scenario).]
2.4. Model Simulations Quality Checks
 Even though longitudinal transport is not explicitly simulated in DYRESM [Hocking and Patterson, 1991], the daily estimates of mean residence times T are still expected to be valid, as long as the travel times of intrusions across the length of the reservoir TL are short compared to the vertical travel times of the insertion layers in the reservoir from their initial position to the withdrawal depth, hence, if TL ≪ T. Only when the intrusion forms at the withdrawal elevation can it be expected that T ∼ TL. But even in that case, our estimates of T are reasonable given that the insertion layer will behave as a continuous reactor, in which the mean residence times do not depend on whether mixing is assumed to be infinitely fast, as in DYRESM (or in other one-dimensional models), or whether the motion is purely advective, as represented in higher-dimensional models.
 Longitudinal travel times were calculated, a posteriori, and compared to T as a form of quality check. These estimates of TL differ depending on whether river inflows form over-, inter-, or underflows when entering the reservoir. An overflowing parcel can travel to the dam in approximately TL ∼ 9 days, based on average wind-driven surface currents of 0.025 m s−1 estimated in the top 5 m of the Sau Reservoir [Marcé et al., 2007]. An interflowing parcel travels longitudinally as a result of a balance between inertial and buoyancy forces [Ford and Johnson, 1986], at a speed uL which can be estimated as follows [Fischer et al., 1979]:
Here Bi, Qi, and Ni represent the width of the lake, the flow rate, and the buoyancy frequency at the level of the insertion. Travel times as intrusions were calculated based on average dimensions of the reservoir, observed inflow rates, and simulated buoyancy frequencies at the level of insertion. Finally, underflowing parcels will reach the dam wall as gravity currents, and their travel time from the inflow sections to the dam is explicitly resolved in DYRESM.
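As a quick consistency check on the ∼9 day figure quoted above for overflows, the estimate follows directly from the reservoir length (section 2.1) and the reported surface current speed:

```python
# Overflow travel time: reservoir length / wind-driven surface speed
length_m = 20_000       # total reservoir length, section 2.1 (20 km)
u_surface = 0.025       # m/s, top-5 m average [Marce et al., 2007]
tl_days = length_m / u_surface / 86400.0   # seconds -> days
```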
2.5. Analysis of Variability
 The variability of residence times (and that of all other variables used in this work) was assessed on three different time scales (interannual, seasonal, and short-term) using the following approach, which is based on the method originally proposed by Feng and Qingcun to characterize interannual variability and seasonality of wind fields. The value of the variable X at a given time is identified as Xij, where the index j identifies the year, and i the day of the year. The average value of Xij during the year j, X•j, is calculated as

X•j = (1/nj) Σi=1..nj Xij    (5)
Here nj is the number of days in year j. The mean of the variable during the study period, X••, was estimated by averaging equation (5) over all N years in the study period, as

X•• = (1/N) Σj=1..N X•j    (6)
 To analyze the time variability of the variable Xij, a 30 day moving average was first calculated, denoted as X̃ij. These filtered values were then averaged among years to get the interannual average of smoothed values (Figure 2, heavy lines), as follows:

X̃i• = (1/N) Σj=1..N X̃ij    (7)
These averaged values represent the evolution of variable X in a typical year. The interannual variability is referred to as ΔXy and was characterized as the averaged standard deviation of the filtered values from their interannual mean in equation (7), i.e.,

ΔXy = (1/365) Σi=1..365 [(1/(N−1)) Σj=1..N (X̃ij − X̃i•)2]1/2    (8)
Note that this approach quantifies the magnitude of interannual variability and not differences between years. Hence it does not establish whether some years have high values or low values.
 Seasonality, δXs, was defined by Feng and Qingcun as the mean difference between summer (June–August) and winter (December–February) months:

δXs = ⟨X̃i•⟩Jun–Aug − ⟨X̃i•⟩Dec–Feb    (9)

where ⟨·⟩ denotes averaging over the days of the indicated months.
Seasonality, as evaluated in equation (9), can be either positive or negative, depending upon whether the summer values were higher or lower, respectively, than the winter values. Building upon this approach, seasonal variability was characterized as the average deviation between summer and winter, i.e.,

ΔXs = |δXs| / 2    (10)
The benefit of this definition over, for example, the standard deviation of the interannual mean is that it considers only the variability between summer and winter and discards any random monthly variability.
 Finally, the short-term (daily and synoptic) variability ΔXd was characterized as the mean standard deviation of the daily simulated values from their 30 day moving averages, as

ΔXd = (1/N) Σj=1..N [(1/nj) Σi=1..nj (Xij − X̃ij)2]1/2    (11)
These dimensional measures of variability (equations (8), (10), and (11)) can be normalized further against the averaged value of the variable during the whole study period, X•• (equation (6)). In their normalized form (ΔXy/X••, ΔXs/X••, and ΔXd/X••), the variables were used to compare different management scenarios.
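Under simplifying assumptions (365 day years, an edge-normalized 30 day moving average, and invented synthetic data), the three variability measures of this section can be sketched as follows; the function and variable names are ours:

```python
import numpy as np

def moving_avg(x, w=30):
    """Moving average with edge normalization (partial windows at the
    series ends are averaged over the available points only)."""
    kernel = np.ones(w)
    return (np.convolve(x, kernel, mode="same")
            / np.convolve(np.ones_like(x), kernel, mode="same"))

def variability_metrics(x_by_year):
    """Sketch of the section 2.5 decomposition for an (N_years, 365)
    array. Returns:
      d_xy - interannual variability: mean over days of the standard
             deviation across years of the smoothed values;
      d_xs - seasonality: mean summer (JJA) minus mean winter (DJF)
             value of the typical (interannually averaged) year;
      d_xd - short-term variability: mean standard deviation of daily
             values about their 30 day moving average.
    Day-of-year windows assume non-leap years."""
    smoothed = np.array([moving_avg(x) for x in x_by_year])
    typical = smoothed.mean(axis=0)
    d_xy = smoothed.std(axis=0, ddof=1).mean()
    jja = slice(151, 243)        # 1 June - 31 August
    djf = np.r_[0:59, 334:365]   # December, January, February
    d_xs = typical[jja].mean() - typical[djf].mean()
    d_xd = np.sqrt(np.mean((x_by_year - smoothed) ** 2, axis=1)).mean()
    return d_xy, d_xs, d_xd

# Synthetic 4 year record: seasonal cycle + year-to-year offset + noise
rng = np.random.default_rng(1)
days = np.arange(365)
season = -10.0 * np.cos(2 * np.pi * (days + 31) / 365)  # summer maximum
x = np.array([season + off + rng.normal(0.0, 1.0, 365)
              for off in (-2.0, -1.0, 1.0, 2.0)])
d_xy, d_xs, d_xd = variability_metrics(x)
```

For this construction the year-to-year offsets set the interannual measure, the ±10 cycle sets the seasonality, and the unit noise sets the short-term measure; each metric recovers roughly its injected magnitude.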
3.1. Base Case Simulations
 The results of the base case simulations are shown in Figure 4. Simulated water level elevations varied considerably with time, e.g., from 50 m a.b. in 1998 to 30 m a.b. in 1999 (Figure 4a), consistent with the observations. The temperature predictions (shading, Figure 4a) were also consistent with the observations, with an RMSE of 1.56°C, similar to that found in other long-term temperature modeling studies using DYRESM [Gal et al., 2003; Perroud et al., 2009]. The RMSE, though, tended to be larger in the surface layer during summer months (1.95°C), a result that has been attributed to limitations of the heat transport model in DYRESM [Perroud et al., 2009]. The lake stratified in early March and remained stratified until late in the year, overturning in November or December, as previously reported by Armengol et al. Lake number values LN were high in summer, of order 60, and low in winter, of order 10−1.
 The intrusion elevation Zin of the Ter River in the Sau Reservoir was consistently deeper in fall and winter and shallower in summer (Figure 4b). These simulated seasonal changes in the river intrusion level are in accordance with the field observations reported by Marcé et al. [2008a] and are the result of the larger thermal inertia of the reservoir compared with the river inflow [Andradottir and Nepf, 2000]. The withdrawal elevation also changed with the seasons, following water quality management decisions. Note that the uppermost outlet (49.5 m a.b.; Figure 1b) was never used during the 8 year study period: 81% of the time it was above the free surface, and when it was submerged it was, on average, only 3 m below the free surface and within the surface mixed layer, where algal biomass tends to accumulate [Marcé et al., 2007]. Withdrawing water so close to the surface did not comply with the water managers' goal of supplying the highest quality water to the reservoirs downstream of the Sau. Water was mainly withdrawn from the middle outlet when water levels were sufficiently high, to avoid oxygen-deficient hypolimnetic waters. Only when the water level dropped near or below the intermediate outlet was water withdrawn from the bottom gate.
 Simulated mean residence times T for river substances entering the Sau each day are shown in Figure 4c. The average residence time during the 8 year study period, T••, was 94 days. In comparison, the daily estimates of the longitudinal travel time TL from equation (4) were approximately 8 days, varying from 2 to 20 days (5 and 95 percentiles, respectively) depending on the reservoir inflow rates and stratification (Figures 3b and 4a). These estimates are consistent with the three-dimensional (3-D) modeling results of Vidal et al. in the Sau, which showed inflow intrusions reaching the dam 7 days after entering the reservoir, at times with inflow rates of O(1) m3 s−1. On average, the estimates of TL represent only 12% of the simulated mean residence times T, supporting the validity of the results shown in Figure 4c. The residence time experienced large oscillations around this average. There were days for which T ≪ T••, suggesting that river-borne substances traveled through the reservoir with limited vertical mixing. These “short-circuiting” events occurred in fall (September–October) and, sometimes, in spring (April–May), typically when withdrawal levels closely matched the intrusion depth of the river (Figure 4b). By contrast, there were days for which T ≫ T••, hinting that river-borne substances were not effectively flushed through the reservoir volume but rather stored in so-called dead zones. This condition, referred to as trapping, typically occurred during late winter in the Sau Reservoir, just prior to the onset of stratification (February–March). The Ter River plunged all the way to the reservoir bottom at that time. Thereafter, in spring and summer, river intrusions formed closer to the surface and stratification inhibited the flushing of deep water. Hence, river water entering during late winter, forming deep intrusions, was trapped in the hypolimnion, and, consequently, had long residence times.
These deep intrusions are useful in that they inject oxygen directly into the hypolimnion and help maintain deep layers far from anoxia during the first months of the stratification period [Armengol et al., 1999; Marcé et al., 2008a, 2008b].
 Residence times also varied at interannual and seasonal time scales. The year with the shortest mean residence time (T•j = 69 days) was 1999. This was also the year with the lowest flushing time (Tf = 56 days), as a consequence of low reservoir storage volumes and medium through-flows (see equation (1) and Figures 3a and 4a). Furthermore, river intrusions formed close to outlet elevations during a large portion of this year (Figure 4b). The year with the longest mean residence time (T•j = 130 days) was 2004, which was also the year with the largest storage volumes, through-flows, and vertical distances between river intrusion and reservoir outlet. There was, in fact, a weak (R2 = 0.3) relationship between the AI and the yearly averaged residence times. Seasonally, residence times tended to be longer in winter and shorter in summer, especially in those years when withdrawals occurred at similar elevations as the river intrusion (1999–2002 and 2005; Figure 4b).
3.2. Factors Controlling Residence Times and Relevant Time Scales
Rueda et al. argued that the transport pathways of river-borne substances toward the outlet of a reservoir, and hence the residence time T, depend on three factors: (1) the elevation of inflow intrusions relative to the withdrawal elevation; (2) the hydrologic history after release, which refers to reservoir volumes, inflows, and outflows; and (3) vertical stability/mixing after release, generally parameterized in terms of the lake number LN. A linear correlation analysis was conducted to relate the daily estimates of residence times calculated from the simulations (equation (3) and Figure 4c) to the values of factors (1)–(3), on a daily basis and on 10, 20, 30, up to 100 day averages following the river entry. The major results of the correlation analysis are presented in Table 2. The highest correlation was found between the residence time and the vertical separation between the river inflow intrusions and the outflow gate on the day of river entry (R2 ≈ 0.3), highlighting the importance of withdrawal elevation strategy in managed reservoirs. The alignment of the points along the dashed line in Figure 5a supports the use of linear regression analysis. The average slope of this regression line was 4 days for every vertical meter of separation.
Table 2. Selected Summary of Regression Analysis Between Simulated Daily Residence Times (Base Case) and Controlling Factors Over Various Time Scales During the 8 Year Study Period (n = 2920)
[Table not reproduced here. Columns give the regression R2 of each controlling factor with daily T, evaluated daily and over moving averages of increasing length (days). Footnote: adjusted for periods with no through-flow, for which 1/Q goes to infinity.]
 Among the hydrological factors analyzed, the flushing time calculated as the ratio of average storage volumes and through-flows during the 30–80 days after river entry correlated best with daily residence times (R2 ≥ 0.2; Table 2). This suggests that the hydrological history at time scales of 30–80 days after river entry, corresponding to 30%–75% of the average residence time T•• = 94 days, is the most relevant in determining daily residence time. The correlation between residence time and independent hydrological components such as reservoir volume, inflow, and outflow rates was considerably lower (R2 ≤ 0.1; Table 2). Figure 5b plots the simulated residence times as a function of the flushing time 40 days after entry, Tf-40d. The points align around the 1:1 slope near the intercept but then diverge in two opposing directions, corresponding to the anomalies of short-circuiting (T ≪ Tf) and trapping (T ≫ Tf). Excluding these anomalies, the best first-order correlation for this data set is considered to be linear. The calculated linear slope through the entire data set, forcing the intercept to zero (Figure 5b, light dashed line), is less than one, suggesting that the flushing time, as defined by equation (1), may slightly overestimate the daily residence times of river water in the Sau Reservoir. This may be linked to the thermal stratification in summer. Figure 5c indicates that summer stability, represented by natural logarithms of the lake number greater than one, promotes shorter residence times. The highest linear correlation (R2 = 0.14) is found between simulated residence times and the water column stability evaluated 30–40 days after the release (Table 2).
 Lastly, given that the daily residence times were mostly correlated with the relative elevation of inflows to outflows and with the hydrological history, correlations of residence time with two additional physical parameters capturing both processes were considered: first, the effective transport volume of the river in the reservoir, Veff, defined as the volume bounded by the depths of river intrusion and withdrawal, and second, the effective flushing time Teff, calculated as Veff divided by the through-flows Q. Results show that daily residence times were less correlated with these physical parameters (R2 ≤ 0.2; Table 2). Therefore, the alternative approach was taken of conducting a linear multiple-regression analysis of the three most relevant factors identified, i.e., the relative elevation Zint − Zwdr, the flushing time Tf-40d, and the water column stability.
This approach yielded the highest correlation (R2 = 0.47; Figure 5d). The scatter of the points in the multiple-regression plot, even if one excludes outliers and extreme values, as well as in the other three plots in Figure 5 (R2 < 0.5 in all cases), highlights the complexity of determining residence time in managed reservoirs.
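A regression of that form can be reproduced schematically with ordinary least squares; the code below is a sketch with synthetic data, where `dz`, `tf40`, and `ln_lake` stand in for Zint − Zwdr, Tf-40d, and the lake-number stability term, and the coefficients are illustrative rather than the fitted values of the study:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
dz = rng.uniform(0, 30, n)         # Zint - Zwdr analogue, m
tf40 = rng.uniform(40, 140, n)     # windowed flushing time analogue, days
ln_lake = rng.normal(1.0, 0.8, n)  # ln(lake number) analogue

# Synthetic response with noise (coefficients are illustrative only)
t_res = 10 + 1.5 * dz + 0.5 * tf40 - 8.0 * ln_lake + rng.normal(0, 15, n)

# Design matrix with an intercept column; solve by least squares
X = np.column_stack([np.ones(n), dz, tf40, ln_lake])
beta, *_ = np.linalg.lstsq(X, t_res, rcond=None)

# Coefficient of determination of the multiple regression
pred = X @ beta
ss_res = ((t_res - pred) ** 2).sum()
ss_tot = ((t_res - t_res.mean()) ** 2).sum()
r2 = 1 - ss_res / ss_tot
```

The scatter of `t_res` around `pred` plays the role of the residual spread visible in Figure 5d: even the best three-factor fit leaves substantial unexplained variance.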
3.3. Sources of Variability in Residence Times
 Variability in reservoir residence time results from in situ factors, which can be both natural and human-managed. The dimensional variability in key climate and management inputs during the base case (Table 3, right side) suggests that meteorological parameters vary mostly seasonally (e.g., solar radiation and air temperature) and/or daily (e.g., wind speeds), and only to a limited extent interannually (i.e., ΔXs, ΔXd ≫ ΔXy). River inflows and withdrawal volumes, however, vary mostly on a daily basis, then interannually, and least on a seasonal basis (ΔXd > ΔXy > ΔXs). To characterize the sources of interannual, seasonal, and daily variability in residence times in the Sau Reservoir, four synthetic scenarios (Table 1) were run in DYRESM. Simulated residence times for each scenario are shown in Figure 6, and their normalized variability is summarized in the left-hand side of Table 3.
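Equations (8), (10), and (11) are not reproduced here, but following the descriptions given in the conclusions (interannual variability as average deviations between annual means, seasonal variability as half the summer-winter range, and daily variability as average deviations from seasonal means, all normalized by the long-term mean), the decomposition can be sketched as:

```python
import numpy as np

def variability_components(x, year, month):
    """Normalized interannual, seasonal, and daily variability (%) of a
    daily series x, following the decomposition described in the text."""
    x, year, month = map(np.asarray, (x, year, month))
    mean = x.mean()

    # Interannual: mean absolute deviation of annual means from the overall mean
    annual = np.array([x[year == y].mean() for y in np.unique(year)])
    dxy = np.abs(annual - mean).mean() / mean

    # Seasonal: half the range between summer (JJA) and winter (DJF) means;
    # the sign indicates whether summer values exceed winter values
    summer = x[np.isin(month, (6, 7, 8))].mean()
    winter = x[np.isin(month, (12, 1, 2))].mean()
    dxs = 0.5 * (summer - winter) / mean

    # Daily: mean absolute deviation from each month-of-year mean
    monthly = {m: x[month == m].mean() for m in np.unique(month)}
    dxd = np.abs(x - np.array([monthly[m] for m in month])).mean() / mean

    return 100 * dxy, 100 * dxs, 100 * dxd

# Example: two years of synthetic daily data with a summer maximum
days = np.arange(730)
year = 1998 + days // 365
month = 1 + (days % 365) // 31  # approximate month of year
x = 100.0 + 10.0 * np.isin(month, (6, 7, 8))
dxy, dxs, dxd = variability_components(x, year, month)
```

Because the example series repeats identically in both years and is constant within each month, only the seasonal component is nonzero, and its positive sign marks larger summer values.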
Table 3. Analysis of Interannual (ΔXy), Seasonal (ΔXs), and Short-Term (ΔXd) Variability in Key DYRESM Model Input and Output Parameters, as Defined by Equations (8), (10), and (11). [Table body not recovered; the left side reported simulated residence times T for the various scenarios (base case, partial hydraulic management, limited hydraulic management, no hydrologic variability, limited river temperature variability), with T̄ over 1998–2005 (days) and normalized variability (%); the right side reported the model input parameters (scenario 0). Footnote a: the symbols and procedures are explained in the text.]
 First consider synthetic scenario 1 (Figure 6b), in which withdrawals were limited to the bottom outlet, the only existing outlet that could be used for the entire simulation period given the large variations in free surface elevation (Figure 4b). Interannual and short-term variability was similar in magnitude (+3% and −3%, respectively; Table 3, left side) to the base case scenario (Figure 6a). Seasonal trends were, however, slightly changed. Specifically, during the wet years 2002–2005, larger values of T tended to occur in summer (Figure 6b), contrary to the base case, in which they occurred in winter (Figure 6a). This is reflected in the sign reversal of the normalized seasonal variability between scenarios 1 (+9%) and 0 (−24%) in Table 3. The greater vertical separation between river intrusion and withdrawal levels in scenario 1 could explain these seasonal changes. These results indicate that the management of withdrawal outlets may significantly affect the seasonal evolution of daily residence times in a reservoir, while having only minimal impact on the interannual and short-term variability.
 In scenario 2 (Figure 6c), withdrawal rates were set equal to inflow rates at all times so that the volume of water stored in the reservoir remained nearly constant. In the absence of hydraulic management, the longest residence times occurred more systematically in summer each year, also reflected in the higher normalized seasonal variability (25% in scenario 2, compared to 9% in scenario 1; see Table 3). The normalized interannual and short-term variability remained similar to those for scenarios 0 and 1. The simulated residence times were, in general, shorter during the wet years of 2002 and 2003 than in the other years. This was contrary to scenarios 0 and 1 (Figures 6a and 6b), for which simulated residence times tended to be longer during the wet years. Hence, the management of withdrawal volumes, which in turn largely governs reservoir storage volume, contributed to longer residence times in wet years and shorter residence times in dry years in the Sau Reservoir.
 Setting the hydrological parameters constant (scenario 3) produced a significant drop in the normalized interannual variability, from >30% to 8% (Figure 6d and Table 3). This suggests that the primary source of interannual variability in reservoir residence times (25%) is associated with hydrological inputs to the reservoir, and the secondary source (8%) with meteorological inputs. This agrees with the finding that interannual variability is more pronounced in hydrological than in meteorological factors (Table 3, right side). By contrast, the seasonal variability of residence times in this scenario was more pronounced than in the previous scenarios (37%; Table 3). This can be partially explained by the higher reservoir volumes in this scenario (Table 1) and by the greater importance of seasonal meteorological variability (Table 3, right side) in the absence of hydrological variability. Normalized short-term variability was unchanged (∼30%).
 Lastly, filtering out the short-term variability in river water temperatures (scenario 4) drastically reduced the day-to-day variations in residence times, as seen from the limited scatter of data points in Figure 6e and an 11% drop in normalized short-term variability, from 28% to 17% (Table 3). This suggests that 11% of the short-term variability in residence times is driven by variations in river temperature, an important factor determining the river intrusion depth (Figure 4b). The remaining 17% is associated with surface meteorological factors such as wind, air temperature, and solar radiation. Interannual and seasonal variability were, however, similar in scenarios 3 and 4.
3.4. Effect of Withdrawal Elevation on Seasonal Evolution of Residence Time
 The results from section 3.3 suggest that the management of withdrawal elevations is the main factor governing the seasonal evolution of residence times. Simulations of scenario 2 (constant reservoir volumes, Qin = Qwdr; Table 1) with different withdrawal elevations confirmed this (Figure 7): Shallow withdrawals (scenario 2d) coincided with the river intrusion elevation in summer, yielding minimal Zint − Zwdr (Figure 7a) and short summer residence times (Figure 7b). The difference between summer and winter residence times was considerable, reflected in the large negative normalized seasonal variability of Δ = −48%. By contrast, withdrawals from the bottom and below-middle outlets (scenarios 2 and 2a) produced minimal Zint − Zwdr, and short residence times, in winter. The normalized seasonal variability was positive and more moderate (Δ = 10%–25%). The simulations also suggest that withdrawal elevations affect reservoir water budgets and river intrusion dynamics: Shallow withdrawals (scenario 2d) resulted in a 3% higher reservoir storage volume, a 0.7 m higher water level, and river intrusions 6 m closer to the free surface than when water was withdrawn from the bottom outlet (scenario 2). These differences were not driven by hydrological variables, which were identical for all scenarios considered. Instead, they result from reduced evaporation as well as a shallower and more stable thermocline promoted by shallow withdrawals in the Sau Reservoir [Moreno-Ostos et al., 2008].
3.5. Extreme Runoff Events
 The hydrologic regime in the Sau is largely driven by discrete rainfall or snowmelt events. Prevailing conditions during the eight largest inflow events in 1998–2005 (numbered in Figure 3b) are summarized in Table 4. Each discrete event lasted several days and represented up to 22% of the annual inflow during the year in which it occurred. Four of the runoff events were rainfall driven in fall (October through December), and the remaining four were driven by spring rainfall and snowmelt (end of February through May). A considerable range in intrusion depth and daily residence time was simulated over the duration of each event, as well as between events. During seven of the eight events, residence times ≤20 days were simulated at least once, indicating some short-circuiting of riverine water (i.e., Tmin ≪ T̄ = 94 days; Table 4). To obtain representative values at the event scale, weighted averages using daily inflow as the weighting variable were calculated (Table 4). The inflow-weighted average T showed a strong positive correlation with the water volume accumulated in the reservoir during the storm event, represented by Qin − Qwdr (R2 = 0.68, p < 0.05, n = 8), while neither the inflow-weighted average Zint − Zwdr nor the inflow-weighted average LN showed a significant association (R2 < 0.07, p > 0.50, n = 8). These relationships imply that if water is accumulated during the storm event (i.e., the outflow is restricted and there is enough free volume to store the incoming water), the average residence time T during the storm will increase, with an apparently minor role of other variables such as Zint − Zwdr or LN. This result suggests that the residence time is heavily influenced by the role of the reservoir as a storage system during a particular event.
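The inflow-weighted event averages of Table 4 amount to weighting each daily value by that day's inflow, so that high-discharge days dominate the event-scale figure; a minimal sketch with hypothetical event data:

```python
import numpy as np

def inflow_weighted_mean(values, inflow):
    """Average of a daily quantity over an event, weighted by daily inflow,
    so that high-discharge days dominate the event-scale value."""
    values = np.asarray(values, float)
    inflow = np.asarray(inflow, float)
    return (values * inflow).sum() / inflow.sum()

# Hypothetical 5-day event: daily residence times (days) and inflows (hm3/day)
t_daily = [15.0, 40.0, 90.0, 120.0, 60.0]
q_in = [2.0, 10.0, 25.0, 12.0, 4.0]
t_event = inflow_weighted_mean(t_daily, q_in)
```

In this example the weighted event average exceeds the plain mean, because the longest residence times coincide with the highest-discharge days.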
Table 4. Summary of DYRESM Base Case Simulations During Extreme Inflow Events in 1998–2005
 A more subtle behavior becomes apparent when analyzing residence times on a daily scale (i.e., without averaging over the entire event). When the inflow and outflow were similar (i.e., no large water accumulation), the plot of T versus Zint − Zwdr showed a crisp relationship around the 1:4 linear dependence similar to that found for the entire data set (Figure 8a; see also Figure 5a). But when outflows were curtailed and the reservoir acted as a storage system, this relationship collapsed (Figure 8b). Large values of T can occur under such circumstances, even during days with near-zero values of Zint − Zwdr. All in all, these results stress again the effect of water storage on residence time during storms, but also highlight the strong short-circuiting that may occur if Zint − Zwdr is small and outflows and inflows are of similar magnitude.
4.1. Theoretical Flushing Times as Indicators of Residence Times
 The theoretical flushing time Tf, defined as in equation (1), is the most commonly used method to characterize the average residence time of river water in lakes and reservoirs. This approach is straightforward, as it can be easily determined from readily available observations, but, as revealed by Figure 5b and Table 2, it does not provide a valid estimate of the residence times on a daily basis. Other factors, such as the relative elevation of river inflows and the water column stability, need to be taken into account to explain changes in daily residence times during nonstorm events. Determining these factors requires, though, that river and reservoir temperatures, as well as the intrusion dynamics, are well characterized. Furthermore, even if these factors are well known, one cannot predict the residence times on a short-term basis from their values, as revealed, for example, by the scatter of the data represented in Figure 5.
 The value of the theoretical flushing time Tf may change drastically in managed reservoirs, depending on the scale of interest, owing to large variations in both storage volume and hydrological through-flows. Reservoirs are not vertically mixed at seasonal or subseasonal scales, since stratification inhibits vertical mixing. Hence, equation (1) is not valid at those scales, as it assumes perfect mixing. However, annual flushing times, determined as the ratio of annual average storage volume and through-flow rate, differed on average by only 22% (range 2% to 42%, n = 8) from the annual average simulated residence times (Figure 9). A closer relationship was found between the annual theoretical flushing times and the inflow-weighted simulated residence times, with deviations averaging 13% (range 1% to 44%, n = 8). To conclude, Figure 9 suggests that a theoretical flushing time based on annual (or multiannual) hydrological history may be a good first indicator of the actual time river inflows reside in a reservoir.
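The annual comparison behind Figure 9 reduces to evaluating equation (1) with annual mean volume and through-flow and measuring the relative deviation from the simulated mean residence time; a sketch with hypothetical annual figures (not the study's data):

```python
import numpy as np

# Hypothetical annual means for four years: storage V (hm3),
# through-flow Q (hm3/day), and simulated mean residence time T (days)
V = np.array([110.0, 95.0, 130.0, 120.0])
Q = np.array([1.4, 1.1, 1.0, 1.3])
T_sim = np.array([85.0, 80.0, 140.0, 100.0])

Tf_annual = V / Q                              # equation (1) at the annual scale
deviation = np.abs(Tf_annual - T_sim) / T_sim  # relative deviation per year
mean_dev_pct = 100 * deviation.mean()          # average deviation, %
```

A small average deviation, as in the 22% (or 13% inflow-weighted) reported above, is what qualifies the annual flushing time as a first-order indicator.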
4.2. Hydraulic Management Practices During Low Inflows
 The Sau is the first of a series of reservoirs that supply water to the city of Barcelona. Current management practices in the Sau are aimed at protecting the water quality in the downstream reservoirs. This is achieved through withdrawals from intermediate levels, avoiding both the surface waters, with their large algal concentrations, and the deeper waters, which are depleted of oxygen. In most years, the withdrawal depths have coincided quite well with the river intrusions in summer, resulting in moderate summer residence times (Figures 4b and 4c). If the main management goal were to optimize the long-term water quality in the Sau itself, irrespective of downstream systems, withdrawals closer to the bottom might be more appropriate. They could reduce the risk of eutrophication by promoting the renewal of bottom waters and the fast flushing of compounds redissolved from the sediment and of materials settling from the epilimnion. However, it has been argued that deep withdrawals would prolong summer residence times in the epilimnion, which might promote the occurrence of cyanobacteria blooms [De Hoyos and Comín, 1999], suggesting that surface withdrawals may be convenient during summer. Benefits from surface withdrawals during blooms have been associated with reductions of the epilimnetic residence time below the algal doubling time, thus decreasing algal concentration [Moreno-Ostos et al., 2008]. This would also reduce the amount of organic matter reaching the sediments, so that fewer nutrients would be available for redissolution in the following years [Toja et al., 1982; Toja et al., 1992]. However, this is controversial, because the dynamics of the phytoplankton community do not depend exclusively on the relationship between water residence time and algal doubling time, since deeper withdrawals would also deepen the mixed surface layer, potentially counteracting any positive feedback between cyanobacteria, algae, and residence time [Paerl and Huisman, 2008].
 Another point to consider is that epilimnetic withdrawals could strengthen the thermal stability of the water column. This could potentially lead to reduced entrainment rates of hypolimnetic nutrients into the epilimnion during summer [Moreno-Ostos et al., 2008]. However, Rigosi and Rueda found the opposite, that is, that epilimnetic withdrawals might lead to increased entrainment rates across the thermocline. Considering these contradictory findings, it becomes evident that a complete assessment of the best withdrawal strategy to mitigate eutrophication in a reservoir needs to account for many variables (phytoplankton and zooplankton biomass, anoxia development, transparency changes, etc.), which is beyond the scope of the present work. However, the suitability of alternating episodic surface withdrawals during blooms with more sustained bottom withdrawals should be investigated.
 The residence time in the Sau exhibited significant interannual variability (Figure 9 and Table 3), which is affected by the withdrawal volume strategy set by the water quality managers. In particular, the extensive withdrawals in 1998 had severe long-term consequences, as they lowered the water elevation in the reservoir to the point where only the bottom outlet could be used over a 1.5 year period (Figure 4b). With only one operative outlet, the ability to control the water quality supplied to downstream reservoirs was curtailed. In particular, during this period river water was short-circuited because withdrawals occurred at a similar elevation to the river intrusions. This lack of flexibility in withdrawal elevation, in conjunction with the reduced reservoir storage capacity and the need for continuous water supply, lowered the reservoir residence time. As residence time is a measure of the time land-borne nutrients and pollutants are stored and processed in the reservoir, this shorter residence time reduced the capacity of the reservoir to act as a sink and hence to improve downstream water quality. These observations raise interesting questions regarding the need to include long-term hydrologic forecasts in the management of reservoirs, the need to use higher storage thresholds during dry years before starting to reduce outflows, and whether dams should include a larger number of outlets (as also discussed by Marcé et al.), thus increasing the options available when choosing the withdrawal elevation.
4.3. Management During Extreme Runoff Events
 A significant fraction of the annual inflows in Mediterranean regions enters reservoirs during discrete runoff events. Consequently, mass fluxes of nutrients and land-borne pollutants to reservoirs also peak during these events. If the goal is to protect downstream water bodies by limiting mass fluxes of potentially harmful substances, short-circuiting must be avoided. Our analysis suggests that storing the inflows during flood events (i.e., minimizing the withdrawals) can be a simple option to promote longer residence times, enhancing the probability that river-borne substances are biogeochemically transformed and settle in the reservoir. The frequency distribution of strong inflow events in Mediterranean areas favors this strategy, since storms occur infrequently and usually follow an extended period of low inflows that keeps the water level in reservoirs rather low, leaving plenty of room for storm management. However, if the storm event is large relative to the storage capacity of a given reservoir, or if several storms concatenate, management of withdrawal quantities may not be an option. In these cases, increasing the magnitude of Zint − Zwdr, of either sign, can be an excellent alternative to increase residence time, but the reservoir must include the engineered structures needed to apply such a strategy (i.e., more than a single outlet). If the goal is instead to decrease river water residence time in the reservoir, the opposite practice of avoiding water storage during the storm while minimizing Zint − Zwdr should be considered.
5. Conclusions
 1. The residence time of river water in a managed monomictic reservoir varies on daily, seasonal, and interannual scales. Using a conservative tracer, the transport and mixing of river water through a reservoir were simulated with DYRESM, and the mean residence times were calculated as the first moment of the tracer fluxes at the reservoir outlets. Daily variability of residence times was calculated as the average deviation from seasonal means. Seasonal variations were quantified as half the range between summer and winter months, and interannual variations as the average deviations between individual years. The sources of variability in residence times were studied through simulations in which outflow rates and elevations, inflow rates, and river temperatures were made constant, one at a time.
 2. Daily variations in residence time represented approximately 30% of the interannual mean residence time and were predominantly associated with meteorological (17%) and river temperature (11%) variability. Unusually long residence times (>180 days) resulted for January and February, corresponding to the trapping of Ter River water in the hypolimnion just prior to the onset of stratification, after which river intrusions formed closer to the surface. Short-circuiting, or days with extremely short residence times (<10 days), occurred in fall (and spring) when reservoir withdrawals closely matched the intrusion depth of the river.
 3. Residence times exhibited seasonal variations as a result of (a) strong monthly variations in river intrusion elevations and (b) the management of withdrawal depths. The management strategy followed in the Sau, consisting of withdrawing water from the outlet closest to the free surface, is aimed at avoiding the deep, poor-quality water of the hypolimnion, which acts as a sink of nutrients and reduced compounds in the reservoir in summer. The overall result of this strategy is better water quality in the downstream reservoirs than in the Sau. This strategy also affects the residence times of river water in the Sau, generally reducing the residence times in summer, when the river enters close to the reservoir surface, while prolonging them in winter. In contrast, deep withdrawals promote long residence times in summer. For Mediterranean reservoirs experiencing large seasonal and interannual water level fluctuations, this finding highlights the need for flexibility in withdrawal elevations, e.g., in the form of a large number of outlets at the dam.
 4. Residence times in the Sau Reservoir varied on average by approximately 30% between years. The sources of this interannual variability were primarily the natural variability in inflowing volumes (25%) and secondarily the surface meteorological forcing (8%). The management of withdrawal volumes, and hence of storage volumes, however, was found to alter trends in annual residence times: Shorter residence times occurred during dry years as a result of extensive withdrawals, which limited the reservoir storage capacity and the withdrawal elevation options.
 5. Residence times during extreme runoff events showed considerable variability. The most important factor affecting the residence time was the difference between inflow and withdrawal volumes. Reducing the withdrawal volumes during extreme events in fall and spring was introduced as a potential tool to mitigate the risk of short-circuiting, and hence of mass fluxes to downstream water bodies. If the available water storage capacity was limited during the storm, strong short-circuiting might occur. In this case, maximizing the difference between intrusion depth and withdrawal depth is an alternative management option.
 6. The flushing time Tf, defined as in equation (1), is a straightforward and commonly used method to characterize residence time. This parameter represented reasonably well the annual and interannual mean residence times in the Sau Reservoir but fell short in describing daily and seasonal variations. The latter required taking into account in situ reservoir characteristics, such as the elevation of river inflows as they enter the reservoir, as well as the hydrological history and water column stability in the following month. None of this information is readily available; it requires a thorough understanding of the system and, preferably, a running reservoir model that has been validated against field observations.
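The first-moment calculation described in point 1 can be sketched as follows for a hypothetical daily series of tracer mass flux leaving the reservoir after a pulse release with the inflow:

```python
import numpy as np

def residence_time_first_moment(time_days, tracer_outflux):
    """Mean residence time as the first moment of the tracer flux
    leaving the reservoir: T = sum(t * f(t)) / sum(f(t))."""
    t = np.asarray(time_days, float)
    f = np.asarray(tracer_outflux, float)
    return (t * f).sum() / f.sum()

# Hypothetical exponential washout with an e-folding scale of 60 days
t = np.arange(0, 600)
flux = np.exp(-t / 60.0)
T = residence_time_first_moment(t, flux)
```

For a well-mixed basin with exponential washout, the first moment recovers a residence time close to the e-folding scale; the simulated tracer curves of the study depart from this idealization whenever short-circuiting or trapping occurs.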
Notation
net heat flux into the reservoir, W m−2.
Qin: inflow volumes, hm3 day−1.
Qwdr: withdrawal volumes, hm3 day−1.
Q: through-flow volumes (Qin + Qwdr)/2, hm3 day−1.
Tf: reservoir flushing time, days.
T: daily simulated residence time, days.
Teff: effective flushing time, days.
longitudinal travel time, days.
air temperature, °C.
river water temperature, °C.
solar radiation, W m−2.
V: reservoir storage volume, hm3.
Veff: effective volume associated with river transport, hm3.
wind speed at 10 m above surface, m s−1.
elevation in reservoir, m a.b./a.s.l.
Zint: intrusion elevation of inflowing river water, m a.b./a.s.l.
Zwdr: withdrawal elevation, m a.b./a.s.l.
thermocline elevation, m a.b./a.s.l.
ΔXy: interannual variability of variable X, assessed as in equation (8).
ΔXs: seasonal scale variability of variable X, assessed as in equation (10).
ΔXd: subseasonal scale (daily) variability of variable X, assessed as in equation (11).
Acknowledgments
 Part of this work was conducted while the first author was visiting the University of Granada, funded by the regional government of Andalucía (Ayudas Individuales, Junta de Andalucía) and through the Plan Propio of the University of Granada. The Spanish government also provided financial support for this study through projects CGL2004-05503-C02-01, CGL2008-06101, and CGL2008-06377-C02-01. This is a contribution of the Consolider-Ingenio 2010-SCARCE project, funded by the Spanish Ministry of Economy and Competitiveness (CSD2009-00065). We are grateful to the Centre for Water Research, University of Western Australia, for making the DYRESM-CAEDYM model available. We thank everyone involved in the field work at the Sau Reservoir, and the ATLL Water Supply Company for funding the long-term monitoring.