Base flow, also known as low flow or drought flow, in a natural river system originates mainly from drainage of the riparian aquifers in the upstream basin. Base flow data can therefore provide a quantitative estimate of the basin-scale groundwater storage changes that have taken place over the period of record. The concept was implemented and validated with data from two large river basins in Illinois. On average over the past two-thirds of a century, shallow groundwater storage in Illinois, derived from the drought flows, has been increasing at a rate of around 0.05 to 0.10 mm a−1. However, the trend has not been stationary throughout this period; for instance, the more recent trend, from the mid-1990s until now, appears to have been negative. The groundwater storage changes inferred from the base flows are generally consistent with the average groundwater level changes measured in an observation well network over the same area.
 Changes in underground water storage constitute a critical component of the terrestrial water budget over a wide range of spatial and temporal scales, with broad ramifications for human activities related to environmental quality monitoring, water resources development, and general climate studies. Despite its importance, as yet very few reliable records of underground water storage are available that are long or comprehensive enough to allow meaningful diagnoses for water resource planning or climate change purposes.
 To remedy this void, in this paper a method is proposed to infer long-term shallow groundwater storage changes from streamflow records. In the absence of precipitation, the observed flow in a natural stream is fed nearly entirely by outflows from the upstream riparian aquifers in the basin; these, in turn, are directly controlled by the water storage levels in these aquifers. In what follows, the principle of the method is first explained in detail. It is then implemented and tested by means of data observed in two river basins in the State of Illinois, one of the few areas for which long-term groundwater level measurements are available.
2. The Relationship Between Groundwater Storage and River Outflow From a Basin
2.1. Underlying Concepts
 Physical considerations, mostly based on hydraulic groundwater theory, suggest that in many situations of interest the total groundwater storage in a basin can be approximated as a power function of flow rate at the basin outlet [e.g., Brutsaert and Nieber, 1977; Brutsaert, 2005; Rupp and Selker, 2006], namely
where y = Q/A is the rate of flow in the stream per unit of catchment area, Q is the volumetric flow rate, A is the area of the catchment, S is the volume of water per unit area stored in the upstream aquifers above the zero-flow level, and Kn and m are (constant) parameters that depend on the physical characteristics of the basin in question; the parameter m typically ranges between 0.5 and 1.0. While reasonable and conceptually appealing arguments can be made in support of several of the theoretical formulations, it is still not easy to determine m in an objective way from invariably noisy streamflow observations, and no reliable methods are available for this purpose in the case of large basins. Moreover, as will be shown next, the value m = 1 leads to an exponential decay function for the flow rate, and indeed many analyses of field data have indicated that in the range of the lowest flows the recession can usually be fitted by this type of function; as a result, the exponential function is still the most commonly used today. This means that the parameter m can be assumed to approach one as the flow rate becomes small, and that in practice robust results can usually be obtained by adopting the value m = 1. Thus in many hydrologic applications related to drought flows, (1) can be conveniently simplified to
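Written out in this notation, the power function (1) and its linear simplification (2) presumably take the form (a reconstruction from the symbol definitions given here, not a verbatim quotation of the original equations):

```latex
S = K_n\, y^{m} \qquad\qquad (1)

S = K\, y \qquad\qquad (2)
```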
in which K is the characteristic timescale of the catchment drainage process, also commonly referred to as the storage coefficient; the quantity 0.693 K is the storage half-life of the basin. To estimate terrestrial water storage in a basin from streamflow measurements it is necessary to determine this timescale.
2.2. The Characteristic Drainage Timescale
 In the absence of recent precipitation and other input events, the water flowing in a river or other type of stream comes mainly from the drainage of the upstream water-bearing geological formations. For such low flow or drought flow conditions, the conservation of mass equation can be written as
where t is the time. Substitution of (2) in (3) yields
whose solution is
if y0 is the outflow rate at the arbitrarily selected time origin.
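Written out, the mass balance (3), the result of substituting (2) into it (4), and its solution (5) are presumably of the form (reconstructed to be term-by-term consistent with the surrounding definitions):

```latex
\frac{dS}{dt} = -\,y \qquad\qquad (3)

K\,\frac{dy}{dt} = -\,y \qquad\qquad (4)

y = y_0\, e^{-t/K} \qquad\qquad (5)
```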
 The application of the linear storage function (2) appears to have originated in the context of base flow separation from storm runoff by means of (5) [e.g., Barnes, 1939]; the standard procedure was tantamount to plotting streamflow data from episodes of base flow recession on semi-logarithmic graph paper, and fitting a straight line to the lower points to determine K. Many other methods have been proposed in the past [e.g., Hall, 1968; Brutsaert and Nieber, 1977; Nathan and McMahon, 1990]. However, nearly all of them have some speculative and arbitrary aspects in their application, so that the objective determination of K remains a difficult problem. The main reason for this is undoubtedly the fact that (5) and the underlying assumptions are too idealized and therefore incapable of encompassing all the intricacies of the drainage processes in natural catchments. Another difficulty is that, in any procedure to estimate K, drought flows must be identified by carefully scrutinizing concurrent precipitation records, to ensure that no storm runoff is present and that the streamflow results only from upstream groundwater outflows. Unfortunately, this requirement is not easy to satisfy especially in large river basins, because the raingage network density is never sufficient to capture all events everywhere in the basin. This requirement has been overlooked in many published analyses. Finally, low flows, especially when close to zero, are normally subject to unavoidable measurement error and possibly other uncertainties.
2.3. Typical Magnitude and Variability of the Drainage Timescale
 On account of the difficulties surrounding the objective estimation of K by any of the available methods, it is necessary to take a closer look at its physical nature. Some insight can be gained by considering it in the framework of hydraulic groundwater theory [e.g., Brutsaert, 2005]. In the linearized version of this approach the local long-time outflow rate q, from an initially saturated homogeneous unconfined aquifer underlain by a horizontal impermeable bed, into a shallow river channel is
In (6), q is expressed as a volume per unit time and per unit length of river channel, the symbol k0 is the hydraulic conductivity, p is an empirical weighting constant ranging roughly between 0.3 and 0.1 (or perhaps even smaller as the water table further declines in the range of lowest flows), D is the vertical thickness of the saturated layer at t = 0, B is its breadth, that is, the distance from the channel to the valley divide, ne is its drainable porosity, and t is the time after the start of drainage. A basin-wide outflow rate can be derived by assuming, first, that the parameters in (6) are representative for the entire upstream basin, so that q can be considered an average value, related to the total basin outflow by
and second, that an effective aquifer breadth can be defined as
In (7) and (8), L is the total length of all tributary and main river sections upstream from the gaging station, where the streamflow Q is measured, and Dd = (L/A) is known as the drainage density. These two assumptions allow (6) to be rewritten in the form of (5), with a drainage timescale roughly of the order of
in which Te = k0pD is the effective hydraulic transmissivity of the aquifers in the basin.
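One reconstruction of (6)–(9) that is consistent with the definitions in the text and with the standard long-time solution of the linearized Boussinesq equation is the following (the exact coefficients are an assumption and should be checked against the original):

```latex
q = \frac{2\,k_0 p D^2}{B}\,\exp\!\left(-\,\frac{\pi^2 k_0 p D}{4\, n_e B^2}\, t\right) \qquad (6)

q = \frac{Q}{2L} \qquad (7)

B = \frac{A}{2L} \qquad (8)

K \approx \frac{4\, n_e B^2}{\pi^2\, k_0 p D} = \frac{n_e}{\pi^2 D_d^{\,2}\, T_e} \qquad (9)
```

Note that the e-folding time of (6) is exactly the first form of (9), and substituting B = 1/(2Dd) from (8) yields the second form.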
Equation (9) indicates that the basin drainage timescale depends mainly on the drainable porosity ne and the hydraulic transmissivity Te of the contributing unconfined aquifers in the basin, and on the drainage density Dd of the river network. While river basins, especially the larger ones, may contain a wide variety of geological features, through erosion and weathering the physical properties of the resulting aquifers and the structure of the river network will tend to evolve over time toward a certain equilibrium state. There are strong experimental indications that the drainage density Dd is often quite insensitive to basin size A within regions with homogeneous lithology [e.g., Morisawa, 1962; Brutsaert and Lopez, 1998]; while no doubt other factors are at play [e.g., Gregory and Gardiner, 1975; Gardiner et al., 1977], this near independence of Dd from A is most pronounced and definite whenever the drainage network exhibits some kind of similarity or fractal behavior [e.g., Smart, 1972, p. 334; Maritan et al., 1996], as is often the case. Moreover, it stands to reason that landscapes with higher transmissivities k0pD tend to need fewer channels per unit area and thus have smaller drainage densities Dd, and vice versa; in other words, in regions with variable lithological characteristics, the product (Dd²Te) is likely to be less variable and more robust than either Dd² or Te separately. The drainable porosity ne varies within a relatively narrow range; in any event, it tends to be larger for more permeable materials, and vice versa, so that the ratio (ne/Te) should also be relatively invariant. All this suggests, on the basis of (9), that in large basins many of the controlling effects will “average out” and/or compensate one another; thus K values in different basins within a similar climate can be expected to be of a similar order of magnitude and to vary within a relatively narrow range, independently of drainage area A or terrane.
 This is borne out by observational evidence. For instance, as mentioned, Barnes [1939] was among the first to estimate K from careful drought flow observations; for the Iowa River he obtained K = 50 days at Iowa City (A = 8360 km2) and K = 58 days at Marshalltown (A = 3880 km2); he later [Barnes, 1959] felt that K = 50 days is broadly applicable, and used it in his analysis of the Karnafuli River (A = 9840 km2) in Bangladesh. Linsley et al. derived K = 33 days for Stony Creek at Johnstown, Pennsylvania (A = 1170 km2). After screening out days affected by rainfall events, Brutsaert and Lopez [1998] analyzed drought flows in 21 catchments within the Washita River Experimental Watershed complex in Oklahoma; for the 15 larger catchments, ranging between 16 and 540 km2, they obtained an average value (±σ) of K = 45 ± 14 days. More recently, Brutsaert and Sugita obtained K = 43 days for the Kherlen River at Undurkhaan in Mongolia (A = 39,400 km2). Using a totally different approach, in the calibration of their HSPF hydrologic model, Singh et al. [2005, Table 1] obtained a daily groundwater recession rate parameter of 0.98 for the Iroquois River watershed (A = 5568 km2) in Illinois and Indiana, which is equivalent to K = 49 days.
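The equivalence between a daily recession rate parameter r and the timescale K follows directly from (5): if each day's flow is a constant fraction r of the previous day's, then

```latex
y_{i+1} = r\, y_i = y_i\, e^{-1/K} \;\;\Rightarrow\;\; K = \frac{-1}{\ln r} = \frac{-1}{\ln 0.98} \approx 49.5\ \text{days}
```

which rounds to the K = 49 days quoted above.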
 These observations confirm that K tends to be relatively invariant in larger basins, and suggest that it is of the order of 1.5 months with an uncertainty of about two weeks. Thus when K is difficult to determine or when mainly order of magnitude estimates are required, such as for climatic purposes, in large river basins a typical value of K = 45 ± 15 days can be adopted as a working assumption.
2.4. Long-Term Evolution of Groundwater Storage in the Basin
 As indicated by (1) and (2), the groundwater storage in a catchment is directly related to the base flow in the river. However, base flow goes through highs and lows in response to antecedent precipitation events over the region; therefore, for any given period it is necessary to select a feature of the base flow record that best characterizes the groundwater storage over that same period. While other choices would certainly be possible, an objective way to track the long-term evolution of this base flow over many years consists of monitoring its lowest level each year. Indeed, this represents the base flow that is the most sustainable in the course of that year and that can be depended upon for the following year. This means that, in principle, the long-term trend of dependable storage should be derivable from the trend of the lowest daily flows for each year of the period of record. However, because individual daily flows are normally subject to error and other uncertainties, it may be advisable to use some type of running average, such as the annual lowest seven-day flow, denoted here as yL7 = QL7/A, as a more reliable measure for this purpose. The annual lowest seven-day flow is a common measure used in drought statistics.
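As an illustration, the annual lowest seven-day flow can be computed from a daily discharge series with a short routine like the following (a sketch only; the function name and array-based interface are assumptions, not part of the original analysis):

```python
import numpy as np

def annual_low7(daily_flow, years):
    """Annual lowest seven-day flow yL7: for each year, the minimum of
    the 7-day running mean of the daily flow (per unit catchment area)."""
    daily_flow = np.asarray(daily_flow, dtype=float)
    years = np.asarray(years)
    low7 = {}
    for yr in np.unique(years):
        q = daily_flow[years == yr]
        if q.size < 7:          # not enough days to form one 7-day window
            continue
        # 7-day running means via convolution with a uniform window
        running = np.convolve(q, np.ones(7) / 7.0, mode="valid")
        low7[int(yr)] = float(running.min())
    return low7
```

The yearly values obtained this way can then be regressed against time to estimate the trend dyL7/dt.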
3. Example Application and Validation of the Concept in Illinois
3.1. Available Streamflow and Groundwater Level Data
 Long term streamflow data are available for many river basins in the world where the proposed method can be applied. However, there are very few instances where concurrent groundwater storage observations, with which the method can be tested, have also been made. One notable exception is the State of Illinois, where in the late 1950s, under the leadership of William C. Walton, a shallow groundwater observation well network was put into operation by the Illinois State Water Survey (ISWS). Not all the wells were installed at once, and meanwhile some have been discontinued. Most of the observation wells with the longest records are located within two major river basins, whose streamflow has been recorded by the U.S. Geological Survey since the early 1940s, and which cover around 64% of the State's total area. The site identifiers of the two river gaging stations are USGS 05586100 Illinois River at Valley City, IL (A = 69,227 km2) and USGS 05446500 Rock River near Joslin, IL (A = 24,721 km2); the daily discharge data at these two stations used in this study were obtained from the USGS web interface (for a map see also http://www.sws.uiuc.edu/warm/pmfd/stations.asp). Nine groundwater observation well stations, with records starting no later than 1965, were deemed suitable for the analysis. Their identifiers are Barry (w61), Greenfield (w132), Snicarte (w91), and SWS No.2 (w181) in the Illinois River Basin; Galena (w21), Mt. Morris (w31), Cambridge (w11), and Crystal Lake (w41) in the Rock River Basin; and St. Peter (w153) outside, but not far from, the Illinois River Basin. Stations w11 and w41 are nearly on the divide, so that they can also be included in the calculations for the Illinois River Basin; the data of w153 were used only in the calculations of the overall trends for the combined basins.
The other stations in the Illinois well observation network either had records that were too short or they were too far outside the Illinois River and Rock River catchments. The data and a location map of the nine selected well stations can be obtained from the ISWS web interface. To avoid the likelihood of ice conditions, which normally occur in winter and which invalidate the groundwater-base flow relationship (2), use was made only of river discharge and groundwater level data recorded between April 15 and October 31.
3.2. Drainage Timescale of the River Basins
 The long-term trend in groundwater storage dS/dt can be directly estimated from the base flow trend dy/dt by means of (2), but this requires knowledge of the characteristic drainage timescale K. Although, as noted in section 2.3, K is relatively invariant, it was decided to determine it afresh for the two Illinois river basins of the present application with the method of Brutsaert and Nieber [1977; see also Brutsaert, 2005]. In this procedure, which can also accommodate possible nonlinear behavior like (1), the drought flows are analyzed by considering the lower envelope of a logarithmic plot of (−dy/dt) data versus the corresponding y data; in the special linear case this relationship is given by (4), and K can be determined directly. The method was implemented by plotting values of (yi−1 − yi+1)/2 against yi (in which the subscript i refers to the flow on the ith day) during recessions of the daily flow time series. Ideally, flows which are not strictly base flow, that is, flows during and immediately following precipitation, must be eliminated. However, in basins of this size the raingage network density is inadequate to capture all events everywhere in the basin. As mentioned earlier, no objective method is available to estimate K from streamflow data. In the present case, to maximize the likelihood of selecting recession flows that constituted “pure” drought flow, the following criteria were used in the selection process: eliminate all data points with positive or zero values of dy/dt, as well as sudden anomalous ones; eliminate 3 data points after the last positive or zero dy/dt, and 4 data points after major events; eliminate 2 data points before dy/dt becomes positive or zero; and eliminate data points in a drying sequence that are suddenly followed by a data point with a larger value of −dy/dt. This procedure results in a cloud of points.
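The screening criteria above can be sketched in code as follows (an illustrative reading of the criteria, not the actual implementation used in the study; the function name, the default gap lengths, and the handling of series endpoints are assumptions):

```python
import numpy as np

def recession_points(y, gap_after=3, gap_before=2):
    """Select (y_i, -dy/dt) pairs for a Brutsaert-Nieber type recession
    plot, screening out points during and immediately around events."""
    y = np.asarray(y, dtype=float)
    n = y.size
    # centred difference: dy/dt on day i approximated by (y[i+1] - y[i-1]) / 2
    dydt = np.full(n, np.nan)
    dydt[1:-1] = (y[2:] - y[:-2]) / 2.0
    keep = dydt < 0.0                 # retain strict recession days only
    nonrecession = ~keep              # dy/dt >= 0, or undefined endpoints
    for i in np.where(nonrecession)[0]:
        # drop a few days after (and a couple before) each non-recession day
        keep[i + 1 : i + 1 + gap_after] = False
        keep[max(i - gap_before, 0) : i] = False
    # drop points in a drying sequence that are suddenly followed by a
    # point with a larger value of -dy/dt (anomalous steepening)
    for i in range(1, n - 1):
        if keep[i] and i + 1 <= n - 2 and -dydt[i + 1] > -dydt[i]:
            keep[i] = False
    return y[keep], -dydt[keep]
```

For a purely exponential recession, the retained points fall on the unit-slope line −dy/dt = y/K of the logarithmic plot, in accordance with (4).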
To make some allowance for the unavoidable error in these data points, the lower envelope was established by keeping roughly 5% of the points below it. This is illustrated in Figure 1 for the case of the Rock River. The straight line shown has a unit slope in accordance with (4) with a value of the coefficient K = 37 days. Application of the method with the discharge data of the Illinois River yielded K = 46 days. It is reassuring that both values are of the same order as those mentioned for the other large river basins in section 2.3.
3.3. Basin-Scale Groundwater Storage Trends
 The groundwater storage trends were determined as follows. First, the lowest seven-day flow yL7 was determined for each year of record, as the lowest value of the seven-day running averages. Figure 2 illustrates the evolution of these flows observed on the Illinois River at Valley City. The temporal trends of the annual low flows yL7 were then directly calculated by simple linear regression from which the trends of the annual groundwater storage were estimated on the basis of (2), that is by means of
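In symbols, this estimate presumably reads (a reconstruction that follows directly from (2)):

```latex
\frac{dS}{dt} = K\,\frac{dy_{L7}}{dt} \qquad\qquad (10)
```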
 The results of the calculations for the two river basins over different periods are displayed in Table 1. The selected periods are: the full period of record, i.e., two-thirds of a century, viz. 1940–2006; the second half of the twentieth century, which is often used as a benchmark period, viz. 1950–1999; the period of record prior to the Great Midwest Flood, viz. 1940–1992; the period after (and excluding) the Great Flood, which is also the most recent period of persistent change, viz. 1994–2006; the longest period for which simultaneous area-wide and continuous well observations are available, viz. 1965–2000; and the period for which average soil moisture trends in Illinois have been published, viz. 1981–1998.
Table 1. Average Groundwater Storage Trends (in mm a−1), as Derived From Low Flow Measurements in Two Major River Basins in Illinoisa
Asterisks indicate trend values for which the probability of being different from zero is at least 0.95.
Illinois River at Valley City, Ill. (69,227 km2)
Rock River near Joslin, Ill. (24,721 km2)
 It can be seen that groundwater storage has generally been increasing from the early 1940s until the present, at an average rate of around 0.05 to 0.10 mm a−1. These trends are relatively small compared to the annual variations of the storage values; for example, for 1940–2006 the standard deviation of the annual values of S is σ = 4.53 mm for the Illinois River and σ = 3.83 mm for the Rock River. As also illustrated in Figure 2, the trend has not been stationary. Clearly, the anomalously high flow of 1993 had a marked effect on the overall statistics; this was the year of the Great Midwest Flood, and it is a valid question whether any “pure” base flow was even possible during this episode. Nevertheless, as can be seen in Table 1 for 1940–1992, prior to the flood the trend was nearly the same and equally significant as over the entire period of record. Also, if this flow were eliminated from the entire record, the calculated trends would decrease only from 0.058 to 0.036 mm a−1 for the Illinois River and from 0.095 to 0.084 mm a−1 for the Rock River, and both would still remain significant at the 0.05 level.
 Because the trend is nonstationary, it is possible to identify shorter subperiods with considerably different positive or even negative trends. This can be seen for the most recent period, 1994–2006, after (but excluding) the Great Flood, during which storage appears to have been decreasing. Although this recent trend may not be significant, it should serve as a caution against drawing categorical conclusions about a positive or negative trend in the context of global change on the basis of 50-year records. The last column of Table 1 shows the groundwater storage trends for 1981–1998. For this period in Illinois, Robock et al. published an average trend in summer soil moisture of 1.9 mm a−1. This is much larger than the corresponding values in Table 1 calculated here from the base flows. Without further analysis it is not immediately clear why soil moisture content would have increased by more than an order of magnitude more than the groundwater storage. To bring both sets of values to a comparable magnitude [see (10)] would require values of the drainage timescale K of the order of 500 days or even more, which are not realistic. This discrepancy, if indeed real, will need further study.
3.4. Comparison and Validation With Well Observations
 For each of the well stations the trends of the lowest observed warm season water table levels dη/dt were calculated over two different periods; these are the period of record of each well, and the longest period for which simultaneous area-wide and continuous observations are available at all of the wells, viz. 1965–2000. The results are presented in Table 2. The same exercise was also carried out for the trends of the average warm season water table levels; however, these results were generally similar to those of Table 2, albeit about 1.5 times larger on average, and are therefore not presented here. For each of these lowest water table level trends dη/dt, the drainable porosity ne was calculated by means of the corresponding groundwater storage trends dS/dt of the river basin, as follows
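Since S = neη, the drainable porosity presumably follows as (reconstructed from the definitions in the text):

```latex
n_e = \frac{dS/dt}{d\eta/dt} = \frac{K\,\left(dy_{L7}/dt\right)}{d\eta/dt} \qquad\qquad (11)
```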
The results of the calculations with (11) are given in brackets beside the values of the water table trends in Table 2.
Table 2. Trends of the Annual Lowest Water Table Levels (in mm a−1) Measured at the Indicated Locations in Illinois for Different Periods Over the Past Fifty Yearsa
(Individual Well Record Period)
The concomitant drainable porosities ne obtained by comparison with the groundwater storage trends for the same periods are shown in brackets. Asterisks indicate trend values for which the probability of being different from zero is at least 0.95.
Illinois River Basin
6.581 (0.00937) (1956–2006)
−0.3425 (NA) (1965–2000)
Crystal Lake, 41: 9.164* (0.01154) (1951–2000)
−11.14 (NA) (1965–2006)
−3.571 (NA) (1958–2006)
SWS No.2, 181: 40.68* (0.00283) (1952–2001)
Rock River Basin
−0.3425 (NA) (1965–2000)
Crystal Lake, 41: 9.164* (0.01509) (1951–2000)
7.829 (0.01187) (1964–2006)
Mt. Morris, 31: 49.434 (0.00335) (1961–2000)
 One striking feature of Table 2 is that some of the trends are negative and thus inconsistent with the positive trends of the average groundwater storage derived from the river base flows for the same periods. For negative trends (11) is not applicable, and for these cases the drainable porosity is indicated as NA. Because the well locations appear to have no special distribution pattern in the two basins, the averages listed in Table 2 were taken simply without weighting. It can be seen that the average trends dη/dt for both basins are positive, and that the value of the Illinois River basin, 0.90, is smaller than that of the Rock River basin, 10.42; these two features are consistent with the groundwater storage trends dS/dt shown in Table 1 for 1965–2000. Nevertheless, the trends dη/dt and the drainable porosities ne shown in Table 2 vary widely, and it is not easy to discern a pattern.
 As a better illustration of the overall relationship between aquifer storage and base flow, Figure 3 shows the time series of the combined flows yL7 of the two basins, and of the average water table heights at the nine wells. The two series appear to follow similar changes, and the correlation of 0.66 between them is probably as high as can be expected for this type of large-scale field data. With the values of the trends shown in Figure 3 and with an average drainage timescale of K = 43.6 d, weighted by the two drainage areas, one obtains an average groundwater storage trend dS/dt = 0.08352 mm a−1 with (10), and an overall average drainable porosity ne = 0.01318 with (11). As an aside, it was mentioned that the same exercise was also carried out with the average (instead of the lowest) annual observed water table levels; in this case, the overall trend of the nine well water levels was d〈η〉/dt = 9.309 mm a−1, which yields a drainable porosity ne = 0.00897, of roughly the same order, albeit a little smaller.
 The order of magnitude of the drainable porosity, namely 0.01, is certainly reasonable and provides a sound measure validating the use of streamflow records to estimate groundwater storage trends, as described herein. For instance, the obtained values of ne are very similar in magnitude to those obtained with an entirely different methodology in 21 catchments in the southern Great Plains by Brutsaert and Lopez [1998, Figure 9]. The present values are, however, somewhat smaller than the values proposed by Johnson for similar types of aquifer materials (fine silt, silt loam, or sandy clay) and often used in practice for want of better alternatives; but those values were derived largely from column studies in the laboratory and local pumping tests, rather than from large-scale catchment observations. This confirms what has been known or suspected for some time, namely that the values listed by Johnson may indeed be on the large side, particularly at larger spatial scales. In any event, comparison of (10) and (11) indicates that larger values of ne can be obtained only as a result of smaller water table trends dη/dt or of larger drainage timescales K. The former is a possibility, but it would require that the current observation well network in Illinois not be representative of average conditions; the second possibility is unlikely, as most other methods in the literature to estimate K tend to produce smaller values than the method used herein. A possible reason for the difference between the drainable porosities ne derived from streamflow data and values obtained locally or in the laboratory could be, as surmised by Brutsaert and Nieber [1977], that under drought flow conditions with lower water tables, the “effective” catchment area contributing to the streamflow is perhaps smaller than the total catchment area A. However, this is not easy to substantiate or to quantify. A similar argument was developed by Eltahir and Yeh [1999, Figure 17], who used it to explain the nonlinearity of the groundwater rating curve at higher flow rates, well above those considered here.
 The Illinois water table observations have been used in some other studies. However, unlike the present analysis, these were more concerned with shorter-term and seasonal variations. Eltahir and Yeh [1999] made a thorough analysis of the water budget of Illinois for the 12-year period 1983–1994; as one aspect of this, for dry conditions, when presumably base flows prevailed, they established a linear relationship between monthly river flow and average groundwater level, namely (in the present notation), in the absence of recharge, y = 0.006 η. Since S = neη, comparison with (2) shows that
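Namely, with S = Ky = neη and the regression coefficient of 0.006 (in the monthly units of their analysis), presumably

```latex
0.006\ \mathrm{month}^{-1} = \frac{n_e}{K}, \qquad\text{or}\qquad K = \frac{n_e}{0.006}\ \mathrm{months}
```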
Therefore the result of Eltahir and Yeh [1999], with the presently obtained value of ne of around 0.01, implies a drainage timescale of around 0.01/0.006 = 1.67 months, or 50 d; as shown above, this is well within the range of expected K values, and thus consistent with the results of the present study. It also indicates that if any larger value of the drainable porosity were adopted, such as 0.1 or larger as suggested in the table by Johnson, the resulting K value would become of the order of 500 days or longer, which is outside the range of any previously published values for base flow recessions. In another noteworthy study, Seneviratne et al. [2004] presented a methodology to derive terrestrial water storage variations on the basis of three variables, namely water vapor flux convergence, atmospheric water vapor content, and river runoff; the approach was applied to the Mississippi basin, and the results showed excellent agreement for Illinois, the part of the basin for which soil moisture and water table observations are available. Interestingly, in the present context, they also noted [Seneviratne et al., 2004, p. 2052, Figure 8 and Table 7] that while the calculated seasonal changes in terrestrial storage for Illinois were well correlated with the observed values, they were always slightly smaller. The groundwater portion of the “observed” values was actually not directly observed, but calculated by means of S = neη from the measured water table levels η with a rather large drainable porosity ne = 0.08; a smaller value of ne would probably have yielded even better agreement between their estimates and the observations.
4. Conclusions
 During periods without precipitation the flows observed in a natural stream are supplied largely by drainage from the unconfined aquifers along the banks of the upstream river system; the outflow rate from this type of riparian aquifer is directly controlled by the level of groundwater stored in it. Therefore long term streamflow records in a natural river system can be used to derive reliable quantitative information on the trends in groundwater storage over the same period.
 The proposed method was tested with available data from the Illinois River and Rock River basins in Illinois, one of the few regions where long-term groundwater level measurements have also been recorded. Base flow analysis confirmed that the characteristic (or e-folding) drainage timescale for large basins of this type is of the order of K = 45 ± 15 days. With this order of magnitude, the groundwater storage in Illinois over the past two-thirds of a century was estimated to have increased significantly, by between 0.5 and 1 mm per decade on average. Not unexpectedly, this trend was not stationary; for instance, over the most recent decade or so, since the mid-1990s, groundwater storage appears to have been decreasing. This is a reminder that even a fifty-year record may not be sufficient to determine the plausibility or robustness of an observed trend.
 The groundwater storage trends derived from the base flow record conform with the average trends measured in an observation well network over the same area. This good agreement at the basin scale provides validation of the method proposed herein. Comparison between these two different sets of trends indicates that the drainable porosity ne in the low flow drainage regime is of the order of 0.01. This is somewhat smaller than the “classical” values based on column studies and local pumping tests for the same types of aquifer materials, but it is consistent with more recent findings for larger catchments. It is also notable that the trends derived from some of the individual well records run counter to the basin-scale trends derived from the streamflow records; agreement between the two sets of trends could be achieved only after averaging the well records in both basins. Thus measurements in a single well are not always a good indicator of regional conditions.
 Part of this work was carried out while on leave at the University of Tsukuba with support from the Japan Society for the Promotion of Science; helpful discussions with Michiaki Sugita are gratefully acknowledged.