Traditional analysis of aquifer tests uses the drawdown observed at one well, induced by pumping at another well, to estimate the transmissivity (T) and storage coefficient (S) of an aquifer. The analysis relies on Theis' solution or Jacob's approximate solution, both of which assume aquifer homogeneity. Aquifers, however, are inherently heterogeneous at many scales. If the observation well is screened in a low-permeability zone while the pumping well is located in a high-permeability zone, the situation contradicts the homogeneity assumption of the traditional analysis. What, then, does the traditional interpretation of the aquifer test tell us? We investigate this question using numerical experiments and a first-order correlation analysis. The results suggest that the effective T and S of an equivalent homogeneous aquifer, derived from Gaussian random T and S fields, vary with time, as do the principal directions of the effective T. At large times, the effective T and S converge to the geometric and arithmetic means, respectively. Analysis of the T and S estimated from drawdown at a single observation well shows that both estimates vary with time at early time. The estimated S stabilizes rapidly to a value dominated by the storage-coefficient heterogeneity between the pumping and observation wells. At late time the estimated T approaches, but does not equal, the effective T: it represents an average over the cone of depression, but one influenced by the location, size, and degree of heterogeneity encountered as the cone of depression evolves.
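For context, the Theis and Jacob solutions referenced above can be sketched as follows. This is a minimal illustration, not part of the study itself: the pumping rate, aquifer parameters, and well spacing used below are arbitrary example values, and the well function W(u) is evaluated with its standard convergent series rather than a library special function.

```python
import math

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def well_function(u, tol=1e-12):
    """Theis well function W(u) = E1(u), via the convergent series
    W(u) = -gamma - ln(u) + sum_{n>=1} (-1)^(n+1) u^n / (n * n!)."""
    total = -EULER_GAMMA - math.log(u)
    term = u  # n = 1 term
    n = 1
    while abs(term) > tol:
        total += term
        term *= -u * n / (n + 1) ** 2  # ratio of successive series terms
        n += 1
    return total

def theis_drawdown(Q, T, S, r, t):
    """Theis drawdown s = Q/(4*pi*T) * W(u), with u = r^2 S / (4 T t)."""
    u = r * r * S / (4.0 * T * t)
    return Q / (4.0 * math.pi * T) * well_function(u)

def jacob_drawdown(Q, T, S, r, t):
    """Jacob's (Cooper-Jacob) approximation, valid for small u (u < ~0.01):
    s ~= Q/(4*pi*T) * ln(2.25 T t / (r^2 S))."""
    return Q / (4.0 * math.pi * T) * math.log(2.25 * T * t / (r * r * S))

if __name__ == "__main__":
    # Hypothetical homogeneous-aquifer example (illustrative values only):
    Q = 0.01   # pumping rate, m^3/s
    T = 1e-3   # transmissivity, m^2/s
    S = 1e-4   # storage coefficient, dimensionless
    r = 30.0   # distance between pumping and observation wells, m
    t = 3600.0 # time since pumping began, s
    print("Theis drawdown (m):", theis_drawdown(Q, T, S, r, t))
    print("Jacob drawdown (m):", jacob_drawdown(Q, T, S, r, t))
```

In a heterogeneous aquifer, fitting these homogeneous-medium curves to observed drawdown yields the time-varying "estimated" T and S discussed above, rather than any single intrinsic property of the medium.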