Applying models to estimate rainfall partitioning in deciduous forests can be time consuming and laborious, given the need for two different parameter sets to describe the leafed and leafless periods. This paper reports the modelling of rainfall partitioning for a downy oak forest plot (Eastern Pyrenees Mountains, NE Spain) using the sparse Rutter and Gash interception loss models, and assesses their suitability for such studies. Moreover, variability in model sensitivity is evaluated, and an attempt to simplify their application is also presented.
The estimation error for interception loss in the leafed period was −26.3% with the Rutter model and −4.2% with the Gash model applied with a Penman–Monteith-based evaporation rate. Estimates for the leafless period were less accurate with both models, suggesting that modelling the leafless period is more susceptible to error. Nevertheless, with the Gash model, the result remained well below the expected measurement error. Both models proved highly sensitive to changes in canopy cover in all periods tested. The Rutter model was especially sensitive to changes in zero-plane displacement in the leafed period, while the Gash model showed high linear sensitivity to the evaporation rate. In addition, a decrease in rainfall rate affected the estimate of interception loss more than an equivalent increase. Despite its high sensitivity to these parameters, the Gash model yielded a good estimate of rainfall partitioning for the total period when only one parameter set was used, although event-based errors compensated for one another, and some periods were over- or underestimated. Copyright © 2011 John Wiley & Sons, Ltd.
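The sparse Gash model referred to above has a closed-form, per-storm analytical expression, which is part of what makes single-parameter-set application attractive. A minimal sketch of the canopy component is given below; trunk/stemflow terms are omitted for brevity, and all parameter names and values are illustrative, not those fitted in this study.

```python
import math

def gash_sparse_interception(P_g, c, S, E_bar, R_bar):
    """Per-storm canopy interception loss (mm), sparse Gash model.

    P_g   -- gross rainfall for the storm (mm)
    c     -- canopy cover fraction (0-1)
    S     -- canopy storage capacity (mm)
    E_bar -- mean wet-canopy evaporation rate (mm/h)
    R_bar -- mean rainfall rate onto a saturated canopy (mm/h)
    Trunk evaporation and stemflow terms are omitted here.
    """
    S_c = S / c          # storage per unit area of canopy cover (mm)
    E_c = E_bar / c      # evaporation per unit area of canopy cover (mm/h)
    # Gross rainfall needed to saturate the canopy
    P_sat = -(R_bar / E_c) * S_c * math.log(1.0 - E_c / R_bar)
    if P_g < P_sat:
        # Canopy never saturates: all intercepted water evaporates
        return c * P_g
    # Wetting-up loss plus evaporation during rainfall after saturation
    return c * P_sat + c * (E_c / R_bar) * (P_g - P_sat)
```

The linear dependence of the post-saturation term on `E_bar / R_bar` is consistent with the high linear sensitivity to evaporation rate reported above.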