A series of numerical experiments was carried out to test the hypothesis that temporal variability of rainfall intensity during a storm can cause the commonly observed decrease in runoff coefficients with increasing slope length. The results demonstrate significant effects even over relatively short slope lengths. Sensitivity analyses show that the scale dependency of measured runoff coefficients is most sensitive to the infiltration parameters of the slope; it is also sensitive to the slope angle and the friction factor of the surface, because these parameters control the depth of overland flow. These results suggest that the combination of time-varying rainfall intensity during an event and run-on infiltration provides an alternative to spatial variation in infiltration as an explanation for the scale dependency of runoff coefficients observed in the field. Overland-flow models that simply use the mean rainfall intensity are also shown to underpredict runoff quite dramatically. The results imply that a better understanding of the temporal variability of rainfall intensity is important both for interpreting field measurements and for developing robust models of overland flow.
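The underprediction by mean-intensity models can be illustrated with a minimal sketch (not the paper's model): under Hortonian infiltration-excess runoff, excess is generated only when intensity exceeds the infiltration capacity, so averaging the rainfall before thresholding loses the runoff produced by high-intensity bursts. The capacity value and the rainfall series below are purely illustrative assumptions.

```python
# Minimal sketch (assumed values, not the paper's experiments): compare runoff
# from a time-varying rainfall series against runoff computed from its mean.
f = 10.0                        # assumed infiltration capacity, mm/h
rain = [30.0, 5.0, 25.0, 0.0]   # assumed time-varying intensities, mm/h

mean_rain = sum(rain) / len(rain)   # 15 mm/h

# Hortonian excess: runoff occurs only while intensity exceeds capacity.
runoff_varying = sum(max(r - f, 0.0) for r in rain) / len(rain)
runoff_mean = max(mean_rain - f, 0.0)

print(runoff_varying)   # 8.75 mm/h with the full time series
print(runoff_mean)      # 5.0 mm/h using only the mean intensity
```

Because the excess function max(r − f, 0) is convex in r, the time-resolved runoff is never less than the mean-intensity estimate (Jensen's inequality), consistent with the direction of bias reported here.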