Nutrient resorption is an important process during leaf senescence because it helps plants minimize nutrient losses. To quantify nutrient resorption, the parameter resorption efficiency is commonly used. This parameter describes the percentage of the nutrient pool withdrawn before leaf abscission. The nutrient pool is generally expressed on the basis of leaf mass or leaf area, assuming that these bases do not change during senescence. In this paper we first present a mathematical formula describing the effect of a change in measurement basis on the difference between the real resorption efficiency (RRE) and the measured resorption efficiency (MRE). This formula shows that even moderate senescence-related changes in a measurement basis can lead to considerable underestimation of RRE. Second, to estimate the general change in measurement basis, we quantified leaf mass loss and leaf shrinkage during senescence from literature data. These data show that mass loss can be as high as 40% and leaf shrinkage as high as 20%. Changes in basis of this magnitude seriously compromise MRE when left uncorrected. Using our formula and the average literature values for changes in leaf mass (21%) and leaf area (11%) during senescence, we calculated that the average RRE for nitrogen and phosphorus of terrestrial plants is 6% (leaf area) to 10% (leaf mass) higher than the 50% and 52%, respectively, reported by Aerts (1996). This implies that nutrient resorption from senescing leaves is even more important for nutrient retention in terrestrial plants than previously thought. We advocate preselecting leaves and monitoring the measurement basis throughout the experiment to minimize the difference between MRE and RRE.
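The basis-change correction described above can be sketched as follows. The nutrient pool is concentration times basis (mass or area), so if a fraction b of the basis is lost during senescence, the real efficiency follows as RRE = 1 - (1 - MRE)(1 - b). The function name and this algebraic form are our reconstruction from the abstract's definitions, not a verbatim reproduction of the paper's formula:

```python
# Sketch of the relation between measured (MRE) and real (RRE)
# resorption efficiency when the measurement basis changes.
# MRE is computed from nutrient concentrations assuming an unchanged
# basis; basis_loss is the fractional loss of that basis during
# senescence (e.g. 0.21 for 21% mass loss, 0.11 for 11% shrinkage).

def real_resorption_efficiency(mre: float, basis_loss: float) -> float:
    """RRE = 1 - (1 - MRE) * (1 - basis_loss)."""
    return 1.0 - (1.0 - mre) * (1.0 - basis_loss)

# Illustration with the average literature values cited in the abstract,
# applied to the nitrogen MRE of 50% reported by Aerts (1996):
rre_mass = real_resorption_efficiency(0.50, 0.21)  # 21% mass loss
rre_area = real_resorption_efficiency(0.50, 0.11)  # 11% area shrinkage
print(f"mass-based RRE: {rre_mass:.1%}")  # roughly 10 points above 50%
print(f"area-based RRE: {rre_area:.1%}")  # roughly 5-6 points above 50%
```

Because (1 - basis_loss) < 1 whenever the basis shrinks, RRE always exceeds MRE, which is why uncorrected measurements systematically underestimate resorption.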