Many analyses of human populations have found that age-specific mortality rates increase faster across most of adulthood when overall mortality levels decline. This contradicts the relationship often expected from Williams’ classic hypothesis about the effects of natural selection on the evolution of senescence. More likely, much of the within-species variation in actuarial aging is due not to variation in senescence but to the strength of filters on the heterogeneity of frailty among older survivors. A challenge to this differential frailty hypothesis was recently posed by an analysis of life tables from historical European populations and traditional societies that reported variation in actuarial aging consistent with Williams’ hypothesis after all. To investigate this challenge, we reconsidered those cases and the aging measures used. Here we show that the discrepancy depends on Ricklefs’ aging rate measure, ω, which decreases as mortality levels drop because it is an index of the mortality level itself, not of the rate of increase in mortality with age. We also show a previously unappreciated correspondence between the parameters of the Gompertz–Makeham and Weibull survival models. Finally, we compare the relationships among the mortality parameters of the traditional societies and the historical series, providing further suggestive evidence that differential heterogeneity in frailty has strong effects on actuarial aging.
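A minimal numerical sketch can illustrate the point about ω. Assuming the Weibull aging model m(x) = m0 + αx^β and Ricklefs’ definition ω = α^(1/(β+1)) (an assumption for illustration; the abstract does not give the formulas), scaling the age-dependent mortality term by a constant leaves the age pattern (β) unchanged yet changes ω, showing that ω tracks mortality level rather than the shape of the rise in mortality with age:

```python
# Illustrative sketch, not the paper's analysis: under the Weibull aging
# model m(x) = m0 + alpha * x**beta, Ricklefs' aging-rate measure is
# assumed to be omega = alpha**(1/(beta + 1)).  Multiplying alpha by a
# constant c raises the overall mortality level but leaves the age
# pattern (beta) untouched, yet omega rises by c**(1/(beta + 1)).

def ricklefs_omega(alpha: float, beta: float) -> float:
    """Ricklefs' omega for Weibull scale alpha and shape beta (assumed form)."""
    return alpha ** (1.0 / (beta + 1.0))

# Hypothetical parameter values chosen for illustration only.
beta = 3.0          # identical age pattern in both populations
alpha_low = 2e-9    # lower-mortality population
alpha_high = 8e-9   # same shape, 4x the age-dependent mortality level

omega_low = ricklefs_omega(alpha_low, beta)
omega_high = ricklefs_omega(alpha_high, beta)

# omega differs by exactly 4**(1/(beta + 1)) = 4**0.25 ~ 1.414,
# even though beta -- the rate of increase with age -- is identical.
print(omega_high / omega_low)
```

Under these assumed forms, two populations with the same proportional age pattern but different mortality levels receive different ω values, which is the sense in which ω indexes mortality level rather than aging rate.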