In this article, we develop a modern perspective on Akaike's information criterion and Mallows's Cp for model selection, and propose generalisations to spherically and elliptically symmetric distributions. Despite the differences in their respective motivations, Cp and Akaike's information criterion are equivalent in the special case of Gaussian linear regression. In this case, they are also equivalent to a third criterion, an unbiased estimator of the quadratic prediction loss, derived from loss estimation theory. We then show that the form of this unbiased estimator, derived under a Gaussian assumption, still holds under the more general assumption of spherically symmetric distributions. A key feature of our results is that the criterion does not rely on the specific form of the distribution, but only on its spherical symmetry. A criterion of the same kind can be derived for a family of elliptically contoured distributions, which allows for correlations, when the invariant loss is considered. More specifically, the unbiasedness property is relative to a distribution associated with the original density.
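As an illustrative sketch (not the paper's notation), the Gaussian-case equivalence can be checked numerically: with known variance sigma^2, Mallows's Cp, RSS/sigma^2 - n + 2p, and the unbiased estimator of the quadratic prediction loss, RSS + (2p - n) sigma^2, satisfy loss = sigma^2 * Cp, so they rank candidate models identically. The simulated design, the nested candidate models, and all variable names below are our own assumptions for the demonstration.

```python
import numpy as np

# Hypothetical setup: Gaussian linear regression with known variance,
# nested candidate models obtained by truncating the design matrix.
rng = np.random.default_rng(0)
n, sigma2 = 50, 2.0
X_full = rng.standard_normal((n, 5))
beta = np.array([2.0, -1.0, 0.5, 0.0, 0.0])   # last two predictors inert
y = X_full @ beta + np.sqrt(sigma2) * rng.standard_normal(n)

def rss(X, y):
    """Residual sum of squares of the least-squares fit of y on X."""
    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(np.sum((y - X @ beta_hat) ** 2))

results = []
for p in range(1, 6):                          # candidate model dimensions
    r = rss(X_full[:, :p], y)
    cp = r / sigma2 - n + 2 * p                # Mallows's Cp
    loss_est = r + (2 * p - n) * sigma2        # unbiased prediction-loss estimate
    results.append((p, cp, loss_est))

# loss_est = sigma2 * cp for every model, so both criteria select
# the same dimension.
best_cp = min(results, key=lambda t: t[1])[0]
best_loss = min(results, key=lambda t: t[2])[0]
print(best_cp, best_loss)
```

The affine relation between the two criteria is exact here because sigma^2 is treated as known; the paper's contribution is that the unbiasedness of the loss estimator extends beyond the Gaussian case to the whole spherically symmetric family.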