Keywords:

  • asymptotic expansion;
  • bias correction;
  • bootstrap iteration;
  • cross-validation criterion;
  • EIC;
  • GIC;
  • leave-k-out cross-validation;
  • model selection

Abstract.  The cross-validation (CV) criterion is known to be a second-order unbiased estimator of the risk function measuring the discrepancy between a candidate model and the true model, as are the generalized information criterion (GIC) and the extended information criterion (EIC). In the present article, we show that a 2kth-order unbiased estimator can be obtained as a linear combination of the leave-one-out through leave-k-out CV criteria. The proposed scheme is unique in that it attains a smaller bias than the jackknife method without any analytic calculation; that is, it is not necessary to derive the explicit form of several terms in an asymptotic expansion of the bias. Furthermore, the proposed criterion can be regarded as a finite correction of a bias-corrected CV criterion that uses the scalar coefficients of a bias-corrected EIC obtained by bootstrap iteration.
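The scheme described above combines the leave-one-out through leave-k-out CV criteria linearly. A minimal sketch of that structure, for ordinary least squares with squared prediction error, is given below; the combination weights `coeffs` are illustrative placeholders and are NOT the coefficients derived in the article, and the helper names `leave_m_out_cv` and `combined_cv` are invented for this example.

```python
import itertools
import numpy as np

def leave_m_out_cv(X, y, m):
    """Leave-m-out CV criterion: average squared prediction error over all
    size-m held-out subsets, fitting OLS on the remaining observations."""
    n = len(y)
    errs = []
    for held in itertools.combinations(range(n), m):
        held = list(held)
        train = np.setdiff1d(np.arange(n), held)
        beta, *_ = np.linalg.lstsq(X[train], y[train], rcond=None)
        errs.append(np.mean((y[held] - X[held] @ beta) ** 2))
    return float(np.mean(errs))

def combined_cv(X, y, k, coeffs):
    """Linear combination sum_{m=1}^{k} coeffs[m-1] * CV_m of the
    leave-1-out through leave-k-out criteria (weights are placeholders,
    not the ones derived in the article)."""
    return sum(a * leave_m_out_cv(X, y, m)
               for a, m in zip(coeffs, range(1, k + 1)))

# Small synthetic regression example (n kept small since leave-m-out
# enumerates all C(n, m) held-out subsets).
rng = np.random.default_rng(0)
n, k = 12, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = X @ np.array([1.0, 0.5, -0.3]) + rng.normal(scale=0.5, size=n)
coeffs = [2.0, -1.0]  # placeholder weights, chosen only so they sum to 1
value = combined_cv(X, y, k, coeffs)
print(value)
```

Note that the leave-m-out criterion averages over all C(n, m) subsets, so its exact computation is only feasible for small n or small m; the point of the sketch is the linear-combination structure, not an efficient implementation.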