Keywords:

  • Akaike's Information Criterion (AIC);
  • Akaike-best model;
  • model averaging;
  • model selection;
  • parameter selection;
  • uninformative parameters

Abstract: As use of Akaike's Information Criterion (AIC) for model selection has become increasingly common, so has a mistake involving interpretation of models that are within 2 AIC units (ΔAIC ≤ 2) of the top-supported model. Such models fall within 2 ΔAIC units of the top model because the penalty for one additional parameter is +2 AIC units, but when model deviance is not reduced by enough to overcome that 2-unit penalty, the additional parameter provides no net reduction in AIC. Simply put, the uninformative parameter does not explain enough variation to justify its inclusion in the model, and it should not be interpreted as having any ecological effect. Models with uninformative parameters are frequently presented as being competitive in the Journal of Wildlife Management, including 72% of all AIC-based papers in 2008, and authors and readers need to be more aware of this problem and take appropriate steps to eliminate misinterpretation. I reviewed 5 potential solutions to this problem: 1) report all models but ignore or dismiss those with uninformative parameters, 2) use model averaging to ameliorate the effect of uninformative parameters, 3) use 95% confidence intervals to identify uninformative parameters, 4) perform all-possible subsets regression and use weight-of-evidence approaches to discriminate useful from uninformative parameters, or 5) adopt a methodological approach that allows models containing uninformative parameters to be culled from reported model sets. The first approach is preferable for small sets of a priori models, whereas the last 2 approaches should be used for large model sets or exploratory modeling.
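The ΔAIC arithmetic behind an uninformative parameter can be sketched in a few lines of Python. This is a minimal illustration, not the paper's method: the log-likelihood values and parameter counts below are hypothetical, chosen only to show how an extra parameter that fails to reduce deviance by at least 2 units still lands its model within 2 AIC units of the top model.

```python
def aic(log_likelihood, k):
    """Akaike's Information Criterion: AIC = -2 ln(L) + 2k."""
    return -2.0 * log_likelihood + 2.0 * k

# Hypothetical fits of two nested models (values are illustrative only).
loglik_top, k_top = -100.0, 3    # top-supported model
loglik_full, k_full = -99.5, 4   # same model plus 1 extra parameter

aic_top = aic(loglik_top, k_top)      # -> 206.0
aic_full = aic(loglik_full, k_full)   # -> 207.0
delta_aic = aic_full - aic_top        # -> 1.0

# The extra parameter reduced the deviance (-2 ln L) by only 1.0 units,
# less than its +2 penalty, so it is uninformative -- yet the larger
# model still appears "competitive" at delta-AIC <= 2.
deviance_reduction = -2.0 * loglik_top - (-2.0 * loglik_full)  # -> 1.0
print(delta_aic, deviance_reduction)
```

In this sketch the larger model's ΔAIC of 1.0 would conventionally be read as strong support, even though the added parameter explains almost nothing; that is precisely the misinterpretation the abstract describes.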