Divergences without probability vectors and their applications
Part of the work of the second author was done while visiting the Department of Mathematics and Statistics of the University of Cyprus.
Copyright © 2009 John Wiley & Sons, Ltd.
Applied Stochastic Models in Business and Industry
Volume 26, Issue 4, pages 448–472, July/August 2010
How to Cite
Sachlas, A. and Papaioannou, T. (2010), Divergences without probability vectors and their applications. Appl. Stochastic Models Bus. Ind., 26: 448–472. doi: 10.1002/asmb.803
- Issue published online: 27 AUG 2010
- Article first published online: 15 SEP 2009
- Manuscript Accepted: 25 JUL 2009
- Manuscript Revised: 24 JUL 2009
- Manuscript Received: 8 FEB 2008
Keywords:
- Kullback–Leibler divergence;
- Cressie–Read divergence;
- divergence with nonprobability vectors;
- graduation of mortality rates
In general, divergences and measures of information are defined for probability vectors. In some cases, however, divergences are ‘informally’ used to measure the discrepancy between vectors that are not necessarily probability vectors. In this paper we examine whether divergences with nonprobability vectors in their arguments share the properties of probabilistic or information-theoretic divergences. The results indicate that, under certain conditions, such divergences do share some of these properties and can therefore be regarded and used as information measures. We then apply these divergences to the actuarial problem of graduating mortality rates.