Distributions of the Kullback–Leibler divergence with applications

Authors

  • Dmitry I. Belov,

    Corresponding author
    1. Law School Admission Council, Newtown, Pennsylvania, USA
      Correspondence should be addressed to Dr Dmitry I. Belov, Law School Admission Council, 662 Penn Street, Newtown, PA 18940, USA (e-mail: dbelov@lsac.org).
  • Ronald D. Armstrong

    1. Rutgers University, New Brunswick, New Jersey, USA

Abstract

The Kullback–Leibler divergence (KLD) is a widely used measure of the discrepancy between two distributions. In general, the distribution of the KLD statistic itself is unknown. Under assumptions common in psychometrics, the KLD is shown to be asymptotically distributed as a scaled (non-central) chi-square with one degree of freedom or as a scaled (doubly non-central) F. Applications of the KLD to detecting heterogeneous response data are discussed, with particular emphasis on test security.
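The quantity at the center of the abstract can be illustrated with a minimal sketch (this is an illustration of the standard discrete KLD formula, not the authors' method): for discrete distributions p and q over the same support, D(p‖q) = Σ pᵢ log(pᵢ/qᵢ).

```python
import math

def kld(p, q):
    """Discrete Kullback-Leibler divergence D(p || q).

    p, q: sequences of probabilities over the same support.
    Terms with p_i == 0 contribute 0 by the usual convention.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# D(p || q) is zero iff p == q, non-negative, and asymmetric:
p = [0.5, 0.5]
q = [0.9, 0.1]
print(kld(p, p))  # 0.0
print(kld(p, q))  # 0.5 * ln(25/9) ≈ 0.5108
print(kld(q, p))  # differs from kld(p, q) — KLD is not a metric
```

The asymmetry shown in the last two lines is why KLD is a "divergence" rather than a distance; which direction is appropriate depends on the application (e.g., comparing an observed response distribution against a reference model).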
