Standard Article

Interrater Agreement

  1. Mousumi Banerjee

Published Online: 15 AUG 2006

DOI: 10.1002/0471667196.ess3164.pub2

Encyclopedia of Statistical Sciences

How to Cite

Banerjee, M. 2006. Interrater Agreement. Encyclopedia of Statistical Sciences.

Author Information

  1. University of Michigan, Ann Arbor, MI

Abstract

Analysis of interrater or interobserver agreement data provides a useful means of assessing the reliability of a rating system. This article presents various interrater agreement measures, as well as models for studying agreement, when the relevant data comprise either continuous or categorical ratings from multiple raters. Descriptions and characterizations of the underlying models are presented in detail, along with methods for estimation and confidence interval construction. The scenarios and designs that underlie the development of the agreement measures are described, and the interrelationships between these measures are discussed.
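As a concrete illustration of one such measure for categorical ratings, the following is a minimal sketch of Cohen's kappa for two raters, computed from first principles as κ = (p_o − p_e)/(1 − p_e), where p_o is the observed proportion of agreement and p_e is the agreement expected by chance from the raters' marginal frequencies (the data and function name here are illustrative, not from the article):

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two raters assigning categorical labels to the same items."""
    n = len(rater1)
    # Observed agreement: fraction of items on which the two raters agree.
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Chance-expected agreement under independence, from the marginal frequencies.
    m1, m2 = Counter(rater1), Counter(rater2)
    p_e = sum(m1[c] * m2[c] for c in m1) / n**2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two raters classify eight items as "yes" or "no".
r1 = ["yes", "yes", "no", "yes", "no", "no", "yes", "no"]
r2 = ["yes", "no", "no", "yes", "no", "yes", "yes", "no"]
print(cohens_kappa(r1, r2))  # → 0.5
```

Here the raters agree on 6 of 8 items (p_o = 0.75), while chance alone would yield p_e = 0.5 given the balanced marginals, so the chance-corrected agreement is κ = 0.5.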

Keywords:

  • Cohen's kappa;
  • intraclass correlation coefficient;
  • tetrachoric correlation coefficient;
  • concordance correlation coefficient;
  • log-linear models