Keywords:

  • Cohen's kappa;
  • intraclass;
  • reliability;
  • multilevel;
  • Markov chain Monte Carlo;
  • nested;
  • rater

Kappa-like agreement indexes are often used to assess the agreement among examiners on a categorical scale. Their distinctive feature is that they correct the level of agreement for the effect of chance. In the present paper, we first define two agreement indexes belonging to this family in a hierarchical context. In particular, we consider the cases of a random and of a fixed set of examiners. We then develop a method to evaluate the influence of factors on these indexes. Agreement indexes are directly related to a set of covariates through a hierarchical model. We obtain the posterior distribution of the model parameters in a Bayesian framework. We apply the proposed approach to dental data and compare it with the generalized estimating equations approach. Copyright © 2012 John Wiley & Sons, Ltd.
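To illustrate the chance correction that kappa-like indexes apply, the following is a minimal sketch of the classical Cohen's kappa for two raters (not the hierarchical indexes proposed in the paper): the observed proportion of agreement p_o is compared with the agreement p_e expected by chance from the raters' marginal label frequencies.

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa: chance-corrected agreement between two raters.

    ratings_a, ratings_b: equal-length sequences of categorical labels,
    one entry per rated item.
    """
    n = len(ratings_a)
    if n == 0 or n != len(ratings_b):
        raise ValueError("ratings must be non-empty and of equal length")
    # Observed agreement: proportion of items both raters label identically.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Expected agreement under independence, from marginal label frequencies.
    freq_a = Counter(ratings_a)
    freq_b = Counter(ratings_b)
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    # Kappa = 1 means perfect agreement; 0 means agreement at chance level.
    return (p_o - p_e) / (1 - p_e)
```

With identical ratings the index equals 1, while ratings that agree only as often as chance predicts yield a value near 0; the hierarchical indexes defined in the paper generalize this idea to nested data and multiple examiners.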