Standard Article

Hierarchical Model

Statistical Theory and Methods

  1. Dr Bradley P. Carlin

Published Online: 15 SEP 2006

DOI: 10.1002/9780470057339.vah010

Encyclopedia of Environmetrics

How to Cite

Carlin, B. P. 2006. Hierarchical Model. Encyclopedia of Environmetrics. 2.

Author Information

  1. University of Minnesota, MN, USA

Publication History

  1. Published Online: 15 SEP 2006

This is not the most recent version of the article; a revised version was published 15 JAN 2013.

A hierarchical model refers to a statistical model specified in stages, often with the purpose of borrowing strength across the various experimental units. For example, suppose our dataset consists of repeated measurements $Y_{ijkl}$ of the toxicity detected in measurement $l$ taken at dump site $k$ in census block group $j$ within census tract $i$. It might be natural to assume that all the observations from a particular dump site $ijk$ are rather similar, so we might assume $Y_{ijkl} \sim N(\theta_{ijk}, \sigma^2)$, where $N$ denotes the normal distribution. But we might further expect that all the site-specific mean levels $\theta_{ijk}$ within a given block group $ij$ are also similar, and thus postulate a second-stage distributional assumption, namely $\theta_{ijk} \sim N(\mu_{ij}, \tau^2)$. Natural third and fourth stages to our model might be $\mu_{ij} \sim N(\nu_i, \kappa^2)$ and $\nu_i \sim N(\phi, \psi^2)$, respectively. This 'deletion of one subscript' as we progress up the stages of the hierarchy is typical, and particularly appropriate in many environmental applications.
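The staged specification above can be made concrete by simulating from it. The following sketch draws from a four-stage normal hierarchy; all dimensions and hyperparameter values are purely illustrative choices, not taken from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical hyperparameters (illustrative values only).
phi, psi = 5.0, 1.0    # fourth stage: tract means nu_i ~ N(phi, psi^2)
kappa = 0.8            # third stage:  block-group means mu_ij ~ N(nu_i, kappa^2)
tau = 0.5              # second stage: site means theta_ijk ~ N(mu_ij, tau^2)
sigma = 0.3            # first stage:  measurements Y_ijkl ~ N(theta_ijk, sigma^2)

I, J, K, L = 3, 4, 2, 5  # tracts, block groups per tract, sites, replicates

nu = rng.normal(phi, psi, size=I)                        # one mean per tract
mu = rng.normal(nu[:, None], kappa, size=(I, J))         # per block group
theta = rng.normal(mu[:, :, None], tau, size=(I, J, K))  # per dump site
Y = rng.normal(theta[..., None], sigma, size=(I, J, K, L))  # measurements

print(Y.shape)  # (3, 4, 2, 5)
```

Note how one array axis (one subscript) drops out at each stage as we move up the hierarchy, mirroring the 'deletion of one subscript' pattern.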

Estimation and inference for hierarchical models can be performed from either a frequentist or a Bayesian point of view. To clarify the distinction, consider a simple two-stage hierarchy. Here we would have a distributional model f(y|θ) for the observed data y = (y1, …, yn) given a vector of unknown parameters θ = (θ1, …, θk). Denoting the second stage of the model by π(θ|η), the frequentist would then base inference for η on its likelihood function, which is simply the marginal distribution of the data:

$$m(y \mid \eta) = \int f(y \mid \theta)\,\pi(\theta \mid \eta)\,d\theta \qquad (1)$$

The Bayesian, on the other hand, would specify a third-stage distribution h(η|λ) for η, and subsequently base inference on its posterior distribution, obtained by Bayes' rule as

$$p(\eta \mid y, \lambda) = \frac{\left[\int f(y \mid \theta)\,\pi(\theta \mid \eta)\,d\theta\right] h(\eta \mid \lambda)}{\iint f(y \mid \theta)\,\pi(\theta \mid \eta)\,h(\eta \mid \lambda)\,d\theta\,d\eta} \qquad (2)$$

(see Bayesian Methods and Modeling). Either approach obviously depends on an ability to evaluate complex integrals. In the special case where $f(y \mid \theta) = \prod_{i=1}^k f_i(y_i \mid \theta_i)$ and $\pi(\theta \mid \eta) = \prod_{i=1}^k \pi_i(\theta_i \mid \eta)$ (the so-called compound sampling or conditionally independent hierarchical model; see [5]), it is easy to show that $m(y \mid \eta) = \prod_{i=1}^k m_i(y_i \mid \eta)$, where $m_i(y_i \mid \eta) = \int f_i(y_i \mid \theta_i)\,\pi_i(\theta_i \mid \eta)\,d\theta_i$. This reduces the $k$-dimensional integration in (1) to a product of $k$ one-dimensional integrals. If in addition $f$ and $\pi$ are chosen as a conjugate pair of distributions (i.e. if $\pi(\theta_i \mid y_i, \eta)$ belongs to the same distributional family as $\pi$), then $m_i(y_i \mid \eta)$ will be available in closed form, avoiding the integration in (1) altogether.
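The conjugacy shortcut can be checked numerically in a simple case. Under the normal-normal pair, with first stage $N(\theta_i, \sigma^2)$ and second stage $\theta_i \sim N(\mu, \tau^2)$ where $\eta = (\mu, \tau)$, the marginal $m_i(y_i \mid \eta)$ is the $N(\mu, \sigma^2 + \tau^2)$ density, a standard result. The sketch below (hypothetical parameter values) compares that closed form against direct one-dimensional quadrature of the integrand.

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

sigma = 0.6          # first-stage sd (hypothetical)
mu, tau = 2.0, 1.5   # second-stage hyperparameters eta = (mu, tau), hypothetical
y_i = 3.1            # a single observed value

# Closed-form marginal under normal-normal conjugacy: N(mu, sigma^2 + tau^2).
closed = stats.norm.pdf(y_i, loc=mu, scale=np.hypot(sigma, tau))

# Direct numerical integration of f_i(y_i|theta_i) * pi_i(theta_i|eta).
def integrand(th):
    return stats.norm.pdf(y_i, th, sigma) * stats.norm.pdf(th, mu, tau)

numeric, _ = quad(integrand, -np.inf, np.inf)

print(closed, numeric)  # the two agree to quadrature accuracy
```

In the conjugate case the quadrature step is redundant, which is exactly the computational saving the text describes.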

While such model simplifications were once critical to fitting hierarchical models, modern computational techniques have ameliorated this need to a great extent. For instance, likelihoods of the form (1) may be maximized using the EM algorithm [2] and its variants. Posterior distributions of the form (2) may be estimated using modern Markov chain Monte Carlo (MCMC) techniques, such as the Gibbs sampler [3] and the Metropolis–Hastings algorithm [4]. MCMC methods are especially useful in hierarchical Bayesian modeling since they remain powerful even for models having a large number of stages. For example, if λ were unknown in (2) above, then we would simply add another stage to the model with corresponding distribution g(λ|μ), and extend our MCMC algorithm accordingly. Carlin and Louis [1] provide an overview of computational approaches to hierarchical modeling, as well as a review of several modern software packages for fitting hierarchical models. These include SAS Proc MIXED and the BUGS package (http://www.mrcbsu.cam.ac.uk/bugs/); see also the packages described in [6] and [7].
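As a minimal illustration of the MCMC approach, the following Gibbs sampler fits a two-stage compound sampling model $y_{ij} \sim N(\theta_i, \sigma^2)$, $\theta_i \sim N(\eta, \tau^2)$, assuming for simplicity that $\sigma$ and $\tau$ are known and $\eta$ has a flat prior; the data and all parameter values are simulated, not from the article. Both full conditionals are normal, so the sampler alternates two conjugate draws.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated data: k units with n replicates each (illustrative values).
k, n = 8, 10
sigma, tau = 1.0, 0.7                 # variances assumed known in this sketch
true_theta = rng.normal(3.0, tau, size=k)
y = rng.normal(true_theta[:, None], sigma, size=(k, n))
ybar = y.mean(axis=1)

eta = 0.0
draws = []
for t in range(2000):
    # theta_i | y, eta: precision-weighted normal update (conjugacy)
    prec = n / sigma**2 + 1 / tau**2
    mean = (n * ybar / sigma**2 + eta / tau**2) / prec
    theta = rng.normal(mean, 1 / np.sqrt(prec))
    # eta | theta: normal full conditional under a flat prior on eta
    eta = rng.normal(theta.mean(), tau / np.sqrt(k))
    if t >= 500:                      # discard burn-in
        draws.append(theta)

post_mean = np.mean(draws, axis=0)
# Borrowing strength: posterior means sit closer to the grand mean than
# the raw unit means do (shrinkage).
print(np.abs(post_mean - ybar.mean()).mean(),
      np.abs(ybar - ybar.mean()).mean())
```

Extending the hierarchy, e.g. placing a prior on $\tau$, would simply add one more full-conditional draw per iteration, which is the robustness to extra stages noted above.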

Acknowledgments

This work was supported in part by National Institute of Allergy and Infectious Diseases (NIAID) Grant R01-AI41966 and by National Institute of Environmental Health Sciences (NIEHS) Grant 1-R01-ES07750.

References

  1. Carlin, B.P. & Louis, T.A. (2000). Bayes and Empirical Bayes Methods for Data Analysis, 2nd Edition, Chapman & Hall/CRC Press, Boca Raton.
  2. Dempster, A.P., Laird, N.M. & Rubin, D.B. (1977). Maximum likelihood estimation from incomplete data via the EM algorithm (with discussion), Journal of the Royal Statistical Society, Series B 39, 1–38.
  3. Gelfand, A.E. & Smith, A.F.M. (1990). Sampling-based approaches to calculating marginal densities, Journal of the American Statistical Association 85, 398–409.
  4. Hastings, W.K. (1970). Monte Carlo sampling methods using Markov chains and their applications, Biometrika 57, 97–109.
  5. Kass, R.E. & Steffey, D. (1989). Approximate Bayesian inference in conditionally independent hierarchical models (parametric empirical Bayes models), Journal of the American Statistical Association 84, 717–726.
  6. Kreft, I.G.G., de Leeuw, J. & van der Leeden, R. (1994). Review of five multilevel analysis programs: BMDP-5V, GENMOD, HLM, ML3, and VARCL, The American Statistician 48, 324–335.
  7. Zhou, X.-H., Perkins, A.J. & Hui, S.L. (1999). Comparisons of software packages for generalized linear multilevel models, The American Statistician 53, 282–290.