References

  • Abramowitz, M. & Stegun, I.A. (1964). Handbook of Mathematical Functions. New York: Dover.
  • Akaike, H. (1973). Information theory and an extension of the maximum likelihood principle. In Proceedings of the 2nd International Symposium on Information Theory, eds. B.N. Petrov & F. Csaki, pp. 267–281. Budapest: Akadémiai Kiadó.
  • Armagan, A., Dunson, D. & Clyde, M. (2011). Generalized beta mixtures of Gaussians. Adv. Neural Inf. Process. Syst. 24, 523–531.
  • Beal, M.J. (2003). Variational algorithms for approximate Bayesian inference. Unpublished doctoral thesis, University College London, Gatsby Computational Neuroscience Unit.
  • Bishop, C.M. (2006). Pattern Recognition and Machine Learning. New York: Springer.
  • Claeskens, G. & Hjort, N.L. (2008). Model Selection and Model Averaging. Cambridge Series in Statistical and Probabilistic Mathematics. Cambridge: Cambridge University Press.
  • Drugowitsch, J. (2008). Bayesian linear regression. Technical report, University of Rochester, Rochester, NY.
  • Erdélyi, A., Magnus, W., Oberhettinger, F. & Tricomi, F.G. (1981). Higher Transcendental Functions, Vol. III. Melbourne, FL: Robert E. Krieger Publishing Co. Inc. Based on notes left by Harry Bateman, Reprint of the 1955 original.
  • Gelman, A., Hwang, J. & Vehtari, A. (2013). Understanding predictive information criteria for Bayesian models. Statist. Comput. (arXiv preprint).
  • Hall, P., Ormerod, J.T. & Wand, M.P. (2011). Theory of Gaussian variational approximation for a Poisson mixed model. Statist. Sinica 21, 369–389.
  • Hall, P., Pham, T., Wand, M.P. & Wang, S.S.J. (2011). Asymptotic normality and valid inference for Gaussian variational approximation. Ann. Statist. 39, 2502–2532.
  • Humphreys, K. & Titterington, D.M. (2000). Approximate Bayesian inference for simple mixtures. In Proceedings of Computational Statistics, eds. J.G. Bethlehem & P.G.M. van der Heijden, pp. 2502–2532. Heidelberg: Physica.
  • MacKay, D.J.C. (1995). Ensemble learning and evidence maximization. Technical report, Cavendish Laboratory, University of Cambridge, Cambridge.
  • MacKay, D.J.C. (2003). Information Theory, Inference and Learning Algorithms. New York: Cambridge University Press.
  • McGrory, C.A. & Titterington, D.M. (2007). Variational approximations in Bayesian model selection for finite mixture distributions. Comput. Statist. Data Anal. 51, 5352–5367.
  • Mitchell, T.J. & Beauchamp, J.J. (1988). Bayesian variable selection in linear regression. J. Amer. Statist. Assoc. 83, 1023–1036.
  • Murphy, K.P. (2012). Machine Learning: A Probabilistic Perspective. London: The MIT Press.
  • Neville, S.E., Ormerod, J.T. & Wand, M.P. (2013). Mean field variational Bayes for continuous sparse signal shrinkage: pitfalls and remedies. Preprint.
  • Ormerod, J.T. & Wand, M.P. (2010). Explaining variational approximations. Amer. Statist. 64, 140–153.
  • Ormerod, J.T. & Wand, M.P. (2012). Gaussian variational approximate inference for generalized linear mixed models. J. Comput. Graph. Statist. 21, 2–17.
  • Pauler, D.K. (1998). The Schwarz criterion and related methods for normal linear models. Biometrika 85, 13–27.
  • Ren, Q., Banerjee, S., Finley, A.O. & Hodges, J.S. (2011). Variational Bayesian methods for spatial data analysis. Comput. Statist. Data Anal. 55, 3197–3217.
  • Robert, C.P. & Marin, J.M. (2007). Bayesian Core: A Practical Approach to Computational Bayesian Statistics. New York: Springer.
  • Rue, H., Martino, S. & Chopin, N. (2009). Approximate Bayesian inference for latent Gaussian models by using integrated nested Laplace approximations. J. R. Stat. Soc. Ser. B Stat. Methodol. 71, 319–392.
  • Schwarz, G. (1978). Estimating the dimension of a model. Ann. Statist. 6, 461–464.
  • Spiegelhalter, D.J., Best, N.G., Carlin, B.P. & van der Linde, A. (2002). Bayesian measures of model complexity and fit (with Discussion). J. R. Stat. Soc. Ser. B Stat. Methodol. 64, 583–639.
  • Teh, Y.W., Kurihara, K. & Welling, M. (2007). Collapsed variational inference for HDP. Adv. Neural Inf. Process. Syst. 20.
  • Teh, Y.W., Newman, D. & Welling, M. (2006). A collapsed variational Bayesian inference algorithm for latent Dirichlet allocation. Adv. Neural Inf. Process. Syst. 19, 1353–1360.
  • Tibshirani, R. (1996). Regression shrinkage and selection via the lasso. J. R. Stat. Soc. Ser. B Stat. Methodol. 58, 267–288.
  • Volant, S., Magniette, M.L. & Robin, S. (2012). Variational Bayes approach for model aggregation in unsupervised classification with Markovian dependency. Comput. Statist. Data Anal. 56, 2375–2387.
  • Wang, B. & Titterington, D.M. (2006). Convergence properties of a general algorithm for calculating variational Bayesian estimates for a normal mixture model. Bayesian Anal. 1, 625–650.
  • Welling, M., Teh, Y.W. & Kappen, H.J. (2008). Hybrid variational/Gibbs collapsed inference in topic models. Proc. Int. Conf. Uncertainty Artif. Intell. 24, 587–591.
  • Yang, Y. (2005). Can the strengths of AIC and BIC be shared? A conflict between model identification and regression estimation. Biometrika 92, 937–950.
  • You, C., Ormerod, J.T. & Müller, S. (2013). A variational Bayes approach to variable selection. Preprint.