Generalized Additive Modeling with Implicit Variable Selection by Likelihood-Based Boosting
Article first published online: 2 JUN 2006
Volume 62, Issue 4, pages 961–971, December 2006
How to Cite
Tutz, G. and Binder, H. (2006), Generalized Additive Modeling with Implicit Variable Selection by Likelihood-Based Boosting. Biometrics, 62: 961–971. doi: 10.1111/j.1541-0420.2006.00578.x
- Issue published online: 2 JUN 2006
- Received December 2004. Revised January 2006. Accepted February 2006.
Keywords:
- Generalized additive models
- Likelihood-based boosting
- Penalized stumps
- Selection of smoothing parameters
- Variable selection
Summary

The use of generalized additive models in statistical data analysis suffers from the restriction to a small number of explanatory variables and from the difficulty of selecting smoothing parameters. Generalized additive model boosting circumvents these problems by stagewise fitting of weak learners. A fitting procedure is derived that works for all simple exponential family distributions, including binomial, Poisson, and normal response variables. The procedure combines the selection of variables with the determination of the appropriate amount of smoothing. Penalized regression splines and the newly introduced penalized stumps are considered as weak learners. Estimates of standard deviations and stopping criteria, which are notorious problems in iterative procedures, are based on an approximate hat matrix. The method is shown to be a strong competitor to common procedures for fitting generalized additive models; in particular, it performs very well in high-dimensional settings with many nuisance predictor variables.
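The stagewise scheme described above — fit a weak learner to every predictor at each step, keep only the one that most improves the fit, and thereby select variables implicitly — can be illustrated with a minimal sketch. This is not the authors' procedure: it uses plain (unpenalized) stumps rather than penalized stumps or penalized regression splines, a logistic (binomial) response, a squared-error gain as a proxy for the likelihood improvement, and a fixed step length `nu`; the function name `boost_stumps` is made up for the example.

```python
import numpy as np

def boost_stumps(X, y, steps=100, nu=0.1):
    """Componentwise stagewise boosting sketch for a binomial response.

    At each step a one-split stump is fit to the working residuals of
    every predictor; only the stump with the largest gain is kept and a
    damped version of it is added to the additive predictor.  Predictors
    that never win a step are never used: implicit variable selection.
    """
    n, p = X.shape
    eta = np.zeros(n)                     # current additive predictor
    ensemble = []                         # (feature, threshold, left, right)
    for _ in range(steps):
        mu = 1.0 / (1.0 + np.exp(-eta))   # fitted probabilities
        resid = y - mu                    # score residuals for logistic loss
        best = None
        for j in range(p):                # componentwise search over predictors
            for t in np.unique(X[:, j]):
                mask = X[:, j] <= t
                if mask.all() or not mask.any():
                    continue
                left = resid[mask].mean()
                right = resid[~mask].mean()
                fit = np.where(mask, left, right)
                # squared-error reduction as a proxy for the likelihood gain
                gain = np.sum(resid * fit)
                if best is None or gain > best[0]:
                    best = (gain, j, t, left, right)
        _, j, t, left, right = best
        eta += nu * np.where(X[:, j] <= t, left, right)
        ensemble.append((j, t, nu * left, nu * right))
    return ensemble, eta
```

Inspecting which feature index appears in `ensemble` shows the selection behavior: with one informative predictor among several pure-noise ones, the informative predictor is chosen in the early steps while the noise variables are largely ignored, mirroring the high-dimensional setting with many nuisance predictors discussed in the summary.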