• Bayesian lasso;
  • Laplace prior;
  • Markov chain Monte Carlo methods;
  • Markov random fields;
  • Penalized splines;
  • Ridge regression;
  • Scale mixtures

Summary.  Data structures in modern applications frequently combine the need for flexible regression techniques that handle, for example, non-linear and spatial effects with high-dimensional covariate vectors. Whereas estimation of the former is typically achieved by supplementing the likelihood with a suitable smoothness penalty, the latter are usually assigned shrinkage penalties that enforce sparse models. We consider a unifying Bayesian perspective, in which conditionally Gaussian priors can be assigned to all types of regression effects. Suitable hyperprior assumptions on the variances of the Gaussian distributions then induce the desired smoothness or sparseness properties. As a major advantage, general Markov chain Monte Carlo simulation algorithms can be developed that allow for the joint estimation of smooth and spatial effects and regularized coefficient vectors. Two applications demonstrate the usefulness of the proposed procedure: a geoadditive regression model for data from a rental guide in Munich and an additive probit model for the prediction of consumer credit defaults. In both cases, high-dimensional vectors of categorical covariates will be included in the regression models. The predictive ability of the resulting high-dimensional structured additive regression models compared with expert models is of particular relevance and will be evaluated on cross-validation test data.
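
The key device in the summary — a conditionally Gaussian prior whose variance carries a hyperprior that induces shrinkage — can be illustrated with the Bayesian lasso case listed among the keywords. The following sketch (not code from the paper; the rate parametrization and sample size are illustrative choices) checks numerically the standard scale-mixture identity: if the variance follows an exponential hyperprior and the coefficient is conditionally Gaussian given that variance, the marginal prior on the coefficient is Laplace, which is exactly the shrinkage prior of the lasso.

```python
import numpy as np

# Scale-mixture sketch of the Bayesian lasso prior (illustrative, not the
# paper's implementation): if tau2 ~ Exponential(rate = lam**2 / 2) and
# beta | tau2 ~ N(0, tau2), then marginally beta ~ Laplace(0, 1/lam).
rng = np.random.default_rng(0)
lam = 2.0          # shrinkage parameter (assumed value for the demo)
n = 200_000        # Monte Carlo sample size

# Hyperprior on the variance, then the conditionally Gaussian draw
tau2 = rng.exponential(scale=2.0 / lam**2, size=n)
beta = rng.normal(0.0, np.sqrt(tau2))

# Direct draws from the claimed marginal for comparison
direct = rng.laplace(0.0, 1.0 / lam, size=n)

# Both standard deviations should be close to sqrt(2)/lam
print(np.std(beta), np.std(direct))
```

The same pattern — a Gaussian prior with a structured covariance (e.g. a random-walk or Markov-random-field precision) and hyperpriors on its variance parameters — covers the smooth and spatial effects, which is what makes a single MCMC treatment of all effect types possible.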