# 10. Multivariate Regression

Published Online: 27 MAR 2003

DOI: 10.1002/0471271357.ch10

Copyright © 2002 John Wiley & Sons, Inc.


## Methods of Multivariate Analysis, Second Edition


#### How to Cite

Rencher, A. C. (2002) Multivariate Regression, in Methods of Multivariate Analysis, Second Edition, John Wiley & Sons, Inc., New York, NY, USA. doi: 10.1002/0471271357.ch10

#### Publication History

- Published Print: 22 FEB 2002


#### ISBN Information

Print ISBN: 9780471418894

Online ISBN: 9780471271352


### Keywords

- fixed *x*'s
- random *x*'s
- least squares
- overall regression test
- full model
- reduced model
- *F*-tests
- partial *F*-tests
- multiple correlation
- stepwise regression
- Gauss–Markov theorem
- unbiased estimates
- regression sum of squares and products matrix
- error matrix
- eigenvalues
- subset selection

### Summary

In multivariate linear regression, we consider a model containing several *y*'s (dependent variables) and several *x*'s (independent variables). This is an extension of multiple regression, in which a single *y* is regressed on several *x*'s. A substantial review of the multiple regression model is given before proceeding to the multivariate regression model. The review covers the model and its assumptions, the centered form of the model, least squares estimation of the regression coefficients in vector form and in covariance form, estimation of the error variance, hypothesis tests of the regression coefficients, *R*^{2}, and selection of a subset of the *x*'s. Subset selection can be based on all possible subsets (using the three criteria *R*^{2}_{p}, *s*^{2}_{p}, and *C*_{p}) or on stepwise selection (using a partial *F*).
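The least squares machinery reviewed here can be sketched in a few lines of NumPy. The data, dimensions, and variable names below are simulated purely for illustration and are not from the text:

```python
import numpy as np

# Illustrative data: n = 20 observations, q = 3 predictors (invented for this sketch).
rng = np.random.default_rng(0)
n, q = 20, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, q))])  # design matrix with intercept
beta_true = np.array([1.0, 2.0, -1.0, 0.5])
y = X @ beta_true + rng.normal(scale=0.3, size=n)

# Least squares estimate: beta_hat = (X'X)^{-1} X'y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Unbiased estimate of the error variance: s^2 = SSE / (n - q - 1)
residuals = y - X @ beta_hat
s2 = residuals @ residuals / (n - q - 1)

# R^2 = 1 - SSE / SST, with SST computed about the mean of y
sst = np.sum((y - y.mean()) ** 2)
r2 = 1 - residuals @ residuals / sst
```

Solving the normal equations directly, as above, mirrors the textbook formula; in practice `np.linalg.lstsq` is the numerically safer route.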

In the multivariate regression model, several *y*'s and several *x*'s are measured on each experimental unit. The assumptions about the model and about the distribution of the *y*'s are important and should be checked. The matrix of regression coefficients can be estimated by least squares and can also be expressed in covariance form. The estimates have some optimal properties if the assumptions hold.
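Least squares estimation of the coefficient matrix takes the same form as in multiple regression, B̂ = (X′X)⁻¹X′Y, now with one column of B̂ per response variable. A minimal sketch, with all data and dimensions invented:

```python
import numpy as np

# Sketch: p = 2 responses, q = 3 predictors, n = 25 units (all invented).
rng = np.random.default_rng(1)
n, q, p = 25, 3, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, q))])   # (n, q+1) with intercept
B_true = rng.normal(size=(q + 1, p))
Y = X @ B_true + rng.normal(scale=0.2, size=(n, p))          # (n, p) response matrix

# Least squares: B_hat = (X'X)^{-1} X'Y
B_hat = np.linalg.solve(X.T @ X, X.T @ Y)
```

Note that each column of B̂ equals the ordinary least squares fit of the corresponding response on X separately; the multivariate structure matters for the joint tests, not for the point estimates.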

An overall regression test and a test on a subset of the regression coefficients are obtained in terms of Wilks' Λ, Roy's θ, Pillai's *V*^{(s)}, and the Lawley–Hotelling statistic *U*^{(s)}. Several measures of association between the *y*'s and the *x*'s are reviewed.
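All four statistics are functions of the eigenvalues of E⁻¹H, where H is the hypothesis (regression) sum of squares and products matrix and E is the error matrix. A sketch of the computations, using invented positive definite matrices in place of H and E from a real fit:

```python
import numpy as np

# Invented SSCP matrices for illustration (in practice H and E come from the fit).
rng = np.random.default_rng(2)
A = rng.normal(size=(6, 3)); E = A.T @ A          # positive definite "error" matrix
B = rng.normal(size=(4, 3)); H = B.T @ B          # "hypothesis" matrix

# Eigenvalues of E^{-1}H, sorted largest first (real since E is positive definite).
eigvals = np.sort(np.linalg.eigvals(np.linalg.solve(E, H)).real)[::-1]

wilks_lambda = np.prod(1.0 / (1.0 + eigvals))     # |E| / |E + H|
pillai_v = np.sum(eigvals / (1.0 + eigvals))      # tr[(E + H)^{-1} H]
lawley_hotelling_u = np.sum(eigvals)              # tr(E^{-1} H)
roys_theta = eigvals[0] / (1.0 + eigvals[0])      # based on the largest eigenvalue
```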

Subset selection procedures are discussed for both the *y*'s and the *x*'s. These procedures are based on stepwise selection (using a partial Wilks' Λ or the corresponding partial *F*) or on all possible subsets (using three criteria that represent matrix extensions of *R*^{2}_{p}, *s*^{2}_{p}, and *C*_{p}).
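In the multiple regression case, stepwise selection with a partial *F* can be sketched as a forward search: at each step, enter the candidate *x* whose partial *F* (given the variables already in the model) is largest, stopping when no candidate clears the threshold. The data, seed, and *F*-to-enter value below are invented for illustration:

```python
import numpy as np

def sse(X, y):
    """Residual sum of squares for the OLS fit of y on the columns of X."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    r = y - X @ beta
    return r @ r

# Simulated data: y depends on x0 and x2; x1, x3, x4 are noise (invented).
rng = np.random.default_rng(3)
n, q = 40, 5
Xfull = rng.normal(size=(n, q))
y = 2.0 * Xfull[:, 0] - 1.5 * Xfull[:, 2] + rng.normal(scale=0.5, size=n)

f_enter = 4.0                       # illustrative F-to-enter threshold
selected = []
candidates = list(range(q))
while candidates:
    X_now = np.column_stack([np.ones(n)] + [Xfull[:, j] for j in selected])
    sse_now = sse(X_now, y)
    best_f, best_j = -np.inf, None
    for j in candidates:
        X_try = np.column_stack([X_now, Xfull[:, j]])
        sse_try = sse(X_try, y)
        df_err = n - X_try.shape[1]
        # Partial F for entering x_j given the current model
        partial_f = (sse_now - sse_try) / (sse_try / df_err)
        if partial_f > best_f:
            best_f, best_j = partial_f, j
    if best_f < f_enter:
        break                       # no candidate clears the threshold
    selected.append(best_j)
    candidates.remove(best_j)
```

The multivariate version replaces the partial *F* with a partial Wilks' Λ computed from the error and hypothesis matrices, but the search structure is the same.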

Examples are provided using real data. The problems ask for derivations and also illustrate most procedures with real data.