Standard Article

Multiple Regression

  1. Barry H. Cohen

Published Online: 30 JAN 2010

DOI: 10.1002/9780470479216.corpsy0570

Corsini Encyclopedia of Psychology

How to Cite

Cohen, B. H. (2010). Multiple Regression. Corsini Encyclopedia of Psychology. 1.

Author Information

  1. New York University

Publication History

  1. Published Online: 30 JAN 2010


The simplest and most commonly used form of multiple regression (MR) is multiple linear regression. This method uses a linear combination (i.e., a weighted average) of predictor variables to maximize the accuracy with which a criterion variable can be predicted; the end result is a multiple regression equation in which each predictor variable is multiplied by its optimal weight, and a constant (called the Y-intercept) is added.

The accuracy of a regression equation is measured by summing the squared differences between the predicted and actual criterion values across all cases in the dataset; this quantity is symbolized as SSerror (or SSresidual). The set of weights, called partial regression slopes, that leads to the smallest value of SSerror is known as the ordinary least squares (OLS) regression solution; these same weights produce the largest possible value of R, the multiple correlation coefficient.

R2 equals the proportion of criterion variance that is accounted for by a multiple regression equation. The increase in R2 produced by adding a predictor variable equals the square of that predictor's semipartial (or part) correlation with the criterion, given the predictors already in the equation.
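The relations described above (OLS weights minimizing SSerror, R2 as the proportion of criterion variance accounted for, and the squared semipartial correlation as the increment in R2) can be sketched numerically. The data below are simulated and purely illustrative, and NumPy's least-squares solver stands in for any OLS routine:

```python
import numpy as np

# Hypothetical data: two predictors (x1, x2) and a criterion (y).
rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 2.0 + 1.5 * x1 - 0.8 * x2 + rng.normal(size=n)

# Design matrix with a leading column of ones for the Y-intercept.
X = np.column_stack([np.ones(n), x1, x2])

# OLS solution: the weights that minimize SSerror (sum of squared residuals).
b, *_ = np.linalg.lstsq(X, y, rcond=None)

y_hat = X @ b
ss_error = np.sum((y - y_hat) ** 2)
ss_total = np.sum((y - y.mean()) ** 2)
r_squared = 1.0 - ss_error / ss_total  # proportion of criterion variance accounted for

# R^2 with x1 alone, so we can look at the increment from adding x2.
X1 = np.column_stack([np.ones(n), x1])
b1, *_ = np.linalg.lstsq(X1, y, rcond=None)
r_squared_1 = 1.0 - np.sum((y - X1 @ b1) ** 2) / ss_total

# Semipartial correlation of x2: correlate y with the part of x2
# that is not predictable from x1, then square it.
bx, *_ = np.linalg.lstsq(X1, x2, rcond=None)
x2_resid = x2 - X1 @ bx
sr2 = np.corrcoef(y, x2_resid)[0, 1] ** 2

# The increase in R^2 from adding x2 equals the squared semipartial correlation.
print(np.isclose(r_squared - r_squared_1, sr2))  # expected: True
```

The same check works with the predictors entered in the opposite order, which is the logic behind evaluating each step of a hierarchical regression by its R2 increment.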


Keywords:

  • semipartial correlation;
  • partial regression slope;
  • beta weight;
  • stepwise regression;
  • hierarchical regression