The Geometry of Enhancement in Multiple Regression

Research output: Contribution to journal › Article › peer-review


Abstract

In linear multiple regression, "enhancement" is said to occur when R² = b′r > r′r, where b is a p×1 vector of standardized regression coefficients and r is a p×1 vector of correlations between a criterion y and a set of standardized regressors, x. When p = 1, then b ≡ r and enhancement cannot occur. When p = 2, for all full-rank Rxx ≠ I, Rxx = E[xx′] = VΛV′ (where VΛV′ denotes the eigen decomposition of Rxx; λ1 > λ2), the set B1 contains four vectors; the set B2 contains an infinite number of vectors. When p ≥ 3 (and λ1 > λ2 > … > λp), both sets contain an uncountably infinite number of vectors. Geometrical arguments demonstrate that B1 occurs at the intersection of two hyper-ellipsoids in ℝ^p. Equations are provided for populating the sets B1 and B2 and for demonstrating that maximum enhancement occurs when b is collinear with the eigenvector associated with λp (the smallest eigenvalue of the predictor correlation matrix). These equations are used to illustrate the logic and the underlying geometry of enhancement in population multiple-regression models. R code for simulating population regression models that exhibit enhancement of any degree, with any number of predictors, is included in Appendices A and B.
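The following R sketch (not the article's Appendix code; the predictor correlation matrix Rxx and the scaling constant 0.4 are hypothetical choices) illustrates the central result stated above: if b is proportional to the eigenvector of Rxx associated with its smallest eigenvalue λp, say b = c·v_p, then r = Rxx b = c·λp·v_p, so b′r = c²λp and r′r = c²λp², and enhancement (b′r > r′r) follows whenever λp < 1.

# Hypothetical 3 x 3 predictor correlation matrix (positive definite)
Rxx <- matrix(c(1.0, 0.5, 0.3,
                0.5, 1.0, 0.4,
                0.3, 0.4, 1.0), nrow = 3)

eig <- eigen(Rxx)                   # eigenvalues returned in decreasing order
v_p <- eig$vectors[, ncol(Rxx)]     # eigenvector associated with the smallest eigenvalue, lambda_p
b   <- 0.4 * v_p                    # standardized coefficients collinear with v_p (scaled so R^2 < 1)
r   <- drop(Rxx %*% b)              # criterion-predictor correlations implied by b
R2  <- sum(b * r)                   # R^2 = b'r
rr  <- sum(r * r)                   # r'r

cat("R^2 = b'r:", round(R2, 4),
    "  r'r:", round(rr, 4),
    "  enhancement (b'r > r'r):", R2 > rr, "\n")

Running the script prints R² = b′r, r′r, and TRUE for the enhancement check, since the smallest eigenvalue of this Rxx is well below 1.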

Original language: English (US)
Pages (from-to): 634-649
Number of pages: 16
Journal: Psychometrika
Volume: 76
Issue number: 4
DOIs
State: Published - Oct 2011

Keywords

  • multiple regression
  • suppression
  • suppressor variable
