Abstract
We propose a general dimension-reduction method that combines the ideas of likelihood, correlation, inverse regression and information theory. We do not require that the dependence be confined to particular conditional moments, nor do we place restrictions on the predictors or on the regression that are necessary for methods like ordinary least squares and sliced-inverse regression. Although we focus on single-index regressions, the underlying idea is applicable more generally. Illustrative examples are presented.
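As background for the comparison the abstract draws, the sketch below implements textbook sliced inverse regression (SIR) on a simulated single-index model y = f(βᵀx) + ε; this is the baseline method the paper improves upon, not the paper's own proposal, and all data, sample sizes, and parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 2000, 5

# True index direction (unit length) for the simulated single-index model
beta = np.array([1.0, 2.0, 0.0, 0.0, -1.0])
beta /= np.linalg.norm(beta)

# Gaussian predictors satisfy the linearity condition SIR relies on
X = rng.standard_normal((n, p))
y = (X @ beta) ** 3 + 0.1 * rng.standard_normal(n)

# Whiten the predictors using the sample covariance
Xc = X - X.mean(axis=0)
Sigma = Xc.T @ Xc / n
L = np.linalg.cholesky(Sigma)
Z = np.linalg.solve(L, Xc.T).T          # Cov(Z) is approximately the identity

# Slice the response into H groups and average the whitened slice means
H = 10
order = np.argsort(y)
M = np.zeros((p, p))
for idx in np.array_split(order, H):
    m = Z[idx].mean(axis=0)
    M += (len(idx) / n) * np.outer(m, m)

# Leading eigenvector of M estimates the index direction (whitened scale)
_, vecs = np.linalg.eigh(M)
eta = vecs[:, -1]
b_hat = np.linalg.solve(L.T, eta)       # map back to the original scale
b_hat /= np.linalg.norm(b_hat)

# |cosine| between the estimate and the true direction; near 1 on success
cos = abs(b_hat @ beta)
print(round(cos, 3))
```

Because SIR only uses the inverse mean E(x | y), it can miss directions when the link is symmetric about zero; the paper's likelihood/information-based proposal is motivated by exactly such limitations.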
Original language | English (US) |
---|---|
Pages (from-to) | 371-384 |
Number of pages | 14 |
Journal | Biometrika |
Volume | 92 |
Issue number | 2 |
DOIs | |
State | Published - Jun 2005 |
Bibliographical note
Funding Information: The authors would like to thank the editor and two referees whose suggestions led to a greatly improved paper. The work of Cook was supported in part by grants from the U.S. National Science Foundation.
Keywords
- Central mean subspace
- Central subspace
- Dimension-reduction subspace
- Regression graphics
- Single-index model