Dimension Reduction Regression in R

Sanford Weisberg
School of Statistics, University of Minnesota, St. Paul, MN 55108-6042.
Supported by National Science Foundation Grant DUE 0109756.

January 10, 2002

Abstract:

Regression is the study of the dependence of a response variable $y$ on a collection of $p$ predictors collected in $x$. In dimension reduction regression, we seek a few linear combinations $\beta_1'x,\ldots,\beta_d'x$ such that all the information about the regression is contained in these $d$ linear combinations. If $d$ is very small, perhaps one or two, then the regression problem can be summarized using simple graphics; for example, for $d=1$, the plot of $y$ versus $\beta_1'x$ contains all the regression information. When $d=2$, a 3D plot contains all the information. Several methods for estimating $d$ and relevant functions of $\beta_1,\ldots,\beta_d$ have been suggested in the literature. In this paper, we describe an R package for three important dimension reduction methods: sliced inverse regression, or sir; sliced average variance estimates, or save; and principal Hessian directions, or phd. The package is very general and flexible, and can be easily extended to include other methods of dimension reduction. It includes tests and estimates of the dimension $d$, estimates of the relevant information including $\beta_1,\ldots,\beta_d$, and some useful graphical summaries as well.
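To make the idea concrete, the following is a minimal sketch of sliced inverse regression (SIR), the first of the three methods named above, written in Python with NumPy rather than as a call to the R package itself. It standardizes the predictors, slices the observations on the ordered response, and takes the leading eigenvectors of the weighted covariance matrix of the within-slice means as estimates of directions spanning $\beta_1,\ldots,\beta_d$. The function name and arguments are illustrative choices, not part of the package's interface.

```python
import numpy as np

def sir_directions(x, y, n_slices=5, d=1):
    """Sketch of sliced inverse regression (SIR).

    Standardize x, slice on the ordered response y, average the
    standardized predictors within each slice, and take the leading
    eigenvectors of the weighted covariance of the slice means.
    """
    n, p = x.shape
    # Center and sphere the predictors: z = Sigma^{-1/2} (x - xbar)
    xc = x - x.mean(axis=0)
    cov = np.cov(xc, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    sigma_inv_half = evecs @ np.diag(evals ** -0.5) @ evecs.T
    z = xc @ sigma_inv_half
    # Partition the cases into slices of roughly equal size by sorted y
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    # Weighted covariance matrix of the within-slice means of z
    m = np.zeros((p, p))
    for idx in slices:
        zbar = z[idx].mean(axis=0)
        m += (len(idx) / n) * np.outer(zbar, zbar)
    # Leading eigenvectors, mapped back to the original x scale
    vals, vecs = np.linalg.eigh(m)
    beta = sigma_inv_half @ vecs[:, ::-1][:, :d]
    return beta, vals[::-1]
```

With data generated from a single-index model $y = \beta'x + \varepsilon$, the leading estimated direction should align (up to sign and scale) with the true $\beta$.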



