
Comparison of prediction methods when only a few components are relevant. (English) Zbl 0799.62080

Summary: We consider prediction in a multiple regression model in which the explanatory variables are also regarded as random. If the number of explanatory variables is large, then the usual least squares multiple regression solution may not be the best one. We give a methodology for comparing certain alternative prediction methods by asymptotic calculations and carry out such comparisons for four specific methods.
The results indicate that none of these methods dominates the others, and that the difference between the methods is typically (but not always) small when the number of observations is large. In particular, principal component regression does well when the eigenvalues corresponding to components uncorrelated with the dependent variable (i.e., the irrelevant eigenvalues) are extremely small or extremely large. Partial least squares regression does well for intermediate irrelevant eigenvalues. A maximum likelihood-type method dominates the others asymptotically, at least in the case of one relevant component.
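The contrast between the two methods can be illustrated with a small numerical sketch (not the paper's exact setup or asymptotic calculation): one latent component carries all the information about the response, while the irrelevant directions have larger, "intermediate" variance. In this regime the leading principal component points away from the relevant direction, so one-component PCR fits poorly, whereas one-component PLS, which weights directions by their covariance with the response, does better. All names and the specific variance choices below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative data: 20 predictors, but only component 0 is relevant to y.
# Irrelevant directions get variance 4 ("intermediate" irrelevant
# eigenvalues), the relevant direction variance 1.
n, p = 200, 20
Z = rng.standard_normal((n, p))
scales = np.full(p, 2.0)   # irrelevant directions: std 2, variance 4
scales[0] = 1.0            # the single relevant direction: variance 1
X = Z * scales
y = 2.0 * Z[:, 0] + 0.1 * rng.standard_normal(n)

# Center predictors and response.
Xc = X - X.mean(axis=0)
yc = y - y.mean()

def pcr_fit(Xc, yc, k):
    """In-sample fit of principal component regression with k components."""
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    T = Xc @ Vt[:k].T                       # scores on the top-k PCs
    coef, *_ = np.linalg.lstsq(T, yc, rcond=None)
    return T @ coef

def pls1_fit(Xc, yc, k):
    """In-sample fit of univariate PLS (NIPALS-style deflation), k components."""
    Xd, yd = Xc.copy(), yc.copy()
    yhat = np.zeros_like(yc)
    for _ in range(k):
        w = Xd.T @ yd                       # weights: covariance with response
        w /= np.linalg.norm(w)
        t = Xd @ w                          # score vector
        b = (t @ yd) / (t @ t)              # regress response on the score
        yhat += b * t
        p_load = (Xd.T @ t) / (t @ t)
        Xd -= np.outer(t, p_load)           # deflate predictors
        yd -= b * t                         # deflate response
    return yhat

mse_pcr1 = float(np.mean((yc - pcr_fit(Xc, yc, 1)) ** 2))
mse_pls1 = float(np.mean((yc - pls1_fit(Xc, yc, 1)) ** 2))
print(f"one component: PCR MSE={mse_pcr1:.3f}, PLS MSE={mse_pls1:.3f}")
```

With the variances flipped (irrelevant eigenvalues made extremely small or extremely large relative to the sample size), the leading principal components align with or cheaply exclude the irrelevant directions, which is the regime where the summary says PCR does well.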

MSC:

62J99 Linear inference, regression
62H25 Factor analysis and principal components; correspondence analysis