Helland, Inge S.; Almøy, Trygve
Comparison of prediction methods when only a few components are relevant. (English)
Zbl 0799.62080
J. Am. Stat. Assoc. 89, No. 426, 583-591 (1994).

Summary: We consider prediction in a multiple regression model in which the explanatory variables are also regarded as random. When the number of explanatory variables is large, the common least squares multiple regression solution may not be the best one. We give a methodology for comparing certain alternative prediction methods by asymptotic calculations and carry out such comparisons for four specific methods. The results indicate that none of these methods dominates the others, and that the differences between them are typically (but not always) small when the number of observations is large. In particular, principal component regression does well when the eigenvalues corresponding to components not correlated with the dependent variable (i.e., the irrelevant eigenvalues) are extremely small or extremely large. Partial least squares regression does well for intermediate irrelevant eigenvalues. A maximum likelihood-type method dominates the others asymptotically, at least in the case of one relevant component.

Cited in 17 Documents

MSC:
62J99 Linear inference, regression
62H25 Factor analysis and principal components; correspondence analysis

Keywords: expected prediction error; prediction ability; relevant components; partial least squares regression; prediction; multiple regression model; principal component regression; irrelevant eigenvalues; maximum likelihood-type method
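The setting the summary describes — many random explanatory variables, but only a few components relevant for the response — can be illustrated with a small simulation. The sketch below (numpy only, not the authors' methodology; all parameter values are illustrative) generates data with a single relevant component whose eigenvalue is large, then compares ordinary least squares against one-component principal component regression by test-set prediction error:

```python
# Illustrative simulation: one relevant latent component, p noisy predictors.
# This is a minimal sketch of the prediction-comparison setting, NOT the
# asymptotic methodology of Helland and Almøy; sample sizes, noise levels,
# and the signal scale (3.0) are arbitrary choices for illustration.
import numpy as np

rng = np.random.default_rng(0)
n_train, n_test, p = 50, 500, 20

# Relevant direction w: X loads on one latent factor t, and y depends on t only.
w = rng.normal(size=p)
w /= np.linalg.norm(w)

def simulate(n):
    t = rng.normal(size=(n, 1))                          # the one relevant component
    X = 3.0 * t @ w[None, :] + rng.normal(size=(n, p))   # relevant eigenvalue made large
    y = t[:, 0] + 0.5 * rng.normal(size=n)
    return X, y

Xtr, ytr = simulate(n_train)
Xte, yte = simulate(n_test)

# Ordinary least squares on all p predictors.
beta_ols, *_ = np.linalg.lstsq(Xtr, ytr, rcond=None)
mse_ols = np.mean((Xte @ beta_ols - yte) ** 2)

# One-component principal component regression: regress y on the score
# along the leading eigenvector of the (centered) training data.
xbar = Xtr.mean(axis=0)
_, _, Vt = np.linalg.svd(Xtr - xbar, full_matrices=False)
v = Vt[0]
s_tr = (Xtr - xbar) @ v
coef = s_tr @ (ytr - ytr.mean()) / (s_tr @ s_tr)
pred_pcr = ytr.mean() + ((Xte - xbar) @ v) * coef
mse_pcr = np.mean((pred_pcr - yte) ** 2)

print(f"OLS test MSE: {mse_ols:.3f}  |  1-component PCR test MSE: {mse_pcr:.3f}")
```

With a large relevant eigenvalue and many irrelevant predictors, the one-component method typically predicts at least as well as full least squares in such simulations, loosely mirroring the paper's point that no single method dominates and the choice depends on the irrelevant eigenvalues.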