
Coordinate-free multivariable statistics. An illustrated geometric progression from Halmos to Gauss and Bayes. (English) Zbl 0633.62057

Oxford Statistical Science Series, 2. Oxford: Clarendon Press. Oxford University Press. XIV, 120 p.; £20.00 (1987).
This book deals with estimation and prediction in Gaussian linear models \(Ex\in {\mathcal L}\), \(\mathrm{Cov}\,x=\sigma^2 V_0\), where \(x\) is an \({\mathcal E}\)-valued random vector (\({\mathcal E}\) the dual of some finite-dimensional vector space \({\mathcal V}\)) and \(\sigma^2 V_0: {\mathcal V}\to {\mathcal E}\) is the covariance operator (since the dual of \({\mathcal E}\) is again \({\mathcal V}\), this is equivalent to the usual approach). Here \({\mathcal L}\subseteq {\mathcal E}\) is a linear subspace of \({\mathcal E}\). The Gauss-Markov theorem is prepared by considerations of inner products and dualities. The fact that the Gauss-Markov estimator \(Gx\) is the projection onto \({\mathcal L}\) along \(V_0{\mathcal L}^{\square}\) is not proved, though it follows immediately from the author's formula for \(GV_0\) and unbiasedness.
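In coordinates, the projection character of the Gauss-Markov estimator can be sketched as follows. This is a minimal illustration, not the book's coordinate-free formulation: the design matrix \(X\) (whose columns span a coordinate version of \({\mathcal L}\)), the covariance \(V_0\), and all variable names are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical coordinate version of the model: mean space L = col(X),
# covariance sigma^2 * V0 with V0 positive definite.
n, p = 6, 2
X = rng.standard_normal((n, p))            # columns span L
A = rng.standard_normal((n, n))
V0 = A @ A.T + n * np.eye(n)               # positive-definite covariance

# Gauss-Markov (generalized least squares) fit of Ex: an oblique
# projection onto L along V0 applied to the orthogonal complement of L.
W = np.linalg.inv(V0)
P = X @ np.linalg.inv(X.T @ W @ X) @ X.T @ W

# P is idempotent (a projection) and fixes every vector in L.
assert np.allclose(P @ P, P)
assert np.allclose(P @ X, X)

# V0^{-1}(I - P) maps into the annihilator of L, i.e. X^T V0^{-1}(I - P) = 0,
# so the residual x - Px lies in V0 * (complement of L).
R = W @ (np.eye(n) - P)
assert np.allclose(X.T @ R, 0)
```

The assertions verify exactly the projection property the review refers to: \(P\) reproduces itself, leaves \({\mathcal L}\) fixed, and its complementary projection has range \(V_0{\mathcal L}^{\square}\).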
The prediction problem is described by \(x=h+k\), \(k\in {\mathcal K}\), \(h\in {\mathcal H}\), with \(k\) observable, \(Ex\in {\mathcal L}\), \(\mathrm{Cov}\,x=\sigma^2 V_0\). Unfortunately, it is not shown that this is equivalent to the prediction problem known from the literature in coordinate form. Assuming a prior distribution for \(Ex\), the (linear) Bayes form of estimators and predictors is also derived.
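The linear Bayes form mentioned here can be illustrated in coordinates. The sketch below is an assumption-laden stand-in (a generic Gaussian linear model \(x = A\theta + e\) with prior \(\theta\sim N(m,T)\), not the book's notation); it checks that the "gain" form of the linear Bayes estimator, which shrinks toward the prior mean, agrees with the equivalent precision form.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical coordinate sketch: x = A theta + e, e ~ N(0, Sigma),
# with a Gaussian prior theta ~ N(m, T) on the mean parameters.
n, p = 5, 2
A = rng.standard_normal((n, p))
m = rng.standard_normal(p)
B = rng.standard_normal((p, p)); T = B @ B.T + np.eye(p)      # prior covariance
C = rng.standard_normal((n, n)); Sigma = C @ C.T + np.eye(n)  # noise covariance
x = rng.standard_normal(n)                                    # an observation

# Linear Bayes estimator in "gain" form: shrink x - Am toward zero.
K = T @ A.T @ np.linalg.inv(A @ T @ A.T + Sigma)
theta_bayes = m + K @ (x - A @ m)

# The same estimator in "precision" form; equality is the standard
# matrix-inversion-lemma identity behind linear Bayes estimation.
Tinv, Sinv = np.linalg.inv(T), np.linalg.inv(Sigma)
theta_info = np.linalg.solve(Tinv + A.T @ Sinv @ A, Tinv @ m + A.T @ Sinv @ x)

assert np.allclose(theta_bayes, theta_info)
```

With an improper (infinite-variance) prior the gain form degenerates to the Gauss-Markov estimator, which is the connection between the estimation and Bayes chapters the review describes.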
The multivariate linear model is considered, and it is shown that the condition for the existence of a Gauss-Markov estimator, found to be necessary and sufficient by M. L. Eaton [Ann. Math. Statist. 41, 528-538 (1970; Zbl 0195.201)] and by the reviewer [Ann. Statist. 4, 779-787 (1976; Zbl 0336.62052)], remains sufficient in the case of missing observations. Finally, the book also contains the Kalman filter and a geometric approach to generalized inverses. Apart from Theorem 1 on page 49, the book does not deal with the estimation of \(\sigma^2\) or the optimal estimation of variance components.
This is a very valuable introductory book on linear estimation, due in part to the exercises after each section. Many deep, mainly geometric, considerations contribute to a better understanding of the material. Though geometric intuition is a matter of individual taste, the 60 illustrations in the book may be useful for readers willing to learn the author's art of drawing.
Reviewer: H. Drygas

MSC:

62J05 Linear regression; mixed models
62H12 Estimation in multivariate analysis
62-01 Introductory exposition (textbooks, tutorial papers, etc.) pertaining to statistics
62F15 Bayesian inference
62M20 Inference from stochastic processes and prediction