\input zb-basic \input zb-matheduc \iteman{ZMATH 2014b.00584} \itemau{Goodaire, Edgar G.} \itemti{Linear algebra. Pure and applied.} \itemso{Hackensack, NJ: World Scientific (ISBN 978-981-4508-36-0/hbk; 978-981-4508-37-7/pbk). xvi, 716~p. (2014).} \itemab A rough outline of this text is provided by the chapter headings: 1. Euclidean $n$-space; 2. Matrices and linear equations; 3. Determinants, eigenvalues and eigenvectors; 4. Vector spaces; 5. Linear transformations; 6. Orthogonality; 7. The spectral theorem. The first three chapters could provide a one-semester first course, and the remainder provides a course for a second semester. A natural question about a linear algebra text is: how does it differ from the many other texts available? This review first describes some points about the content and then offers some comments on the presentation. Throughout the book the emphasis is on the standard space $\Bbb{R}^{n}$ and its subspaces and on matrices over the reals (complex matrices are considered in the final chapter, and a section in Chapter 2 deals with error-correcting codes over $\Bbb{Z}_{2}$). Concepts such as linear independence, subspace, basis, dimension, and linear transformation are considered in this context. Beginning with Chapter 4, the author gives a more general definition of a vector space and gives few examples of other vector spaces. He frames his later definitions and proofs in terms of general vector spaces, but rarely applies them to any space which is not a subspace of $\Bbb{R}^{n}$ (or $\Bbb{C}^{n}$). Within this context, definitions and proofs are given and explained clearly, with many supporting examples worked out in detail. 
The proof of associativity of matrix multiplication is explained as a consequence of the associativity of functional composition; $LU$-factorization is introduced in Chapter 2 in terms of elementary matrices and then used in Chapter 3 to prove properties of the determinant; and orthogonal projections are introduced early and are later used to explain the geometry underlying least squares methods and pseudoinverses. In the final chapter, Schur's unitary triangularization theorem is proved and applied to the diagonalization of Hermitian matrices and to prove the singular value decomposition. Also, a series of well-written ``application'' sections should convince the reader that linear algebra is useful: error-correcting codes; analysis of electrical circuits; linear recurrence relations; Markov chains; computer graphics (affine transformations); least squares approximations; quadratic forms and conic sections. The style of writing is unhurried and careful. Many of the pages are taken up with detailed examples and worked-out exercises. Some readers may prefer a leaner style and find this a little overwhelming, but the author has a clear audience in mind and each of his examples illustrates a specific point. There are also many exercises for the reader (often with answers at the back of the book). Many of these are computational but not necessarily routine, and they include questions which will test the student's understanding. Throughout the book there are reading checks (simple questions to test comprehension of the material just presented) and true/false questions (``decide, with as little calculation as possible, \dots'') which should keep the reader involved. Although the author recommends using computer algebra systems for serious computations, he wants the student to be able to carry out the basic algorithms by hand for small examples. 
The book has three appendices: an introduction to complex numbers; how to write a proof; and ``Things I must remember'' (a summary of the important topics covered in the book). A glossary gives an annotated list of the definitions introduced in the book. In the reviewer's opinion, this book could be useful as a textbook or for self-study, particularly for mathematics students in the middle echelon. It is clear and well written, with a reader-friendly style, but it is not for those who prefer their books concise. \itemrv{John D. Dixon (Ottawa)} \itemcc{H65 P25 R45 N55 K65 G75} \itemut{linear algebra; textbook; Euclidean $n$-space; matrices; linear equations; determinants; eigenvalues; eigenvectors; vector spaces; linear transformations; orthogonality; spectral theorem; linear independence; $LU$-factorization; diagonalization; Hermitian matrices; singular value decomposition; error-correcting code; electrical circuit; linear recurrence relations; Markov chains; computer graphics; least squares approximations; quadratic forms; conic sections} \itemli{doi:10.1142/8808} \end