
An introduction to the bootstrap. (English) Zbl 0835.62038

Monographs on Statistics and Applied Probability. 57. New York, NY: Chapman & Hall. xvi, 436 p. (1993).
In statistical practice, after the data have been collected and a statistical measure has been used to analyze and summarize them, it is important to know how accurate this summary is. The bootstrap is a recently developed technique for assessing statistical accuracy. It uses modern computer power to simplify the often intricate calculations of traditional statistical theory.
This book provides an elementary introduction to the basic ideas and applications of the bootstrap through the analysis of real data sets. After explaining the basic idea of the bootstrap in a simple case, the estimation of the standard error of a sample mean, and reviewing some basic concepts of elementary probability and statistics, the plug-in principle and the general definition of standard error, the authors formally introduce the general bootstrap algorithm for the estimation of standard errors in Chapter 6. This algorithm is illustrated through a series of practical examples in subsequent chapters, which include the analysis of test score data involving techniques of multivariate analysis, curve fitting through nonparametric regression methods, two-sample problems, time series analysis and general linear regression estimation. Chapter 10 introduces the bootstrap estimate of bias, another measure of statistical accuracy. The jackknife, an older technique for estimating biases and standard errors, is mentioned in this chapter but discussed in greater detail in Chapter 11. In Chapters 12-14, the authors describe different techniques for constructing confidence intervals using the bootstrap. After introducing permutation tests, a simple computer-intensive technique that avoids mathematical assumptions, in Chapter 15, some bootstrap algorithms for hypothesis testing are presented in Chapter 16. Chapters 17 and 18 deal with applications of the bootstrap to two other important problems in statistics: the estimation of prediction error and the selection of tuning parameters, both useful for model selection. The method of cross-validation, a variation of the jackknife, is also introduced in these two chapters.
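To convey the flavor of the Chapter 6 algorithm, here is a minimal sketch in Python (not taken from the book, whose appendix describes S-language software; the function name, the test data and the default of B = 2000 replications are illustrative choices only):

import numpy as np

def bootstrap_se(data, statistic, B=2000, seed=None):
    # Nonparametric bootstrap estimate of the standard error of `statistic`:
    # draw B resamples of the same size as the data (with replacement),
    # evaluate the statistic on each resample, and return the standard
    # deviation of the B replicated values.
    rng = np.random.default_rng(seed)
    data = np.asarray(data)
    n = len(data)
    replicates = np.empty(B)
    for b in range(B):
        resample = data[rng.integers(0, n, size=n)]
        replicates[b] = statistic(resample)
    return replicates.std(ddof=1)

# Example: standard error of a sample mean, the book's introductory case.
x = np.array([14.0, 9.5, 21.3, 12.8, 17.1, 11.4, 19.6, 15.2])
print(bootstrap_se(x, np.mean, seed=0))

The same routine works unchanged for a median, a correlation coefficient or any other statistic supplied as a function, which is precisely the generality that the plug-in principle provides.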
The rest of the book deals with some specific problems in bootstrap methodology. Chapter 19 discusses the jackknife-after-bootstrap, a simple method for estimating the variability of a bootstrap estimate from the set of bootstrap replications already computed. Chapter 20 offers a geometrical representation of the bootstrap and the jackknife, which can be used to understand the connections and differences between the two methods. The relationship of the bootstrap and the jackknife to the more traditional maximum likelihood method is explained in Chapter 21. A heuristic description of the theory of bootstrap confidence intervals is given in Chapter 22. Chapter 23 discusses how to calculate bootstrap estimates efficiently. A number of different methods for obtaining approximate nonparametric likelihoods are introduced in Chapter 24. Chapter 25 presents a case study on the application of the bootstrap to bioequivalence studies in drug development. The book concludes with a discussion chapter on the evolution of bootstrap methodology, some general questions about the use of the bootstrap, and references for further topics. An appendix describing software for bootstrap computations is given at the end of the book.
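As an illustration of the Chapter 19 idea, the following sketch (again not from the book; the names and defaults are illustrative) computes, from a single set of B resamples, both the bootstrap standard error and a jackknife-after-bootstrap assessment of its variability: for each observation i the standard error is recomputed from those resamples that happen not to contain i, and the usual jackknife variance formula is applied to these leave-one-out values.

import numpy as np

def jackknife_after_bootstrap(data, statistic, B=2000, seed=None):
    # Bootstrap standard error of `statistic`, together with a
    # jackknife-after-bootstrap estimate of the standard error of that
    # standard error, using only the B resamples already drawn.
    rng = np.random.default_rng(seed)
    data = np.asarray(data)
    n = len(data)
    idx = rng.integers(0, n, size=(B, n))        # indices of the B resamples
    reps = np.array([statistic(data[row]) for row in idx])
    se_boot = reps.std(ddof=1)                   # usual bootstrap SE estimate
    # Leave-one-out SE estimates from resamples not containing observation i.
    se_loo = np.empty(n)
    for i in range(n):
        omit_i = ~(idx == i).any(axis=1)
        se_loo[i] = reps[omit_i].std(ddof=1)
    var_jab = (n - 1) / n * ((se_loo - se_loo.mean()) ** 2).sum()
    return se_boot, np.sqrt(var_jab)

No additional resampling is needed: the leave-one-out estimates are read off from the bootstrap samples that were generated anyway, which is what makes the method so cheap.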
In summary, this book provides a clear introduction to the basic ideas and many applications of the bootstrap and related methods such as the jackknife, cross-validation and nonparametric likelihood. The mathematical level required to understand the major part of the book is very modest: some basic concepts of probability theory and statistics and some knowledge of elementary algebra suffice. Because real data sets are used throughout the book to illustrate the methodology, readers can easily adapt the methods to their own problems.
For readers who want to know more about the theory behind the jackknife and the bootstrap, and about their applications in other areas such as sample surveys, longitudinal data, survival analysis and dependent data models, a recent book by J. Shao and the reviewer [The jackknife and bootstrap (1995)] may be more helpful. A more mathematically advanced account of the bootstrap, based on Edgeworth expansions, can be found in P. Hall [The bootstrap and Edgeworth expansion (1992; Zbl 0744.62026)].
Reviewer: D. Tu (Ottawa)

MSC:

62G09 Nonparametric statistical resampling methods
62-01 Introductory exposition (textbooks, tutorial papers, etc.) pertaining to statistics
65C99 Probabilistic methods, stochastic differential equations

Citations:

Zbl 0744.62026

Software:

bootstrap