
Bayesian backfitting. (With comments and a rejoinder). (English) Zbl 1059.62524

Summary: We propose general procedures for posterior sampling from additive and generalized additive models. The procedure is a stochastic generalization of the well-known backfitting algorithm for fitting additive models. One chooses a linear operator ("smoother") for each predictor, and the algorithm requires only the application of the operator and its square root. The procedure is general and modular, and we describe its application to nonparametric, semiparametric and mixed models.
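
The sampling step the summary alludes to can be written, for the j-th additive component with symmetric smoother S_j, partial residual r_j and noise level sigma, roughly as f_j <- S_j r_j + sigma * S_j^{1/2} z with z standard normal, so each sweep indeed needs only the smoother and its square root. The following is a minimal illustrative sketch of such a stochastic backfitting sweep, not the authors' implementation: the Gaussian-kernel shrinkage smoothers, the fixed error standard deviation sigma, and the bandwidth/shrinkage values are assumptions made here for concreteness, and the names kernel_smoother and bayesian_backfitting are hypothetical.

# Sketch of a stochastic ("Bayesian") backfitting sweep: each component is
# refreshed by applying its smoother S_j to the partial residual and adding
# noise through the smoother's symmetric square root,
#   f_j <- S_j r_j + sigma * S_j^{1/2} z,  z ~ N(0, I).
# Smoothers, fixed sigma and hyperparameters below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def kernel_smoother(x, bandwidth=0.3, shrink=0.5):
    """Return a symmetric smoother matrix S and its symmetric square root."""
    K = np.exp(-0.5 * ((x[:, None] - x[None, :]) / bandwidth) ** 2)
    # S = K (K + shrink*I)^{-1}: symmetric, eigenvalues in [0, 1).
    evals, evecs = np.linalg.eigh(K)
    s = np.clip(evals, 0.0, None)
    s = s / (s + shrink)
    S = (evecs * s) @ evecs.T
    S_half = (evecs * np.sqrt(s)) @ evecs.T
    return S, S_half

def bayesian_backfitting(y, X, sigma=0.5, n_iter=2000, burn=500):
    """Gibbs-style sweeps over the additive components f_1, ..., f_p."""
    n, p = X.shape
    smoothers = [kernel_smoother(X[:, j]) for j in range(p)]
    f = np.zeros((p, n))                      # current fitted components
    draws = []
    for it in range(n_iter):
        for j in range(p):
            S, S_half = smoothers[j]
            r = y - f.sum(axis=0) + f[j]      # partial residual for component j
            f[j] = S @ r + sigma * S_half @ rng.standard_normal(n)
        if it >= burn:
            draws.append(f.copy())
    return np.array(draws)                    # retained draws of the fitted curves

# Toy data: two additive signals plus noise.
n = 100
X = rng.uniform(-2, 2, size=(n, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.5 * rng.standard_normal(n)
draws = bayesian_backfitting(y, X)
print(draws.mean(axis=0).shape)               # pointwise posterior-mean fit per component, (2, n)

The eigendecomposition used here is one convenient way to obtain the square root of a symmetric smoother; the modularity the summary emphasizes comes from the fact that swapping in a different smoother per predictor changes nothing else in the sweep. A fuller treatment would also sample sigma and the smoothing (variance) parameters rather than fixing them.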

MSC:

62F15 Bayesian inference
62J12 Generalized linear models (logistic models)
65C60 Computational problems in statistics (MSC2010)

Software:

ARC; BayesDA
