Adjustment by minimum discriminant information. (English) Zbl 0583.62020

Let \({\mathcal M}\) be the set of all probability measures (p.m.) on \((R_k, B_k)\) and let \(T: R_k \to R_{\ell}\) be measurable. Fix \(t \in R_{\ell}\) and put \(C = \{A \in {\mathcal M}: \int T\,dA = t\}\). For \(P \in {\mathcal M}\), let \(Q \in C\) be the minimum discriminant information adjusted (MDIA) p.m. of \(P\), i.e. the p.m. in \(C\) closest to \(P\) in the sense of Kullback-Leibler discriminant information.
Under mild conditions it is proved that for \(X_1, X_2, \ldots\) i.i.d. according to \(P\), the MDIA p.m. \(Q_n\) of the empirical distribution \(P_n\) converges weakly to \(Q\) a.s., and that for measurable \(D: R_k \to R_1\) the estimate \(\int D\,dQ_n\) of \(\int D\,dQ\) is asymptotically unbiased and asymptotically normal.
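When the constraint is a moment condition on a finite sample, the I-projection of the empirical distribution \(P_n\) onto \(C\) takes an exponential-tilting form: the MDIA p.m. \(Q_n\) reweights the sample points by \(w_i \propto \exp(\lambda \cdot T(X_i))\), with \(\lambda\) chosen so the tilted moment equals \(t\). The following is a minimal numerical sketch (illustrative only, not from the paper), for the one-dimensional mean constraint \(T(x) = x\), solving the convex dual problem by Newton's method:

```python
import numpy as np

def mdia_weights(x, t, iters=50):
    """Exponential-tilting weights w_i proportional to exp(lam * x_i)
    such that the tilted sample mean equals t -- the I-projection of
    the empirical distribution onto {A : integral of x dA = t}."""
    lam = 0.0
    for _ in range(iters):
        z = lam * (x - t)
        z -= z.max()                                 # numerical stabilisation
        w = np.exp(z)
        w /= w.sum()                                 # tilted probability weights
        grad = np.dot(w, x - t)                      # gradient of the dual objective
        hess = np.dot(w, (x - t) ** 2) - grad ** 2   # tilted variance (positive)
        lam -= grad / hess                           # Newton step
    return w

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=1000)
w = mdia_weights(x, t=0.3)   # reweight the sample so its mean is 0.3
```

The adjusted estimate \(\int D\,dQ_n\) of the review is then simply `np.dot(w, D(x))` for any statistic `D`. The function name `mdia_weights` and the Newton solver are assumptions for this sketch; the target `t` must lie inside the convex hull of the sample for the tilt to exist.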
Reviewer: J.Á.Višek

MSC:

62E99 Statistical distribution theory
62B10 Statistical aspects of information-theoretic topics
62B99 Sufficiency and information
62G05 Nonparametric estimation