
On two forms of Fisher’s measure of information. (English) Zbl 1073.62003

Summary: Fisher’s information number is the second moment of the “score function”, where the derivative is taken with respect to \(x\) rather than the parameter \(\Theta\). It is Fisher’s information for a location parameter, and is also called the shift-invariant Fisher information. In recent years, Fisher’s information number has frequently been used in a variety of settings, regardless of the parameters of the distribution or of their nature. Is this number a nominal, standard, and typical measure of information?
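The two forms contrasted here can be written out explicitly; the following is a standard rendering of the definitions (notation assumed for illustration, not quoted from the paper):

```latex
% Classical (parametric) Fisher information of a family f(x;\theta):
I(\theta) \;=\; \int \Bigl(\frac{\partial}{\partial\theta} \log f(x;\theta)\Bigr)^{2} f(x;\theta)\,dx .

% Fisher information number of a fixed density f: the derivative is
% taken with respect to x instead of \theta:
J(f) \;=\; \int \Bigl(\frac{d}{dx} \log f(x)\Bigr)^{2} f(x)\,dx
      \;=\; \int \frac{\bigl(f'(x)\bigr)^{2}}{f(x)}\,dx .

% For a location family, f(x;\theta) = f_{0}(x-\theta), the two coincide:
% I(\theta) = J(f_{0}) for every \theta, which is why J is also called the
% shift-invariant (location-parameter) Fisher information.
```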
The Fisher information number is examined in light of the properties of classical statistical information theory. It has some properties analogous to those of Fisher’s measure, but, in general, it does not behave well as a measure of information when \(\Theta\) is not a location parameter. Even in the case of location parameters, regularity conditions must be satisfied. It does not possess the two fundamental properties of the mother information, namely monotonicity and invariance under sufficient transformations. Thus the Fisher information number should not be used as a measure of information, except when \(\Theta\) is a location parameter. On the other hand, Fisher’s information number, viewed as a characteristic of a distribution \(f(x)\), has other interesting properties. As a byproduct of its superadditivity property, a new coefficient of association is introduced.
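As a quick illustration of the location-parameter case, the identity \(J(f) = 1/\sigma^2\) for the normal density \(N(0,\sigma^2)\) (whose score with respect to \(x\) is \(-x/\sigma^2\)) can be checked numerically. The helper below is a sketch written for this review, not code from the paper; it approximates \(J(f) = \int f'(x)^2 / f(x)\,dx\) with a central-difference derivative and the trapezoidal rule.

```python
import math

def fisher_information_number(pdf, lo, hi, n=50_000):
    """Approximate J(f) = \int (f'(x)/f(x))^2 f(x) dx = \int f'(x)^2 / f(x) dx
    over [lo, hi] using a central difference for f' and the trapezoidal rule."""
    h = (hi - lo) / n
    eps = 1e-6
    total = 0.0
    for i in range(n + 1):
        x = lo + i * h
        fx = pdf(x)
        if fx < 1e-300:              # skip the far tails to avoid division by ~0
            continue
        dfx = (pdf(x + eps) - pdf(x - eps)) / (2 * eps)
        weight = 0.5 if i in (0, n) else 1.0
        total += weight * (dfx * dfx / fx) * h
    return total

def normal_pdf(sigma):
    """Density of N(0, sigma^2)."""
    return lambda x: math.exp(-x * x / (2 * sigma * sigma)) / (sigma * math.sqrt(2 * math.pi))

# For N(0, sigma^2), J(f) should come out close to 1/sigma^2.
for sigma in (1.0, 2.0):
    J = fisher_information_number(normal_pdf(sigma), -10 * sigma, 10 * sigma)
    print(sigma, J)
```

Because the normal is a location family in its mean, this \(J(f)\) equals the parametric Fisher information \(I(\mu)\); for a non-location parameter such as \(\sigma\), the two quantities differ, which is the paper’s point.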

MSC:

62B10 Statistical aspects of information-theoretic topics
62F10 Point estimation

References:

[1] Brown, L. (1982). A proof of the central limit theorem motivated by the Cramer-Rao inequality. In: Kallianpur, G., Krishnaiah, P. R., Ghosh, J. K., eds. <i>Statistics and Probability: Essays in Honor of C. R. Rao</i>. Amsterdam: North-Holland Publishing Co., 141–148.
[2] Carlen, E. A. (1991). Superadditivity of Fisher’s information and logarithmic Sobolev inequalities. <i>J. Functional Anal.</i> 101:194–211. · Zbl 0732.60020
[3] Casella, G., Berger, R. L. (2002). <i>Statistical Inference</i>. 2nd edn. Belmont, California: Duxbury Press.
[4] DeCani, J. S., Stine, R. A. (1986). A note on deriving the information matrix for a logistic distribution. <i>Amer. Statistician</i> 40:220–222.
[5] Efron, B., Hinkley, D. V. (1978). Assessing the accuracy of the maximum likelihood estimator: observed versus expected Fisher information. <i>Biometrika</i> 65:457–487. · Zbl 0401.62002
[6] Ferentinos, K., Papaioannou, T. (1981). New parametric measures of information. <i>Information and Control</i> 51:193–208. · Zbl 0524.62005
[7] Frieden, B. R. (1988). Applications to optics and wave mechanics of the criterion of maximum Cramer–Rao bound. <i>J. Modern Optics</i> 35:1297–1316. · Zbl 0941.94500
[8] Frieden, B. R. (1998). <i>Physics from Fisher Information—A Unification</i> . Cambridge, UK: Cambridge University Press. · Zbl 0998.81512
[9] Itoh, Y. (1989). An application of the convolution inequality for the Fisher information. <i>Ann. Instit. Statist. Math.</i> 41(1):9–12. · Zbl 0693.62010
[10] Kagan, A. (2001). A discrete version of the Stam inequality and a characterization of the Poisson distribution. <i>J. Statist. Plann. Inference</i> 92:7–12. · Zbl 0964.62013
[11] Kagan, A., Landsman, Z. (1997). Statistical meaning of Carlen’s superadditivity of the Fisher information. <i>Statist. Probab. Lett.</i> 32:175–179. · Zbl 0874.60002
[12] Kagan, A., Landsman, Z. (1999). Relation between the covariance and Fisher information matrices. <i>Statist. Probab. Lett.</i> 42:7–13. · Zbl 0952.62049
[13] Kapur, J. N., Dhande, M. (1986). On the entropic measures of stochastic dependence. <i>Ind. J. Pure Appl. Math.</i> 17:581–595. · Zbl 0585.62102
[14] Lehmann, E. L., Casella, G. (1998). <i>Theory of Point Estimation</i> . 2nd edn. New York: Springer-Verlag. · Zbl 0916.62017
[15] Papaioannou, T. (1985). Measures of information. In: Kotz, S. S., Johnson, N. L., eds. <i>Encyclopedia of Statistical Sciences</i> . Vol. 5, New York: John Wiley and Sons.
[16] Papaioannou, T. (2001). On distances and measures of information: a case of diversity. In: Charalambides, C. A., Koutras, M. V., Balakrishnan, N., eds. <i>Probability and Statistical Models with Applications</i> . London: Chapman and Hall/CRC. 503–515.
[17] Papathanasiou, V. (1993a). An extension of the information inequality and related characterizations. <i>Statist. Probab. Lett.</i> 18:27–32. · Zbl 0779.62013
[18] Papathanasiou, V. (1993b). Some characteristic properties of the Fisher information matrix via Cacoullos-type inequalities. <i>J. Multivariate Anal.</i> 44:256–265. · Zbl 0765.62055
[19] Rao, B. R. (1958). On an analogue of Cramer-Rao inequality. <i>Skand. Actuar. Tidskr</i> 41:57–64.
[20] Rothenberg, T. J. (1971). Identification of parametric models. <i>Econometrica</i> 39:577–591. · Zbl 0231.62081
[21] Sankaran, M. (1964). On an analogue of Bhattacharya bound. <i>Biometrika</i> 51:268–270. · Zbl 0126.14904
[22] Stam, A. (1959). Some inequalities satisfied by the quantities of information of Fisher and Shannon. <i>Information and Control</i> 2:101–112. · Zbl 0085.34701
[23] Tambakis, D. (2001). Multivariate dependence at short horizons: asymmetric correlation from Fisher information. Mimeo, Pembroke College, University of Cambridge, Cambridge, UK.
[24] Zacks, S. (1971). <i>The Theory of Statistical Inference</i>. New York: John Wiley and Sons, Inc.
[25] Zografos, K. (1998). On a measure of dependence based on Fisher’s information matrix. <i>Commun. Statist. Theory. Meth.</i> 27(7):1715–1728. · Zbl 0930.62063
[26] Zografos, K. (2000). Measures of multivariate dependence based on a distance between Fisher information matrices. <i>J. Statist. Plann. Inference</i> 89:91–107. · Zbl 0954.62076
[27] Zografos, K., Ferentinos, K., Papaioannou, T. (1989). Limiting properties of some measures of information. <i>Ann. Instit. Statist. Math.</i> 41(3):451–460. · Zbl 0725.62006