Taneja, I. J.; Pardo, L.; Morales, D.; Menéndez, M. L.
On generalized information and divergence measures and their applications: a brief review. (English)
Zbl 1167.94327
Qüestiió 13, No. 1-3, 47-73 (1989).

Summary: The aim of this review is to present two-parametric generalizations of the following measures: directed divergence [S. Kullback and R. A. Leibler, Ann. Math. Stat. 22, 79-86 (1951; Zbl 0042.38403)], Jensen difference divergence [J. Burbea and C. R. Rao, J. Multivariate Anal. 12, No. 4, 575-596 (1982; Zbl 0526.60015); IEEE Trans. Inf. Theory 28, 489-495 (1982; Zbl 0479.94009); C. R. Rao, Theor. Popul. Biol. 21, 24-43 (1982; Zbl 0516.92021)], and Jeffreys' invariant divergence [H. Jeffreys, Proc. R. Soc. Lond., Ser. A 186, 453-461 (1946; Zbl 0063.03050)]. These generalizations are put into a unified expression and their properties are studied. Applications of the generalized information and divergence measures to the comparison of experiments, and their connections with the Fisher information measure, are also given.

Cited in 9 Documents

MSC:
94A17 Measures of information, entropy
62B10 Statistical aspects of information-theoretic topics
62B15 Theory of statistical experiments

Keywords: Shannon entropy; generalized information and divergence measures; inequalities; comparison of experiments; Fisher information

Citations: Zbl 0042.38403; Zbl 0526.60015; Zbl 0479.94009; Zbl 0516.92021; Zbl 0063.03050

Full Text: EuDML
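The three base measures generalized in the reviewed paper can be sketched numerically. A minimal Python illustration of the classical (non-generalized) definitions follows; function names are mine, natural logarithms are assumed, and the Jensen difference is taken in the Burbea-Rao form as the entropy of the mixture minus the mean of the entropies:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum p_i log p_i, with 0 log 0 := 0."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def directed_divergence(p, q):
    """Kullback-Leibler directed divergence D(p||q) = sum p_i log(p_i/q_i)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jeffreys_divergence(p, q):
    """Jeffreys' invariant divergence J(p,q) = D(p||q) + D(q||p); symmetric in p, q."""
    return directed_divergence(p, q) + directed_divergence(q, p)

def jensen_difference(p, q):
    """Burbea-Rao Jensen difference: H((p+q)/2) - (H(p) + H(q))/2; nonnegative by concavity of H."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return shannon_entropy(m) - (shannon_entropy(p) + shannon_entropy(q)) / 2

# Example: two distributions on a binary alphabet.
p = [0.5, 0.5]
q = [0.9, 0.1]
print(directed_divergence(p, q), jeffreys_divergence(p, q), jensen_difference(p, q))
```

All three quantities vanish when p = q and are strictly positive otherwise; Jeffreys' divergence and the Jensen difference are symmetric, whereas the directed divergence is not.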