On generalized entropies, Bayesian decisions and statistical diversity. (English)
Kybernetika 43, No. 5, 675-696 (2007).
Let $X$ be a discrete random variable with distribution $p = (p(i): i \in {\cal I})$ on a finite set ${\cal I}$. The first author [Theory of statistical inference and information. Theory and Decision Library, Series B: Mathematical and Statistical Methods, 11. Dordrecht etc.: Kluwer Academic Publishers (1989; Zbl 0711.62002)] studied $ψ$-entropies $H_ψ(p) \equiv H_ψ(X) = \sum_x p(x)\,ψ(p(x))$, where $ψ$ is a decreasing continuous function on $(0,1]$ with $ψ(1) = 0$. The power entropies $H_α(X)$ [{\it J. Havrda, F. Charvát}, Kybernetika, Praha 3, 30‒35 (1967; Zbl 0178.22401)] are obtained by taking for $ψ$ the power function $ψ_α(π) = (1 - π^{α-1})/(α-1)$, $α \in \mathbb{R}$, where $$ψ_α(0) = \lim_{π \downarrow 0} ψ_α(π)$$ for $α \ne 1$ and $ψ_1(π) = -\ln π$, i.e., $α = 1$ gives the Shannon entropy; another important subcase is the quadratic entropy $H_2(X)$. It is shown that generalized entropies of information sources, viewed as generalized informations in direct observations, lead to nonconcave entropies, in particular to infinitely many nonconcave power entropies. Relations between the entropies $H_α$, $α \ge 0$, and the errors of Bayesian decisions about $X$ are investigated; in particular, it is shown that the quadratic entropy $H_2(X)$ provides estimates which are on average more than 100% efficient.
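The power-entropy family described above is easy to evaluate numerically; the following is a minimal Python sketch (not part of the reviewed paper), assuming a finite distribution given as a list of probabilities, with $α = 1$ recovering the Shannon entropy and $α = 2$ the quadratic entropy $H_2(X) = 1 - \sum_x p(x)^2$:

```python
import math

def power_entropy(p, alpha):
    """Power (Havrda-Charvat) entropy H_alpha(p) = sum_x p(x) * psi_alpha(p(x)),
    where psi_alpha(pi) = (1 - pi**(alpha - 1)) / (alpha - 1) for alpha != 1
    and psi_1(pi) = -ln(pi), i.e. alpha = 1 gives the Shannon entropy."""
    if alpha == 1:
        return -sum(px * math.log(px) for px in p if px > 0)
    return sum(px * (1 - px ** (alpha - 1)) / (alpha - 1) for px in p if px > 0)

# Example distribution p = (1/2, 1/4, 1/4)
p = [0.5, 0.25, 0.25]
shannon = power_entropy(p, 1)    # -sum p ln p = 1.5 * ln 2
quadratic = power_entropy(p, 2)  # 1 - sum p^2 = 1 - 0.375 = 0.625
```

For $α = 2$ the sum telescopes to $1 - \sum_x p(x)^2$, the Gini-type quadratic entropy mentioned in the review.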
Reviewer: Jaak Henno (Tallinn)