
Reverses of the Jensen inequality in terms of first derivative and applications. (English) Zbl 1280.26033

Summary: Two new reverses of the celebrated Jensen integral inequality for convex functions are given, with applications to means, the Hölder inequality, and \(f\)-divergence measures in information theory.
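For orientation, recall the Jensen integral inequality that such results reverse: if \((\Omega,\mathcal{A},\mu)\) is a probability space, \(g\) is \(\mu\)-integrable, and \(\Phi\) is convex on an interval containing the range of \(g\), then

\[
\Phi\Big(\int_\Omega g\,d\mu\Big) \le \int_\Omega (\Phi\circ g)\,d\mu .
\]

A reverse "in terms of the first derivative" bounds the Jensen gap from above by an expression involving \(\Phi'\). For differentiable convex \(\Phi\), a typical Grüss-type bound of this kind (a sketch for orientation only, not necessarily the paper's exact statement) reads

\[
0 \le \int_\Omega (\Phi\circ g)\,d\mu - \Phi\Big(\int_\Omega g\,d\mu\Big)
\le \int_\Omega g\,\Phi'(g)\,d\mu - \int_\Omega g\,d\mu \cdot \int_\Omega \Phi'(g)\,d\mu .
\]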

MSC:

26D15 Inequalities for sums, series and integrals
26D20 Other analytical inequalities
94A05 Communication theory
26E60 Means

References:

[1] Ali, S.M., Silvey, S.D.: A general class of coefficients of divergence of one distribution from another. J. R. Stat. Soc., Ser. B, Stat. Methodol. 28, 131–142 (1966) · Zbl 0203.19902
[2] Bhattacharyya, A.: On a measure of divergence between two statistical populations defined by their probability distributions. Bull. Calcutta Math. Soc. 35, 99–109 (1943) · Zbl 0063.00364
[3] Ben-Bassat, M.: f-entropies, probability of error, and feature selection. Inf. Control 39, 227–242 (1978) · Zbl 0394.94011 · doi:10.1016/S0019-9958(78)90587-9
[4] Burbea, I., Rao, C.R.: On the convexity of some divergence measures based on entropy function. IEEE Trans. Inf. Theory 28(3), 489–495 (1982) · Zbl 0479.94009 · doi:10.1109/TIT.1982.1056497
[5] Chen, C.H.: Statistical Pattern Recognition. Hayden Book Co., Rochelle Park (1973)
[6] Chow, C.K., Liu, C.N.: Approximating discrete probability distributions with dependence trees. IEEE Trans. Inf. Theory 14(3), 462–467 (1968) · Zbl 0165.22305 · doi:10.1109/TIT.1968.1054142
[7] Csiszár, I.: Information-type measures of difference of probability distributions and indirect observations. Studia Sci. Math. Hung. 2, 299–318 (1967) · Zbl 0157.25802
[8] Csiszár, I.: On topological properties of f-divergences. Studia Sci. Math. Hung. 2, 329–339 (1967) · Zbl 0157.25803
[9] Csiszár, I., Körner, J.: Information Theory: Coding Theorems for Discrete Memoryless Systems. Academic Press, New York (1981) · Zbl 0568.94012
[10] Dragomir, S.S.: Bounds for the deviation of a function from the chord generated by its extremities. Bull. Aust. Math. Soc. 78(2), 225–248 (2008) · Zbl 1154.26024 · doi:10.1017/S0004972708000671
[11] Dragomir, S.S.: A converse result for Jensen’s discrete inequality via Grüss’ inequality and applications in information theory. An. Univ. Oradea Fasc. Mat. 7, 178–189 (1999/2000) · Zbl 1062.26015
[12] Dragomir, S.S.: On a reverse of Jessen’s inequality for isotonic linear functionals. J. Inequal. Pure Appl. Math. 2(3), 36 (2001) · Zbl 0994.26013
[13] Dragomir, S.S.: A Grüss type inequality for isotonic linear functionals and applications. Demonstr. Math. 36(3), 551–562 (2003). Preprint RGMIA Res. Rep. Coll. 5(2002), Supplement, Art. 12. [Online http://rgmia.org/v5(E).php ] · Zbl 1036.26021
[14] Dragomir, S.S., Ionescu, N.M.: Some converse of Jensen’s inequality and applications. Rev. Anal. Numér. Théor. Approx. 23(1), 71–78 (1994) · Zbl 0836.26009
[15] Gokhale, D.V., Kullback, S.: The Information in Contingency Tables. Marcel Dekker, New York (1978) · Zbl 0405.62002
[16] Havrda, J., Charvát, F.: Quantification method of classification processes: concept of structural {\(\alpha\)}-entropy. Kybernetika 3, 30–35 (1967) · Zbl 0178.22401
[17] Hellinger, E.: Neue Begründung der Theorie quadratischer Formen von unendlichvielen Veränderlichen. J. Reine Angew. Math. 136, 210–271 (1909) · JFM 40.0393.01
[18] Jeffreys, H.: An invariant form for the prior probability in estimating problems. Proc. R. Soc. Lond. Ser. A, Math. Phys. Sci. 186, 453–461 (1946) · Zbl 0063.03050 · doi:10.1098/rspa.1946.0056
[19] Kadota, T.T., Shepp, L.A.: On the best finite set of linear observables for discriminating two Gaussian signals. IEEE Trans. Inf. Theory 13, 288–294 (1967) · Zbl 0153.48704
[20] Kailath, T.: The divergence and Bhattacharyya distance measures in signal selection. IEEE Trans. Commun. 15(1), 52–60 (1967) · doi:10.1109/TCOM.1967.1089532
[21] Kapur, J.N.: A comparative assessment of various measures of directed divergence. Adv. Manage. Stud. 3, 1–16 (1984)
[22] Kazakos, D., Cotsidas, D.: A decision theory approach to the approximation of discrete probability densities. IEEE Trans. Pattern Anal. Mach. Intell. 2(1), 61–67 (1980) · Zbl 0438.62013 · doi:10.1109/TPAMI.1980.4766971
[23] Kullback, S., Leibler, R.A.: On information and sufficiency. Ann. Math. Stat. 22, 79–86 (1951) · Zbl 0042.38403 · doi:10.1214/aoms/1177729694
[24] Lin, J.: Divergence measures based on the Shannon entropy. IEEE Trans. Inf. Theory 37(1), 145–151 (1991) · Zbl 0712.94004 · doi:10.1109/18.61115
[25] Nei, M.: The theory of genetic distance and evolution of human races. Jpn. J. Hum. Genet. 23, 341–369 (1978) · doi:10.1007/BF01908190
[26] Niculescu, C.P.: An extension of Chebyshev’s inequality and its connection with Jensen’s inequality. J. Inequal. Appl. 6(4), 451–462 (2001) · Zbl 1002.26016
[27] Pielou, E.C.: Ecological Diversity. Wiley, New York (1975)
[28] Rao, C.R.: Diversity and dissimilarity coefficients: a unified approach. Theor. Popul. Biol. 21(1), 24–43 (1982) · Zbl 0516.92021 · doi:10.1016/0040-5809(82)90004-1
[29] Rényi, A.: On measures of entropy and information. In: Proc. Fourth Berkeley Symp. on Math. Statist. and Prob. (Univ. of Calif. Press), vol. 1, pp. 547–561 (1961)
[30] Roberts, A.W., Varberg, D.E.: Convex Functions. Academic Press, New York (1973) · Zbl 0271.26009
[31] Sen, A.: On Economic Inequality. Oxford University Press, London (1973) · Zbl 0289.20039
[32] Sharma, B.D., Mittal, D.P.: New non-additive measures of relative information. J. Comb. Inf. Syst. Sci. 2(4), 122–132 (1977) · Zbl 0439.94006
[33] Shioya, H., Da-Te, T.: A generalisation of Lin divergence and the derivative of a new information divergence. Electron. Commun. Jpn. 78(7), 37–40 (1995) · doi:10.1002/ecjc.4430780704
[34] Taneja, I.J.: Generalised Information Measures and Their Applications (2001). Universidade Federal de Santa Catarina. http://www.mtm.ufsc.br/~taneja/bhtml/bhtml.html
[35] Topsøe, F.: Some inequalities for information divergence and related measures of discrimination. IEEE Trans. Inf. Theory 46(4), 1602–1609 (2000) · Zbl 1003.94010 · doi:10.1109/18.850703
[36] Theil, H.: Economics and Information Theory. North-Holland, Amsterdam (1967)
[37] Theil, H.: Statistical Decomposition Analysis. North-Holland, Amsterdam (1972) · Zbl 0263.62066
[38] Vajda, I.: Theory of Statistical Inference and Information. Kluwer Academic, Dordrecht (1989) · Zbl 0711.62002