Csiszár, Imre
Axiomatic characterizations of information measures. (English)
Zbl 1179.94043
Entropy 10, No. 3, 261-273 (2008).

Summary: Axiomatic characterizations of Shannon entropy, Kullback I-divergence, and some generalized information measures are surveyed. Three directions are treated: (A) characterization of functions of probability distributions suitable as information measures; (B) characterization of set functions on the subsets of \(\{1,\dots,N\}\) representable by joint entropies of components of an \(N\)-dimensional random vector; (C) axiomatic characterization of MaxEnt and related inference rules. The paper concludes with a brief discussion of the relevance of the axiomatic approach to information theory.

Cited in 1 Review; cited in 60 Documents.

MSC: 94A17 Measures of information, entropy

Keywords: Shannon entropy; Kullback I-divergence; Rényi information measures; \(f\)-divergence; \(f\)-entropy; functional equation; proper score; maximum entropy; transitive inference rule; Bregman distance

Software: ITIP

Cite: \textit{I. Csiszár}, Entropy 10, No. 3, 261--273 (2008; Zbl 1179.94043)
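For reference, the two central measures named in the summary have the following standard definitions (these formulas are not part of the record itself), for probability distributions \(P=(p_1,\dots,p_n)\) and \(Q=(q_1,\dots,q_n)\):

```latex
% Shannon entropy of a probability distribution P = (p_1, ..., p_n):
H(P) = -\sum_{i=1}^{n} p_i \log p_i

% Kullback I-divergence (relative entropy) of P from Q:
D(P \| Q) = \sum_{i=1}^{n} p_i \log \frac{p_i}{q_i}
```

The axiomatic characterizations surveyed in the paper single out these functionals (and their generalizations, such as Rényi and \(f\)-divergences) by postulates like additivity and recursivity rather than by formula.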