
Characterizing the principle of minimum cross-entropy within a conditional-logical framework. (English) Zbl 0903.68181

Summary: The principle of minimum cross-entropy (ME-principle) is often used as an elegant and powerful tool for building complete probability distributions when only partial knowledge is available. It takes as inputs a prior distribution \(P\) and some new information \(R\), and yields as a result the unique distribution \(P^{*}\) that satisfies \(R\) and is closest to \(P\) in an information-theoretic sense. More generally, it provides a “best” solution to the problem “How to adjust \(P\) to \(R\)?” In this paper, we show how probabilistic conditionals allow a new and constructive approach to this important principle. Though popular and widely used for knowledge representation, conditionals quantified by probabilities are not easy to deal with. We develop four principles that describe their handling in a reasonable and consistent way, taking into account the conditional-logical as well as the numerical and probabilistic aspects. Finally, the ME-principle turns out to be the only method for adjusting a prior distribution to new conditional information that obeys all these principles. Thus a characterization of the ME-principle within a conditional-logical framework is achieved, and its implicit logical mechanisms are clearly revealed.
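For concreteness, the adjustment performed by the ME-principle can be stated as a constrained optimization problem; the following is the standard formulation, not necessarily the paper's exact notation, and the symbols \(Q\), \(\omega\) and the conditional notation \((B\mid A)[x]\) are used here only for illustration. Among all distributions \(Q\) over the possible worlds \(\omega\) that satisfy the new information \(R\), the principle selects the one minimizing the cross-entropy (Kullback-Leibler divergence) relative to the prior \(P\):

\[
P^{*} \;=\; \arg\min_{Q \,\models\, R} \;\sum_{\omega} Q(\omega)\,\log\frac{Q(\omega)}{P(\omega)}.
\]

If \(R\) contains a probabilistic conditional \((B\mid A)[x]\), then “\(Q \models R\)” requires \(Q(B\mid A) = Q(A\wedge B)/Q(A) = x\); multiplying through by \(Q(A)\) shows that each quantified conditional contributes one linear constraint on \(Q\), so \(P^{*}\) is the information-theoretic projection of \(P\) onto the set of distributions meeting these constraints.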

MSC:

68T30 Knowledge representation
68T35 Theory of languages and software systems (knowledge-based systems, expert systems, etc.) for artificial intelligence
