Maximum Entropy as a Feasible Way to Describe Joint Distribution in Expert Systems

Thongchai Dumrongpokaphan, Vladik Kreinovich, Hung T. Nguyen

Abstract

In expert systems, we elicit the probabilities of different statements from the experts. However, to adequately use the expert system, we also need to know the probabilities of different propositional combinations of the experts' statements -- i.e., we need to know the corresponding joint distribution. The problem is that there are exponentially many such combinations, and it is not practically possible to elicit all their probabilities from the experts. So, we need to estimate this joint distribution based on the available information. For this purpose, many practitioners use heuristic approaches -- e.g., the t-norm approach of fuzzy logic. However, this is a particular case of a situation for which the maximum entropy approach has been invented, so why not use the maximum entropy approach? The problem is that in this case, the usual formulation of the maximum entropy approach requires maximizing a function with exponentially many unknowns -- a task which is, in general, not practically feasible. In this paper, we show that in many reasonable examples, the corresponding maximum entropy problem can be reduced to an equivalent problem with a much smaller (and feasible) number of unknowns -- a problem which is, therefore, much easier to solve.
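To illustrate the kind of reduction the abstract describes, here is a minimal sketch of the simplest special case: when the only elicited information is the marginal probability of each of n statements, the maximum entropy joint distribution over the 2^n truth-value combinations is the product (independence) distribution, so n numbers suffice instead of 2^n. This example and its function names are illustrative and not taken from the paper itself.

```python
import itertools
import math

def product_joint(marginals):
    """Max-entropy joint distribution when only the marginals
    p(A_i) are known: the product (independence) distribution.
    Described by n numbers instead of 2**n unknowns."""
    joint = {}
    for bits in itertools.product([0, 1], repeat=len(marginals)):
        p = 1.0
        for b, m in zip(bits, marginals):
            p *= m if b else (1.0 - m)
        joint[bits] = p
    return joint

def entropy(dist):
    """Shannon entropy -sum p log p (natural log)."""
    return -sum(p * math.log(p) for p in dist.values() if p > 0)

# example: three expert statements with elicited probabilities
marginals = [0.7, 0.2, 0.5]
joint = product_joint(marginals)

# sanity checks: probabilities sum to 1, marginals are reproduced
assert abs(sum(joint.values()) - 1.0) < 1e-12
for i, m in enumerate(marginals):
    assert abs(sum(p for bits, p in joint.items() if bits[i]) - m) < 1e-12

# perturb the joint while keeping all three marginals fixed:
# shift eps between four cells so every row/column sum is unchanged
eps = 0.01
perturbed = dict(joint)
perturbed[(1, 1, 0)] += eps
perturbed[(1, 0, 0)] -= eps
perturbed[(0, 1, 0)] -= eps
perturbed[(0, 0, 0)] += eps

# the product distribution has strictly higher entropy than any
# other distribution satisfying the same marginal constraints
assert entropy(perturbed) < entropy(joint)
```

With additional elicited constraints (e.g., probabilities of selected conjunctions), the maximizer is no longer a plain product, but the same idea of expressing the joint through a small number of parameters underlies the reduction discussed in the paper.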

Published

2017-10-30

How to Cite

Dumrongpokaphan, T., Kreinovich, V., & Nguyen, H. T. (2017). Maximum Entropy as a Feasible Way to Describe Joint Distribution in Expert Systems. Thai Journal of Mathematics, 35–44. Retrieved from https://thaijmath2.in.cmu.ac.th/index.php/thaijmath/article/view/643