In classical decision theory it is assumed that a decision-maker can assign precise numerical values to the true worth of each consequence, as well as precise numerical probabilities for their occurrences. When addressing real-life problems, where uncertainty in the input data prevails, some representation of imprecise information is needed. Second-order distributions, i.e. probability distributions over probabilities, are one way to achieve such a representation. However, statements in a multi-dimensional space are hard to grasp intuitively, so user statements must be provided more locally. The information-theoretic interplay between joint and marginal distributions may then give rise to unwanted effects at the global level. We consider this problem in a setting of second-order probability distributions and find a family of distributions that, normalised over the probability simplex, equal their own product of marginals. For such distributions there is no flow of information between the joint distribution and the marginal distributions other than the trivial fact that the variables belong to the probability simplex.
Keywords. Second-order probability distribution, Dirichlet distribution, Beta distribution, Kullback-Leibler divergence, relative entropy, product of marginal distributions.
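As a small illustration of the setting (not of the paper's result), a Dirichlet distribution is a standard second-order distribution over the probability simplex, and its marginals are Beta distributions: the i-th marginal of Dirichlet(α) is Beta(α_i, α_0 − α_i), where α_0 is the sum of the parameters. A minimal sketch with NumPy, using an arbitrary example parameter vector:

```python
import numpy as np

rng = np.random.default_rng(0)

# Example Dirichlet parameters for a 3-dimensional probability simplex
alpha = np.array([2.0, 3.0, 5.0])
a0 = alpha.sum()

# Draw second-order samples: each row is a probability vector on the simplex
samples = rng.dirichlet(alpha, size=100_000)

# The i-th marginal of Dirichlet(alpha) is Beta(alpha_i, a0 - alpha_i),
# whose mean is alpha_i / a0; compare with the empirical marginal means
for i, a in enumerate(alpha):
    emp_mean = samples[:, i].mean()
    beta_mean = a / a0
    print(f"marginal {i}: empirical mean {emp_mean:.4f}, Beta mean {beta_mean:.4f}")
```

Each row of `samples` sums to one, which is exactly the trivial simplex constraint mentioned above; any further dependence between the coordinates beyond this constraint is what the paper's information-theoretic analysis addresses.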
The paper is available in the following formats:
Plenary talk: presentation file.
Poster: poster file.
SE-801 76 Gävle
Dept. of Computer and Systems Sciences
SE-164 40 Kista