In classical decision theory it is assumed that a decision-maker can assign precise numerical values representing the true value of each consequence, as well as precise numerical probabilities for their occurrence. In real-life problems, where uncertainty in the input data prevails, some representation of imprecise information is therefore needed. Second-order distributions, i.e. probability distributions over probabilities, are one way to achieve such a representation. However, statements in a multi-dimensional space are hard to grasp intuitively, so user statements must be provided more locally. But the information-theoretic interplay between joint and marginal distributions may then give rise to unwanted effects at the global level. We consider this problem in a setting of second-order probability distributions and find a family of distributions that, normalised over the probability simplex, equal their own product of marginals. For such distributions there is no flow of information between the joint distribution and the marginal distributions other than the trivial fact that the variables belong to the probability simplex.
Keywords. Second-order probability distribution, Dirichlet distribution, Beta distribution, Kullback-Leibler divergence, relative entropy, product of marginal distributions.
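The property described in the abstract can be written out roughly as follows; the notation below is our own sketch of the setting, not necessarily that of the paper:

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% A second-order distribution is a density f on the probability simplex.
Let $\Delta_{n-1} = \{(x_1,\dots,x_n) : x_i \ge 0,\ \textstyle\sum_{i=1}^{n} x_i = 1\}$
and let $f$ be a second-order distribution on $\Delta_{n-1}$ with marginal densities
\[
  f_i(x_i) = \int f(x_1,\dots,x_n)\,\prod_{j \ne i} \mathrm{d}x_j ,
\]
where the integral runs over the part of the simplex with the $i$-th coordinate fixed.
The family in question has the property that the joint density equals its own product
of marginals after normalisation over the simplex,
\[
  f(x_1,\dots,x_n) = \frac{1}{K} \prod_{i=1}^{n} f_i(x_i)
  \quad \text{on } \Delta_{n-1},
  \qquad
  K = \int_{\Delta_{n-1}} \prod_{i=1}^{n} f_i(x_i)\, \mathrm{d}x ,
\]
so that the Kullback--Leibler divergence (relative entropy) between the joint
density and the normalised product of marginals vanishes,
\[
  D\!\left( f \,\middle\|\, \tfrac{1}{K}\textstyle\prod_i f_i \right)
  = \int_{\Delta_{n-1}} f(x) \log
    \frac{f(x)}{\tfrac{1}{K}\prod_i f_i(x_i)} \,\mathrm{d}x = 0 .
\]
% The normalising constant K accounts for the simplex constraint itself,
% which is the only "information" shared between joint and marginals.
In the Dirichlet case the marginals $f_i$ are Beta distributions.
\end{document}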
Paper Download
The paper is available in the following formats:
Plenary talk: press here to download the presentation.
Poster: press here to download the poster.
Authors' addresses:
David Sundgren
Kungsbäcksvägen 47
SE-801 76 Gävle
Sweden
Love Ekenberg
Dept. of Computer and Systems Sciences
Stockholm University
Forum 100
SE-164 40 Kista
Sweden
Mats Danielson
Electrum 230
SE-164 40 Kista
Sweden
E-mail addresses:
David Sundgren: dsn@hig.se
Love Ekenberg: lovek@dsv.su.se
Mats Danielson: mad@dsv.su.se