23 March, Georgi Potapov

Alexander Shen

to Kolmogorov seminar on complexity
If we have some experimental data, a finite object x, and a statistical
hypothesis saying that x is the result of a random experiment with
some distribution P, then the deficiency of x with respect to P (`how unlikely
is x assuming P') can be (and usually is) defined as -log P(x) - K(x|P). If
instead of one hypothesis P we have a class of hypotheses (e.g.,
`the distribution is Bernoulli with some unknown p', or `the distribution is
Poisson with some unknown \lambda'), it is natural to choose the p or
\lambda that makes the deficiency minimal. Can we get an explicit formula
for that minimum? The talk will discuss this question for the Poisson
distribution.
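The K(x|P) term is uncomputable, but the -log P(x) term of the deficiency can be computed directly, and for a class of Poisson hypotheses it is minimized over \lambda at the empirical mean of the sample (the standard maximum-likelihood fact). A minimal sketch illustrating this, with a hypothetical sample and grid search of my own choosing:

```python
import math

def neg_log_poisson(sample, lam):
    # -log P(sample) in nats for i.i.d. Poisson(lam):
    # each term is lam - k*log(lam) + log(k!)
    return sum(lam - k * math.log(lam) + math.lgamma(k + 1) for k in sample)

sample = [3, 1, 4, 1, 5, 9, 2, 6]          # hypothetical data
grid = [l / 100 for l in range(50, 1001)]  # lambda in [0.50, 10.00]

# the minimizer of -log P(x) should sit at the sample mean
best = min(grid, key=lambda l: neg_log_poisson(sample, l))
mean = sum(sample) / len(sample)
```

This covers only the likelihood part; the talk's question is what happens to the full deficiency, where the K(x|P) term also depends on the chosen \lambda.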
