Alexander Shen
to Kolmogorov seminar on complexity
If we have some experimental data, a finite object x, and a statistical
hypothesis saying that x is the result of a random experiment with some
distribution P, then the deficiency of x with respect to P ("how unlikely
is x, assuming P") can be (and usually is) defined as -log P(x) - K(x|P).
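In the notation common in algorithmic statistics (the name d(x|P) is my
addition, not part of the message), the definition reads

  d(x \mid P) = -\log P(x) - K(x \mid P),

where -\log P(x) measures how improbable x is under P and K(x \mid P) is
the conditional Kolmogorov complexity of x given P; a small deficiency
means that x looks typical for P.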
If instead of one hypothesis P we have a class of hypotheses (e.g., "the
distribution is Bernoulli with some unknown p" or "the distribution is
Poisson with some unknown \lambda"), it is natural to choose the p or
\lambda that makes the deficiency minimal. Can we get an explicit formula
for that minimum? The talk will discuss this question for the Poisson
distribution.
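A back-of-the-envelope computation (mine, not from the message) shows what
the likelihood part of this minimization gives for the Poisson family
P_\lambda(x) = e^{-\lambda}\lambda^x/x!:

  -\log P_\lambda(x) = \lambda \log e - x \log \lambda + \log x!,

which, as a function of \lambda, is minimized at \lambda = x; by Stirling's
formula the minimum value is

  \min_\lambda (-\log P_\lambda(x)) = \tfrac{1}{2}\log(2\pi x) + o(1).

The full deficiency also subtracts K(x|\lambda), which itself depends on
\lambda, so the combined minimum need not be attained at \lambda = x; an
explicit formula for it is exactly the question of the talk.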