The same Zoom link (Chile) as a week ago:
16.30 Paris time, 18.30 Moscow time (no time change yet in Europe!)
If we have some experimental data, a finite object x, and a statistical hypothesis saying that x is the result of a random experiment with some distribution P, then the deficiency of x with respect to P (`how unlikely is x assuming P') can be (and usually is) defined as -log P(x) - K(x|P). If instead of one hypothesis P we have a class of hypotheses (e.g., `the distribution is Bernoulli with some unknown p', or `the distribution is Poisson with some unknown \lambda'), it is natural to choose the p or \lambda that makes the deficiency minimal. Can we get an explicit formula for that minimum? The talk will discuss this question for the Poisson distribution.
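The K(x|P) term is uncomputable, but the -log P(x) part of the deficiency is easy to compute, and for the Poisson family its minimum over \lambda can be located explicitly. A minimal sketch (the function name and the sample value x = 100 are my illustration, not from the talk): for a single observation x, -log2 P(x) is minimized at the maximum-likelihood value \lambda = x, and by Stirling's formula the minimum is approximately (1/2) log2(2 pi x).

```python
import math

def neg_log2_poisson(x: int, lam: float) -> float:
    """-log2 of the Poisson probability P(x) = e^{-lam} * lam^x / x!."""
    log_p = -lam + x * math.log(lam) - math.lgamma(x + 1)  # natural log of P(x)
    return -log_p / math.log(2)                            # convert to bits

x = 100
at_mle = neg_log2_poisson(x, x)                 # minimum over lambda is at lambda = x
stirling = 0.5 * math.log2(2 * math.pi * x)     # Stirling approximation of the minimum
```

Nearby values of \lambda give a strictly larger -log2 P(x), confirming that the minimum sits at \lambda = x.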
Some additional context: consider finitely specified (say, finite rational-valued) distributions on N; one can consider (quite naturally from the statistical viewpoint) "expectation-bounded tests" t(x,P); here x is a natural number and P is the distribution, and the requirement is that the expected value of t(x,P), for every fixed P and P-distributed x, is at most one. There is a maximal lower semicomputable function with this property, and this universal test is 2^{-K(x|P)}/P(x) (see the logarithmic version above). Then we can take the minimum over all measures from some class and get the so-called "class test" - the case of Bernoulli distributions was considered by Vovk, and in the talk we will (hopefully) discuss the similar question for the family of Poisson distributions. [Updated after a conversation with G.P.]
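The "minimum over all measures from some class" can be illustrated on the computable part of the test (the K(x|P) term again omitted; all names below are mine, not from the talk) in Vovk's Bernoulli case: for an n-bit string with k ones, minimizing -log2 P(x) over the parameter p lands at the maximum-likelihood value p = k/n, and the minimum equals n * H(k/n), where H is the binary entropy.

```python
import math

def neg_log2_bernoulli(bits, p):
    """-log2 of the Bernoulli(p) probability of a given bit string."""
    k, n = sum(bits), len(bits)
    return -(k * math.log2(p) + (n - k) * math.log2(1 - p))

def binary_entropy(q):
    """Binary entropy H(q) in bits."""
    return -(q * math.log2(q) + (1 - q) * math.log2(1 - q))

bits = [1, 1, 0, 1, 0, 0, 0, 0]                 # k = 3 ones out of n = 8
grid_min = min(neg_log2_bernoulli(bits, p / 1000)
               for p in range(1, 1000))          # brute-force minimum over a p-grid
closed_form = len(bits) * binary_entropy(sum(bits) / len(bits))  # n * H(k/n)
```

The grid minimum coincides with the closed form n * H(k/n) (the grid contains the MLE p = 3/8 exactly), which is the -log P part of the Bernoulli class test.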