Hi all,
On Monday, February 17, at 18:30 (MSK) / 16:30 (CET), Sasha Kozachinskiy will give a talk on
computable online learning.
Zoom: https://u-bordeaux-fr.zoom.us/j/88402787361?pwd=WktCdEhBT3pXN0pLUGg4Z3RuMlpsQT09

Abstract: In the online learning setting, a learner receives an input x_1 to an unknown function f from a known hypothesis class H. The learner has to guess f(x_1). Then the learner receives the real value f(x_1), and the process repeats for x_2, x_3, and so on. The minimal number of mistakes that one can achieve on a hypothesis class H is known to be equal to the so-called Littlestone dimension of H.
What if the learner has to be computable? Or what if the Littlestone dimension has to be ``effective''? We will establish relationships between different versions of online learning when we impose computability conditions. Joint work with Valentino Delle Rose and Tomasz Steifer.
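As a small illustration of the mistake-bound protocol described in the abstract (not part of the talk itself), here is a sketch of the classic halving algorithm on a finite hypothesis class: it predicts by majority vote over the hypotheses still consistent with the feedback, and makes at most log2(|H|) mistakes. The variable names and the toy class H below are illustrative assumptions.

```python
import itertools

def halving_learner(H, stream, target):
    """Run the online protocol with the halving algorithm.

    H: list of hypotheses, each a dict mapping inputs to {0, 1} labels.
    stream: sequence of inputs x_1, x_2, ... presented to the learner.
    target: the true function f (as a dict), revealed one value at a time.
    Returns the number of mistakes; at most log2(len(H)) for finite H.
    """
    consistent = list(H)  # hypotheses not yet ruled out
    mistakes = 0
    for x in stream:
        # Predict by majority vote among the surviving hypotheses.
        votes = sum(h[x] for h in consistent)
        guess = 1 if 2 * votes >= len(consistent) else 0
        # The true label f(x) is then revealed.
        truth = target[x]
        if guess != truth:
            mistakes += 1
        # Discard every hypothesis inconsistent with the revealed label.
        consistent = [h for h in consistent if h[x] == truth]
    return mistakes

# Toy example: H = all 8 labelings of a 3-point domain, so any target
# is learned with at most log2(8) = 3 mistakes.
domain = ["x1", "x2", "x3"]
H = [dict(zip(domain, bits)) for bits in itertools.product([0, 1], repeat=3)]
target = {"x1": 1, "x2": 0, "x3": 1}
print(halving_learner(H, domain, target))
```

Each mistake at least halves the set of consistent hypotheses, which gives the log2(|H|) bound; the Littlestone dimension refines this to the exact optimal mistake bound, including for infinite classes.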