Nati Srebro talk (Wednesday Jan 25)


Andreas Argyriou

Jan 19, 2012, 11:43:40 AM
to Machine Learning Seminar 2011
Dear all,

We are continuing the machine learning seminar this term. It takes
place biweekly on Wednesdays from 11 am to noon in room 526 at TTI
Chicago (5th floor). Feel free to join the seminar group at
https://groups.google.com/a/ttic.edu/group/machine-learning-seminar-2011/topics?hl=en
This Wednesday, Jan. 25, Nati Srebro (TTI Chicago) will give the
following talk.



Optimization, Learning and the Universality of Mirror Descent

I will discuss deep connections between Statistical Learning, Online
Learning and Optimization. I will show that there is a tight
correspondence between the sample size required for learning and the
number of local oracle accesses required for optimization, and that
the same measures of "complexity" (e.g., the fat-shattering dimension
or Rademacher complexity) control both. Furthermore, I will show
how the Mirror Descent method, and in particular its stochastic/online
variant, is in a strong sense "universal" for online learning,
statistical learning and optimization. That is, for a general class
of convex learning/optimization problems, Mirror Descent can always
achieve a (nearly) optimal guarantee. In the context of statistical
learning, this also implies that for a broad generic class of convex
problems, learning can be done optimally (in the worst-case
agnostic-PAC sense) with a single pass over the data.

Joint work with Karthik Sridharan and Ambuj Tewari, and mostly based
on Sridharan's PhD Thesis.
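
For those less familiar with the method, here is a minimal sketch of
stochastic mirror descent with the entropy mirror map (i.e.,
exponentiated-gradient updates) over the probability simplex, making a
single pass over the data as in the abstract. The function names, toy
objective, and step size below are illustrative assumptions, not taken
from the talk.

    import numpy as np

    def stochastic_mirror_descent(grad_fn, data, dim, eta):
        # Entropy mirror map over the simplex: updates are
        # multiplicative (exponentiated gradient), then renormalized.
        # Illustrative sketch only; names and step size are assumptions.
        w = np.full(dim, 1.0 / dim)    # start at the uniform distribution
        avg = np.zeros(dim)
        for t, z in enumerate(data, start=1):
            g = grad_fn(w, z)          # stochastic gradient at w for example z
            w = w * np.exp(-eta * g)   # mirror (multiplicative) update
            w /= w.sum()               # Bregman (KL) projection onto the simplex
            avg += (w - avg) / t       # return the running average of iterates
        return avg

    # Toy usage: minimize E[<w, z>] over the simplex for random loss vectors z.
    rng = np.random.default_rng(0)
    data = rng.uniform(size=(1000, 5))
    w_hat = stochastic_mirror_descent(lambda w, z: z, data, dim=5, eta=0.1)

With a step size on the order of sqrt(log(d)/T) for T examples in
dimension d, this scheme attains the standard O(sqrt(log(d)/T))
excess-risk rate for bounded linear losses over the simplex.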