This is a reminder that the next TCS+ talk is taking place this week, Wednesday, September 25th at 1:00 PM Eastern Time (10:00 AM Pacific Time, 19:00 Central European Time, 17:00 UTC). If you’d like to join the Zoom talk, please sign up using the form at https://sites.google.com/view/tcsplus/welcome/next-tcs-talk. The talk will also be recorded and posted shortly afterwards on our YouTube channel, here: http://www.youtube.com/user/TCSplusSeminars.
Hoping to see you all there!
The organizers
-------------------------------
Speaker: Adam Klivans (UT Austin)
Title: Efficient Algorithms for Learning with Distribution Shift
Abstract: We revisit the fundamental problem of learning with distribution shift, where a learner is given labeled samples from a training distribution D, unlabeled samples from a test distribution D′, and is asked to output a classifier with low test error. The standard approach in this setting is to prove a generalization bound in terms of some notion of distance between D and D′. These distances, however, are difficult to compute, and this has been the main stumbling block for algorithm design.
We sidestep this issue and define a new model called TDS learning, where a learner runs a test on the training set and is allowed to reject if this test detects distribution shift relative to a fixed output classifier. Moreover, when the test accepts, the output classifier is guaranteed to have low test error. We will describe how this approach leads to a rich set of efficient algorithms for learning well-studied function classes without making any assumptions on the test distribution. Our techniques touch on a wide array of topics including pseudorandomness, property testing, and sum-of-squares proofs.
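To make the accept/reject contract of the abstract concrete, here is a toy sketch of what a TDS learner's interface might look like. This is purely illustrative and is not the algorithm from the talk: the function name `tds_learn`, the one-dimensional data, and the crude first-moment shift test are all hypothetical simplifications (the actual results involve matching higher moments and sum-of-squares certificates).

```python
# Toy sketch of the TDS learning interface: the learner may output
# ("reject", None) if its test detects distribution shift, and when it
# accepts, it returns a classifier. All names and the moment-matching
# test below are hypothetical simplifications, not the talk's algorithms.
import statistics

def tds_learn(train_xy, test_x, tol=0.5):
    """Return ("accept", classifier) or ("reject", None).

    train_xy: list of (x, label) pairs drawn from the training distribution D.
    test_x:   unlabeled points drawn from the test distribution D'.
    """
    train_x = [x for x, _ in train_xy]
    # Crude shift test: compare sample means of the two unlabeled marginals.
    if abs(statistics.fmean(train_x) - statistics.fmean(test_x)) > tol:
        return "reject", None
    # Trivial learner: threshold at the midpoint between the class means.
    pos = [x for x, y in train_xy if y == 1]
    neg = [x for x, y in train_xy if y == 0]
    theta = (statistics.fmean(pos) + statistics.fmean(neg)) / 2
    return "accept", (lambda x: 1 if x >= theta else 0)

# Matched marginals: the test accepts and a classifier is returned.
train = [(-1.0, 0), (-0.8, 0), (0.9, 1), (1.1, 1)]
verdict, clf = tds_learn(train, [-0.9, 1.0])      # verdict == "accept"
# Shifted test marginal: the learner is allowed to reject.
verdict2, _ = tds_learn(train, [5.0, 6.0])        # verdict2 == "reject"
```

The key point the sketch captures is that rejection is an allowed outcome, which is what lets the guarantee on the accepted case hold without assumptions on the test distribution.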
Joint work with Konstantinos Stavropoulos and Arsen Vasilyan.