TCS+: New season starting, with Adam Klivans (UT Austin) on Wednesday, September 25!


Clement Canonne

Sep 12, 2024, 8:31:58 PM
to tcsplus_...@googlegroups.com
Dear TCS+ followers,

Our next talk, and first of the season, will take place this coming Wednesday, September 25th at 1:00 PM Eastern Time (10:00 AM Pacific Time, 19:00 Central European Time, 17:00 UTC). Adam Klivans from UT Austin will speak about "Efficient Algorithms for Learning with Distribution Shift" (abstract below).

Please sign up on the online form at https://sites.google.com/view/tcsplus/welcome/next-tcs-talk if you wish to join the talk as an individual or a group. Registration is /not/ required to attend the interactive talk, and the link will be posted on the website the day prior to the talk; however, by registering in the form, you will receive a reminder, along with the link. (The link to the recording will also be posted on our website afterwards.)

Hoping to see you all there!

The organizers

-------------------------------
Speaker: Adam Klivans (UT Austin)
Title: Efficient Algorithms for Learning with Distribution Shift

Abstract: We revisit the fundamental problem of learning with distribution shift, where a learner is given labeled samples from a training distribution D and unlabeled samples from a test distribution D′, and is asked to output a classifier with low test error. The standard approach in this setting is to prove a generalization bound in terms of some notion of distance between D and D′. These distances, however, are difficult to compute, and this has been the main stumbling block for algorithm design.

We sidestep this issue and define a new model called TDS learning, where a learner runs a test on the training set and is allowed to reject if this test detects distribution shift relative to a fixed output classifier. Moreover, when the test accepts, the output classifier is guaranteed to have low test error. We will describe how this approach leads to a rich set of efficient algorithms for learning well-studied function classes without making any assumptions on the test distribution. Our techniques touch on a wide array of topics including pseudorandomness, property testing, and sum-of-squares proofs.
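As a rough illustration of the learn/test/accept-or-reject workflow in the TDS model (this is only a sketch; the least-squares learner and the first-moment shift test below are hypothetical stand-ins, not the algorithms from the talk):

```python
import numpy as np

def tds_learn(train_X, train_y, test_X, tol=0.1):
    """Sketch of a TDS-style learner: fit a classifier on labeled training
    data, run a shift test against unlabeled test data, and either reject
    (return None) or output the classifier."""
    # Step 1: learn a candidate classifier from the labeled training set
    # (a least-squares linear classifier, purely for illustration).
    w, *_ = np.linalg.lstsq(train_X, train_y, rcond=None)

    # Step 2: test for distribution shift by comparing low-degree moments
    # of the training samples and the unlabeled test samples.
    mean_gap = np.abs(train_X.mean(axis=0) - test_X.mean(axis=0)).max()
    if mean_gap > tol:
        return None  # reject: shift detected relative to the training data

    # Step 3: on acceptance, output the classifier; in the TDS model the
    # accepted classifier comes with a low-test-error guarantee.
    return lambda x: np.sign(x @ w)
```

The point of the model is in step 2: the learner never needs to compute a distance between D and D′ in full; it only needs a test that is sound enough that acceptance certifies low test error for the specific classifier it outputs.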

Joint work with Konstantinos Stavropoulos and Arsen Vasilyan.
