Hi All,
This Friday, 23-04-2021, at 5:30pm CEST, for the ContinualAI Reading Group, Timothée Lesort (MILA) will present the paper:
Abstract: Classical machine learning algorithms often assume that the data are drawn i.i.d. from a stationary probability distribution. Recently, continual learning emerged as a rapidly growing area of machine learning where this assumption is relaxed, namely, where the data distribution is non-stationary, i.e., changes over time. However, data distribution drifts may interfere with the learning process and erase previously learned knowledge; thus, continual learning algorithms must include specialized mechanisms to deal with such distribution drifts. A distribution drift may change the class labels distribution, the input distribution, or both. Moreover, distribution drifts might be abrupt or gradual. In this paper, we aim to identify and categorize different types of data distribution drifts and potential assumptions about them, to better characterize various continual-learning scenarios. Moreover, we propose to use the distribution drift framework to provide more precise definitions of several terms commonly used in the continual learning field.
The event will be moderated by: Vincenzo Lomonaco
---------------
- Eventbrite event (to save it in your calendar and get reminders): Click Here
---------------
Please also contact me if you want to speak at one of the next sessions!
Looking forward to seeing you all there!
All the best,
Keiland Cooper
University of California
ContinualAI Co-founding Board Member