ContinualAI Reading Group: "Does Continual Learning = Catastrophic Forgetting?"

Vincenzo Lomonaco

Feb 2, 2021, 2:10:55 PM
to Continual Learning & AI News
Dear All,

This Friday, 05-02-2021, at 5.30pm CET, for the ContinualAI Reading Group, Anh Thai (Georgia Institute of Technology) will present the paper:

Title: Does Continual Learning = Catastrophic Forgetting?

Abstract: Continual learning is known for suffering from catastrophic forgetting, a phenomenon where earlier learned concepts are forgotten at the expense of more recent samples. In this work, we challenge the assumption that continual learning is inevitably associated with catastrophic forgetting by presenting a set of tasks that surprisingly do not suffer from catastrophic forgetting when learned continually. We attempt to provide insight into the properties of these tasks that make them robust to catastrophic forgetting, and into the potential of having a proxy representation learning task for continual classification. We further introduce a novel yet simple algorithm, YASS, that outperforms state-of-the-art methods in the class-incremental categorization learning task. Finally, we present DyRT, a novel tool for tracking the dynamics of representation learning in continual models. The codebase, dataset and pre-trained models released with this article can be found at this https URL.

The event will be moderated by: Vincenzo Lomonaco.  

---------------
- Eventbrite event (to save it in your calendar and get reminders): https://www.eventbrite.com/e/does-continual-learning-catastrophic-forgetting-tickets-139703263221
- Microsoft Teams: click here to join
- YouTube recordings of the previous sessions: https://www.youtube.com/c/ContinualAI
---------------

Feel free to share this email with anyone interested and invite them to subscribe to the mailing list here: https://groups.google.com/g/continualai

You can also contact me if you want to speak at one of the next sessions!

Best regards,

Vincenzo Lomonaco
ContinualAI President