[ContinualAI Seminars] “Always Be Dreaming: A New Approach for Data-Free Class-Incremental Learning”


Vincenzo Lomonaco

Oct 5, 2021, 1:00:04 PM
to Continual Learning & AI News
Hi All,

This Thursday, 7 October 2021, at 5:30pm CEST, for the ContinualAI Seminars, James Smith will present the paper "Always Be Dreaming: A New Approach for Data-Free Class-Incremental Learning":

Abstract: Modern computer vision applications suffer from catastrophic forgetting when incrementally learning new concepts over time. The most successful approaches to alleviate this forgetting require extensive replay of previously seen data, which is problematic when memory constraints or data legality concerns exist. In this work, we consider the high-impact problem of Data-Free Class-Incremental Learning (DFCIL), where an incremental learning agent must learn new concepts over time without storing generators or training data from past tasks. One approach for DFCIL is to replay synthetic images produced by inverting a frozen copy of the learner's classification model, but we show this approach fails for common class-incremental benchmarks when using standard distillation strategies. We diagnose the cause of this failure and propose a novel incremental distillation strategy for DFCIL, contributing a modified cross-entropy training and importance-weighted feature distillation, and show that our method results in up to a 25.1% increase in final task accuracy (absolute difference) compared to SOTA DFCIL methods for common class-incremental benchmarks. Our method even outperforms several standard replay-based methods that store a coreset of images.
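For readers curious about the "importance-weighted feature distillation" idea mentioned in the abstract, here is a minimal, hedged sketch (not the authors' code): it penalizes drift between the current model's features and those of a frozen copy of the previous-task model, weighting each feature dimension by an assumed per-dimension importance vector. The function name and the exact weighting form are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def feature_distillation_loss(feat_new, feat_old, importance):
    """Importance-weighted L2 distance between current and frozen features.

    feat_new, feat_old: (batch, dim) feature activations from the current
        learner and a frozen copy of the previous-task model.
    importance: (dim,) non-negative per-dimension weights (assumed form).
    """
    diff = feat_new - feat_old
    # Weight squared drift per dimension, then average over batch and dims.
    return float(np.mean(importance * diff ** 2))

# Toy usage: small feature drift yields a small positive penalty.
rng = np.random.default_rng(0)
f_old = rng.standard_normal((4, 8))
f_new = f_old + 0.1 * rng.standard_normal((4, 8))
w = np.ones(8)
loss = feature_distillation_loss(f_new, f_old, w)
```

In the actual method the frozen features come from the model trained on earlier tasks, so this term discourages forgetting without storing any past images.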

The event will be moderated by: Andrea Cossu

- Microsoft Teams: click here to join 

Feel free to share this email with anyone interested and invite them to subscribe to this mailing list here: https://groups.google.com/g/continualai

Please also contact me if you want to speak at one of the next sessions!

Looking forward to seeing you all there!

All the best,
Vincenzo Lomonaco
ContinualAI President