Daily TMLR digest for Jul 25, 2022


TMLR

Jul 24, 2022, 8:00:06 PM
to tmlr-anno...@googlegroups.com

New submissions
===============


Title: Teacher’s pet: understanding and mitigating biases in distillation

Abstract: Knowledge distillation is widely used as a means of improving the performance of a relatively simple "student" model using the predictions from a complex "teacher" model. Several works have shown that distillation significantly boosts the student's *overall* performance; however, are these gains uniform across all data subgroups? In this paper, we show that distillation can *harm* performance on certain subgroups, e.g., classes with few associated samples, compared to the vanilla student trained using the one-hot labels. We trace this behaviour to errors made by the teacher distribution being transferred to and *amplified* by the student model, and formally prove that distillation can indeed harm underrepresented subgroups in certain regression settings. To mitigate this problem, we present techniques which soften the teacher influence for subgroups where it is less reliable. Experiments on several image classification benchmarks show that these modifications to distillation maintain the boost in overall accuracy, while additionally ensuring improvement in subgroup performance.

URL: https://openreview.net/forum?id=ph3AYXpwEb
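
[Illustrative sketch, not the paper's implementation] The abstract's core idea is to down-weight the teacher's influence on subgroups where its predictions are unreliable (e.g., rare classes). Below is a minimal PyTorch sketch of a distillation loss with a per-example teacher weight; the function name, the per-class weighting rule, and the constants are assumptions chosen only to illustrate the mechanism, not the paper's actual scheme.

import torch
import torch.nn.functional as F

def subgroup_aware_distillation_loss(student_logits, teacher_logits, labels,
                                     teacher_weight, temperature=2.0):
    """Per-example blend of one-hot cross-entropy and distillation to the teacher.

    teacher_weight is a per-example value in [0, 1]: 1.0 recovers standard
    distillation, 0.0 recovers vanilla one-hot training. Softening the teacher
    on unreliable subgroups means lowering this weight for those examples.
    """
    # Standard cross-entropy against the one-hot labels, kept per example.
    ce = F.cross_entropy(student_logits, labels, reduction="none")
    # Temperature-scaled KL divergence to the teacher (Hinton-style), per example.
    kl = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="none",
    ).sum(dim=1) * temperature ** 2
    # Blend the two terms with the per-example teacher weight.
    return ((1.0 - teacher_weight) * ce + teacher_weight * kl).mean()

# Toy usage: trust the teacher less on a hypothetical rare class (class 2).
torch.manual_seed(0)
student_logits = torch.randn(4, 3)
teacher_logits = torch.randn(4, 3)
labels = torch.tensor([0, 1, 2, 2])
teacher_weight = torch.where(labels == 2, torch.tensor(0.3), torch.tensor(0.9))
print(subgroup_aware_distillation_loss(student_logits, teacher_logits, labels, teacher_weight))

Setting teacher_weight from, say, per-class sample counts is one simple way to instantiate "soften the teacher influence for subgroups where it is less reliable"; the paper should be consulted for the techniques it actually proposes.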

---
