ICML 2023 Workshop on Localized Learning Call for Papers


Xiaoqian Wang

May 2, 2023, 3:24:49 PM
to Women in Machine Learning
The following is sent on behalf of Dr. David Inouye:

Hi all,

I am co-organizing the ICML 2023 workshop on localized learning, broadly defined as any training method that updates model parts through non-global objectives.

We invite paper submissions on any topic related to localized learning to our ICML 2023 Workshop on Localized Learning in Honolulu, Hawaii on Saturday, July 29, 2023! We have an amazing lineup of speakers: Geoffrey Hinton, Irina Rish, Edouard Oyallon, Claudia Clopath, Timoleon Moraitis, Qu Yang, and Stephen Guo!

TL;DR
Paper deadline: Wednesday, May 24, 2023 (one week after the NeurIPS deadline)
Format: 4-page papers in ICML format (or already-accepted papers in their original format)
Awards: Free workshop registration for the best contributed papers!
Website for details: https://localized-learning.github.io 

Overview
Despite being widely used, global end-to-end learning has several key limitations. It requires centralized computation, making it feasible only on a single device or a carefully synchronized cluster. This restricts its use on unreliable or resource-constrained devices, such as commodity hardware clusters or edge computing networks. As model sizes grow, the synchronization that end-to-end training requires across devices increasingly becomes a bottleneck for all forms of parallelism.

Global learning also requires a large memory footprint, which is costly and limits the learning capability of single devices. Moreover, end-to-end learning updates have high latency, which may prevent their use in real-time applications such as learning on streaming video.

Finally, global backpropagation is thought to be biologically implausible, as biological synapses update in a local and asynchronous manner. To overcome these limitations, this workshop will delve into the fundamentals of localized learning, which is broadly defined as any training method that updates model parts through non-global objectives.
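To make the definition above concrete, here is a minimal NumPy sketch of one such method, greedy layer-wise training with local probes. This is an illustration only, not any speaker's or organizer's method; the toy data, layer widths, and function names are all hypothetical. Each layer is trained against its own logistic-probe loss, and no gradient ever crosses a layer boundary:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-class data in 8-D (hypothetical, for illustration only).
X = np.vstack([rng.normal(-1.0, 1.0, (100, 8)),
               rng.normal(1.0, 1.0, (100, 8))])
y = np.concatenate([np.zeros(100), np.ones(100)])

def train_layer_locally(H_in, y, width, lr=0.1, steps=300):
    """Train one layer against a *local* objective: a logistic probe on
    the layer's own output. No gradient flows to earlier layers."""
    n, d = H_in.shape
    W = rng.normal(0.0, 0.1, (d, width))    # layer weights
    v = rng.normal(0.0, 0.1, width)         # local probe weights
    for _ in range(steps):
        H = np.tanh(H_in @ W)               # layer output
        p = 1.0 / (1.0 + np.exp(-(H @ v)))  # probe prediction
        err = p - y                         # dLoss/dlogit for logistic loss
        grad_v = (H.T @ err) / n
        grad_pre = np.outer(err, v) * (1.0 - H**2)  # back through tanh only
        grad_W = (H_in.T @ grad_pre) / n
        v -= lr * grad_v
        W -= lr * grad_W
    H = np.tanh(H_in @ W)
    acc = np.mean((1.0 / (1.0 + np.exp(-(H @ v))) > 0.5) == y)
    return H, acc  # frozen features for the next layer + local probe accuracy

# Greedy stack: each layer sees only its own local loss, never a global one.
H1, acc1 = train_layer_locally(X, y, width=16)
H2, acc2 = train_layer_locally(H1, y, width=8)
```

Because each call optimizes an independent objective on frozen inputs, the layers could in principle be trained on different devices or asynchronously, which is exactly the kind of property the workshop topics below explore.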

Topics
Relevant topics include but are not limited to:
  • Forward-forward learning
  • Greedy training
  • Decoupled or early-exit training
  • Iterative layer-wise learning
  • Asynchronous model update methods
  • Biologically plausible methods for local learning
  • Localized learning on edge devices
  • Self-learning or data-dependent functions
  • New applications of localized learning
We hope to see you in Hawaii!

Workshop Organizers
David I. Inouye, Mengye Ren, Mateusz Malinowski, Michael Eickenberg, Gao Huang, and Eugene Belilovsky

localized-learning-workshop-flyer.pdf