Meeting #135: [Offline at ISTC; 16:00!] ProxSkip: Local Gradient Steps Provably Lead to Communication Acceleration

Hrant Khachatrian

Apr 8, 2022, 10:33:46 AM
to Machine Learning Reading Group Yerevan
Hi everyone,

This week we will have an offline meeting where Artavazd Maranjyan from YerevaNN will present a very recent paper on distributed versions of gradient descent. It addresses an important problem: the cost of communication in distributed training. Here is a quote from the paper:

Our main motivation comes from federated learning, where evaluation of the gradient operator corresponds to taking a local GD step independently on all devices, and evaluation of prox corresponds to (expensive) communication in the form of gradient averaging. In this context, ProxSkip offers an effective acceleration of communication complexity. Unlike other local gradient-type methods, such as FedAvg, SCAFFOLD, S-Local-GD and FedLin, whose theoretical communication complexity is worse than, or at best matching, that of vanilla GD in the heterogeneous data regime, we obtain a provable and large improvement without any heterogeneity-bounding assumptions.

Link to the paper: https://arxiv.org/abs/2202.09357 
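For those who want a concrete picture before the talk, below is a minimal single-machine sketch of the ProxSkip update rule from the paper (the function and parameter names here, such as grad_f, prox_psi, gamma and p, are just illustrative placeholders). On each iteration it takes a gradient step shifted by a control variate h, and only with probability p applies the prox operator; in the federated setting that prox is the expensive communication (averaging) step, so on average only a p-fraction of iterations communicate.

    import numpy as np

    def proxskip(grad_f, prox_psi, x0, gamma, p, num_iters, rng=None):
        # Sketch of the ProxSkip iteration (Mishchenko et al., 2022, arXiv:2202.09357).
        # grad_f   : callable returning the gradient of the smooth part f at x
        # prox_psi : callable (x, step) -> prox_{step * psi}(x); in federated
        #            learning this corresponds to the (expensive) averaging step
        # gamma    : step size; p : probability of performing the prox step
        rng = np.random.default_rng() if rng is None else rng
        x = np.asarray(x0, dtype=float)
        h = np.zeros_like(x)                        # control variate correcting local drift
        for _ in range(num_iters):
            x_hat = x - gamma * (grad_f(x) - h)     # local gradient step, shifted by h
            if rng.random() < p:                    # prox is skipped with probability 1 - p
                x_new = prox_psi(x_hat - (gamma / p) * h, gamma / p)
            else:
                x_new = x_hat
            h = h + (p / gamma) * (x_new - x_hat)   # update the control variate
            x = x_new
        return x

In the distributed version each device keeps its own copies of x and h, runs the local step independently, and prox_psi reduces to averaging the devices' iterates across the network.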


Please also note that there are a few interesting events happening these days:
1. First PyData monthly event at AUA (April 8, 6.30pm) https://www.facebook.com/events/709713953678576/?ref=newsfeed
2. MLOps meetup at ServiceTitan office (April 14, 7pm) https://www.eventbrite.com/e/mlops-evn-meetup-tickets-314497860567  
3. A lecture on NLP by Dr. Alexander Panchenko from SkolTech (April 22, 7pm) https://www.meetup.com/nlp-in-armenia/events/285123862/

Also, the call for proposals is open for the PyData Yerevan conference: https://pydata.org/yerevan2022/present/

Best,
Hrant