Upcoming Events


Wed Sep 28

No seminar, SIAM MDS22

Wed Oct 5
12 noon ET

On the Resolution of a Theoretical Question Related to the Nature of Local Training in Federated Learning

We study distributed optimization methods based on the local training (LT) paradigm, which achieves improved communication efficiency by performing richer local gradient-based training on the clients before parameter averaging, and which is of key importance in federated learning. Looking back at the progress of the field over the last decade, we identify five generations of LT methods: 1) heuristic, 2) homogeneous, 3) sublinear, 4) linear, and 5) accelerated. The fifth generation, initiated by the ProxSkip method of Mishchenko et al. (2022) and its analysis, is characterized by the first theoretical confirmation that LT is a communication acceleration mechanism. In this talk, I will explain the problem, its solution, and some subsequent work generalizing, extending, and improving the ProxSkip method in various ways.

References:

1. Konstantin Mishchenko, Grigory Malinovsky, Sebastian Stich and Peter Richtárik. ProxSkip: Yes! Local gradient steps provably lead to communication acceleration! Finally! Proceedings of the 39th International Conference on Machine Learning, 2022.

2. Grigory Malinovsky, Kai Yi and Peter Richtárik. Variance reduced ProxSkip: Algorithm, theory and application to federated learning. arXiv:2207.04338, 2022.

3. Laurent Condat and Peter Richtárik. RandProx: Primal-dual optimization algorithms with randomized proximal updates. arXiv:2207.12891, 2022.

4. Abdurakhmon Sadiev, Dmitry Kovalev and Peter Richtárik. Communication acceleration of local gradient methods via an accelerated primal-dual algorithm with inexact prox. arXiv:2207.03957, 2022.
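For readers unfamiliar with the LT paradigm described in the abstract, the following is a minimal NumPy sketch of local gradient steps with probabilistic parameter averaging, in the spirit of the ProxSkip method of reference 1. The quadratic client objectives, step size gamma, and communication probability p below are illustrative assumptions, not values or code from the paper.

```python
# Sketch of local training with probabilistic communication,
# in the spirit of ProxSkip (reference 1). Illustrative only:
# the client objectives and all parameters are made up.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic quadratic objective f_i(x) = 0.5 * ||A_i x - b_i||^2 per client.
n_clients, dim = 10, 5
A = [rng.standard_normal((20, dim)) for _ in range(n_clients)]
b = [rng.standard_normal(20) for _ in range(n_clients)]

def grad(i, x):
    """Gradient of client i's local objective at x."""
    return A[i].T @ (A[i] @ x - b[i])

gamma = 1e-2  # local step size (assumed, not tuned)
p = 0.1       # probability of communicating; roughly 1/p local steps per round
T = 2000      # total iterations

x = [np.zeros(dim) for _ in range(n_clients)]  # local models
h = [np.zeros(dim) for _ in range(n_clients)]  # control variates (correct client drift)

for t in range(T):
    # Each client takes a local gradient step on its drift-corrected gradient.
    x_hat = [x[i] - gamma * (grad(i, x[i]) - h[i]) for i in range(n_clients)]

    if rng.random() < p:
        # Communication round: average the local models (the prox step for the
        # consensus constraint) and update the control variates.
        x_bar = np.mean(x_hat, axis=0)
        x = [x_bar.copy() for _ in range(n_clients)]
        h = [h[i] + (p / gamma) * (x_bar - x_hat[i]) for i in range(n_clients)]
    else:
        # Skip communication: keep training locally.
        x = x_hat

print("final consensus gap:", max(np.linalg.norm(xi - x[0]) for xi in x))
```

The point of the sketch is the control flow: clients train locally on every iteration, while averaging (the expensive communication step) happens only with probability p, which is the mechanism the talk's theoretical analysis addresses.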

Wed Oct 12
12 noon ET

TBA

TBA

Wed Oct 19
12 noon ET

TBA

TBA

TBA

Wed Oct 26
12 noon ET

TBA

TBA

TBA

Wed Nov 9
12 noon ET

TBA

TBA

TBA

Wed Nov 16
12 noon ET

TBA

TBA

TBA

Wed Nov 23
12 noon ET

No seminar, Thanksgiving

Wed Nov 30
12 noon ET

TBA

TBA

TBA

Wed Dec 7
12 noon ET

TBA

TBA

TBA

Wed Dec 14
12 noon ET

TBA

TBA

Winter break