Lazily Aggregated Quantized Gradient Innovation for Communication-Efficient Federated Learning

Jun Sun, Tianyi Chen, Georgios B. Giannakis, Qinmin Yang, Zaiyue Yang

Research output: Contribution to journal › Article › peer-review

38 Scopus citations

Abstract

This paper focuses on the communication-efficient federated learning problem and develops a novel distributed quantized gradient approach characterized by adaptive communication of quantized gradients. Specifically, federated learning builds on a server-worker infrastructure, where the workers compute local gradients and upload them to the server; the server then obtains the global gradient by aggregating all the local gradients and uses it to update the model parameters. The key idea for saving worker-to-server communication is to quantize gradients and to skip less informative quantized gradient communications by reusing previous gradients. Quantizing and skipping result in 'lazy' worker-server communication, which justifies the term Lazily Aggregated Quantized (LAQ) gradient. Theoretically, the LAQ algorithm achieves the same linear convergence rate as gradient descent in the strongly convex case, while effecting major savings in communication in terms of transmitted bits and communication rounds. Empirically, extensive experiments using real data corroborate a significant communication reduction compared with state-of-the-art gradient- and stochastic gradient-based algorithms.
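The abstract's mechanism can be illustrated with a short sketch: each worker quantizes its gradient relative to the last value it transmitted and uploads only when the quantized innovation is informative enough, so the server reuses stale quantized gradients otherwise. This is a minimal illustration, not the paper's algorithm: the 4-bit uniform quantizer, the threshold tau, and the skipping test (which here compares the innovation to the current quantized gradient rather than to recent model changes, as the paper does) are assumptions made for the sketch.

```python
# A minimal LAQ-style sketch under the assumptions stated above.
import numpy as np

def quantize(grad, ref, bits=4):
    """Uniformly quantize the innovation (grad - ref) and add it back to ref."""
    radius = np.max(np.abs(grad - ref)) + 1e-12      # dynamic range of the innovation
    step = 2.0 * radius / (2 ** bits - 1)            # uniform quantization step
    return ref + np.round((grad - ref) / step) * step

def laq_round(theta, local_grads, last_sent, server_sum, lr, tau=0.05):
    """One round; last_sent and server_sum are updated in place."""
    uploads = 0
    for m, grad in enumerate(local_grads):
        q = quantize(grad, last_sent[m])
        innovation = q - last_sent[m]
        # Upload only if the innovation is informative (assumed surrogate test).
        if np.linalg.norm(innovation) ** 2 > tau * np.linalg.norm(q) ** 2:
            server_sum += innovation                 # server patches its running sum
            last_sent[m] = q
            uploads += 1
    return theta - lr * server_sum, uploads          # gradient-descent update

# Toy run: M workers, each holding the quadratic loss 0.5 * ||theta - c_m||^2.
rng = np.random.default_rng(0)
M, d = 5, 10
centers = rng.normal(size=(M, d))
theta = np.zeros(d)
last_sent = [np.zeros(d) for _ in range(M)]
server_sum = np.zeros(d)
total_uploads = 0
for _ in range(100):
    grads = [theta - c for c in centers]             # exact local gradients
    theta, up = laq_round(theta, grads, last_sent, server_sum, lr=0.1 / M)
    total_uploads += up
print("uploads used:", total_uploads, "of", 100 * M)
print("distance to optimum:", np.linalg.norm(theta - centers.mean(axis=0)))
```

In this toy setup the server performs plain gradient descent on the sum of the local losses while many worker uploads are skipped, which mirrors the communication savings the abstract describes, though the paper's theory relies on its own skipping criterion and quantizer.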

Original language: English (US)
Pages (from-to): 2031-2044
Number of pages: 14
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Volume: 44
Issue number: 4
DOIs
State: Published - Apr 1 2022
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 1979-2012 IEEE.

Keywords

  • Federated learning
  • communication-efficient
  • gradient innovation
  • quantization

PubMed: MeSH publication types

  • Journal Article
