Distributed Stochastic Approximation with Constant Communication (DisSACC)

Feng Zhu, Aritra Mitra, Robert W. Heath Jr.
IEEE Asilomar Conference on Signals, Systems, and Computers, 2025


Overview

This is a pure-theory paper. We study a general distributed stochastic approximation (SA) problem with heterogeneous agents, where the goal is to find the root of the average operator across agents.

This setting captures a wide range of applications, including:

  • Federated learning
  • Multi-agent reinforcement learning (RL)
  • Variational inequalities and fixed-point problems

Key Challenges

Existing approaches typically suffer from one or more of the following:

  • ❌ Lack of linear speedup in sample complexity
  • ❌ Bias due to heterogeneity across agents
  • ❌ High communication cost

Our Approach: DisSACC

We propose DisSACC (Distributed Stochastic Approximation with Constant Communication), which is based on a simple but effective idea:

Instead of performing many noisy local updates, we aggregate samples locally to construct a refined operator, and perform fewer, higher-quality updates.

Key Idea

  • Each agent collects multiple samples per round
  • Constructs a variance-reduced operator
  • Performs one update per round
  • Server aggregates across agents
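The round structure above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: it assumes a linear SA setting where each agent's operator is $F_i(x) = A_i x - b_i$ observed under additive Gaussian noise, and all names (`noisy_operator`, `dissacc_round`, `samples_per_round`, `step_size`) are illustrative.

```python
import numpy as np

def noisy_operator(A, b, x, rng, noise_std=0.1):
    """One stochastic evaluation of an agent's local operator A x - b.
    (Hypothetical linear model chosen only for illustration.)"""
    return A @ x - b + noise_std * rng.standard_normal(x.shape)

def dissacc_round(x, agents, samples_per_round, step_size, rng):
    """One communication round in the spirit of DisSACC:
    each agent averages several local samples at the same point
    (variance reduction), the server averages across agents,
    and a single update step is taken."""
    directions = []
    for A, b in agents:
        # Average multiple noisy evaluations to build a refined operator.
        est = np.mean(
            [noisy_operator(A, b, x, rng) for _ in range(samples_per_round)],
            axis=0,
        )
        directions.append(est)
    # Server aggregates across agents, then performs one update.
    return x - step_size * np.mean(directions, axis=0)
```

Because each agent evaluates its operator at the same point before updating, the aggregated direction targets the root of the *average* operator, which is how heterogeneity bias is avoided in this toy setting.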

Main Results

DisSACC achieves:

  • Exact convergence to the global solution
  • Linear speedup in the number of agents (M-fold variance reduction)
  • No heterogeneity-induced bias
  • Near-constant communication complexity: $\tilde{O}(1)$

Why It Matters

This work shows that it is possible to simultaneously achieve:

  • statistical efficiency (sample complexity)
  • communication efficiency
  • robustness to heterogeneity

which are typically conflicting objectives in federated and multi-agent learning systems.


Paper

You can find the paper here:

👉 PDF


Citation

If you find this work useful, please cite:

@inproceedings{zhu2025dissacc,
  title={Distributed Stochastic Approximation with Constant Communication},
  author={Zhu, Feng and Mitra, Aritra and Heath, Robert W.},
  booktitle={IEEE Asilomar Conference on Signals, Systems, and Computers},
  year={2025}
}
