On the Convergence of Federated Averaging with Cyclic Client Participation
Format: Journal Article
Language: English
Published: 06-02-2023
Summary: Federated Averaging (FedAvg) and its variants are the most popular optimization algorithms in federated learning (FL). Previous convergence analyses of FedAvg assume either full client participation or partial participation in which clients are sampled uniformly. In practical cross-device FL systems, however, only the subset of clients satisfying local criteria such as battery status, network connectivity, and maximum participation frequency (to ensure privacy) is available for training at any given time. As a result, client availability follows a natural cyclic pattern. We provide, to our knowledge, the first theoretical framework for analyzing the convergence of FedAvg under cyclic client participation with several different client optimizers, including GD, SGD, and shuffled SGD. Our analysis reveals that, under suitable conditions, cyclic client participation achieves a faster asymptotic convergence rate than vanilla FedAvg with uniform client participation, providing valuable insights for the design of client sampling protocols.
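The participation pattern described above can be sketched in a few lines. The following is a minimal toy illustration, not the paper's exact algorithm or analysis setting: clients are partitioned into groups that become available cyclically, each round samples only from the currently available group, and the server averages locally updated models as in vanilla FedAvg. The quadratic client losses, hyperparameters, and function names are all illustrative assumptions.

```python
# Toy FedAvg with cyclic client participation (illustrative sketch only).
# Each client i holds the quadratic loss f_i(w) = 0.5 * (w - b_i)^2,
# whose gradient is (w - b_i); the global optimum is the mean of the b_i.
import random

def local_sgd(w, b_i, lr=0.1, steps=5):
    # Run a few local gradient steps on client i's loss.
    for _ in range(steps):
        w -= lr * (w - b_i)
    return w

def fedavg_cyclic(targets, num_groups=4, rounds=40, clients_per_round=2, seed=0):
    rng = random.Random(seed)
    # Partition client indices into K disjoint groups; group (t mod K)
    # is the only one available in round t (the cyclic pattern).
    groups = [list(range(g, len(targets), num_groups)) for g in range(num_groups)]
    w = 0.0
    for t in range(rounds):
        available = groups[t % num_groups]
        chosen = rng.sample(available, min(clients_per_round, len(available)))
        # Server step: average the locally updated models (vanilla FedAvg).
        w = sum(local_sgd(w, targets[i]) for i in chosen) / len(chosen)
    return w

targets = [float(i) for i in range(8)]  # client optima b_i, global mean 3.5
w_final = fedavg_cyclic(targets)
# w_final should land near the mean of the b_i, though cycling through
# groups leaves the iterate oscillating slightly around it within each cycle.
```

Because each group sees only part of the data, the iterate drifts toward the current group's optimum within a cycle; the paper's analysis characterizes when this cyclic structure nonetheless yields a faster asymptotic rate than uniform sampling.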
DOI: 10.48550/arxiv.2302.03109