Conference Papers, Year: 2024

FedDec: Peer-to-peer Aided Federated Learning

Marina Costantini, Giovanni Neglia, Thrasyvoulos Spyropoulos

Abstract

Federated learning (FL) has enabled training machine learning models that exploit the data of multiple agents without compromising privacy. However, FL is known to be vulnerable to data heterogeneity, partial device participation, and infrequent communication with the server, which are nonetheless distinctive characteristics of this framework. While much of the literature has tackled these weaknesses using different tools, only a few works have considered inter-agent communication to improve FL's performance. In this work, we present FedDec, an algorithm that interleaves peer-to-peer communication and parameter averaging between the local gradient updates of FL. We analyze the convergence of FedDec and show that inter-agent communication alleviates the negative impact of infrequent communication rounds with the server by reducing the dependence of the bound on the number of local updates H from O(H^2) to O(H). Furthermore, our analysis reveals that the improved term in the bound vanishes quickly as the network becomes more connected. We confirm these theoretical predictions in numerical simulations, where we show that FedDec converges faster than FedAvg, and that the gains grow as either H or the connectivity of the network increases.
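
To make the round structure described in the abstract concrete, the following is a minimal Python/NumPy sketch of a FedDec-style round: each agent alternates a peer-to-peer averaging (gossip) step over a mixing matrix with a local stochastic gradient step, and the server averages all parameters after H local updates. The function name feddec_round, the mixing matrix W, the step size lr, and the toy usage below are illustrative assumptions, not the paper's exact formulation.

    import numpy as np

    def feddec_round(x, grad, W, H, lr=0.1):
        # Sketch of one server round of a FedDec-style update (assumed form).
        # x: (n_agents, dim) parameters, one row per agent
        # grad(i, p): stochastic gradient of agent i's local loss at point p (user-supplied)
        # W: (n_agents, n_agents) doubly stochastic mixing matrix of the agent network
        # H: number of local gradient updates between server averaging rounds
        n = x.shape[0]
        for _ in range(H):
            x = W @ x                               # peer-to-peer parameter averaging (gossip)
            for i in range(n):
                x[i] -= lr * grad(i, x[i])          # local stochastic gradient update
        return np.tile(x.mean(axis=0), (n, 1))      # server averaging, as in FedAvg

    # Toy usage (hypothetical): 4 agents on a ring, quadratic local losses 0.5*||p - t_i||^2
    n, d = 4, 5
    targets = np.random.default_rng(0).normal(size=(n, d))
    W = 0.5 * np.eye(n) + 0.25 * (np.roll(np.eye(n), 1, axis=0) + np.roll(np.eye(n), -1, axis=0))
    x = np.zeros((n, d))
    for _ in range(30):
        x = feddec_round(x, lambda i, p: p - targets[i], W, H=5)

In this sketch, a denser (better-connected) mixing matrix W mixes the agents' parameters faster between server rounds, which is the mechanism the abstract credits for reducing the O(H^2) dependence to O(H).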
Main file: FedDec_SPAWC (1).pdf (436.23 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-04762825, version 1 (31-10-2024)

Licence

Identifiers

  • HAL Id: hal-04762825
  • DOI: 10.1109/SPAWC60668.2024.10694344

Cite

Marina Costantini, Giovanni Neglia, Thrasyvoulos Spyropoulos. FedDec: Peer-to-peer Aided Federated Learning. SPAWC 2024 - IEEE 25th International Workshop on Signal Processing Advances in Wireless Communications, Sep 2024, Lucca, Italy. pp.426-430, ⟨10.1109/SPAWC60668.2024.10694344⟩. ⟨hal-04762825⟩