Accelerated decentralized optimization with local updates for smooth and strongly convex objectives
Abstract
In this paper, we study the problem of minimizing a sum of smooth and strongly convex functions split over the nodes of a network in a decentralized fashion. We propose the algorithm ESDACD, a decentralized accelerated algorithm that only requires local synchrony. Its rate depends on the condition number κ of the local functions as well as on the network topology and delays. Under mild assumptions on the topology of the graph, ESDACD takes a time O((τmax + ∆max) √(κ/γ) ln(1/ε)) to reach a precision ε, where γ is the spectral gap of the graph, τmax the maximum communication delay and ∆max the maximum computation time. Therefore, it matches the rate of SSDA (Scaman et al., 2017), which is optimal when τmax = Ω(∆max). Applying ESDACD to quadratic local functions leads to an accelerated randomized gossip algorithm of rate O(√(θgossip/n)), where θgossip is the rate of standard randomized gossip (Boyd et al., 2006). To the best of our knowledge, it is the first asynchronous gossip algorithm with a provably improved rate of convergence of the second moment of the error. We illustrate these results with experiments in idealized settings.
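For readers unfamiliar with the baseline the abstract compares against, below is a minimal sketch of standard randomized gossip averaging (Boyd et al., 2006) on a ring graph, together with the spectral gap of the graph Laplacian that governs its rate. This is an illustrative NumPy script under arbitrary assumptions (ring topology, fixed seed, fixed step count), not the paper's ESDACD implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ring graph on n nodes; edges are the pairs allowed to communicate.
n = 20
edges = [(i, (i + 1) % n) for i in range(n)]

# Each node holds a scalar value; the goal is consensus on the mean.
x = rng.normal(size=n)
target = x.mean()

# Spectral gap of the graph Laplacian (algebraic connectivity),
# the quantity playing the role of γ in the rates above.
L = np.zeros((n, n))
for i, j in edges:
    L[i, i] += 1.0
    L[j, j] += 1.0
    L[i, j] -= 1.0
    L[j, i] -= 1.0
gamma = np.sort(np.linalg.eigvalsh(L))[1]
print("spectral gap:", gamma)

# Standard randomized gossip: at each step a uniformly random edge
# (i, j) wakes up and both endpoints average their values. The sum is
# preserved, so the iterates converge to the global mean; the second
# moment of the error contracts at the rate θgossip.
for _ in range(5000):
    i, j = edges[rng.integers(len(edges))]
    avg = 0.5 * (x[i] + x[j])
    x[i] = x[j] = avg

print("squared error:", np.sum((x - target) ** 2))
```

The accelerated variant obtained from ESDACD keeps this purely local, pairwise communication pattern but, per the abstract, improves the rate from θgossip to O(√(θgossip/n)).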