Conference Papers, Year: 2020

Dual-Free Stochastic Decentralized Optimization with Variance Reduction

Abstract

We consider the problem of training machine learning models on distributed data in a decentralized way. For finite-sum problems, fast single-machine algorithms for large datasets rely on stochastic updates combined with variance reduction. Yet, existing decentralized stochastic algorithms either do not obtain the full speedup allowed by stochastic updates, or require oracles that are more expensive than regular gradients. In this work, we introduce a Decentralized stochastic algorithm with Variance Reduction called DVR. DVR only requires computing stochastic gradients of the local functions, and is computationally as fast as a standard stochastic variance-reduced algorithm run on a 1/n fraction of the dataset, where n is the number of nodes. To derive DVR, we use Bregman coordinate descent on a well-chosen dual problem, and obtain a dual-free algorithm using a specific Bregman divergence. We give an accelerated version of DVR based on the Catalyst framework, and illustrate its effectiveness with simulations on real data.
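For context, a minimal sketch of the finite-sum decentralized objective the abstract alludes to (the notation below is an assumption following standard formulations of this setting, not taken from the page itself): each of the n nodes holds m local components and the network jointly solves

    \min_{x \in \mathbb{R}^d} \; \frac{1}{n} \sum_{i=1}^{n} f_i(x), \qquad f_i(x) = \frac{1}{m} \sum_{j=1}^{m} f_{ij}(x),

where node i can only evaluate stochastic gradients \nabla f_{ij}(x) of its own components and exchange messages with its neighbors in the communication graph. "Full speedup" then means matching the iteration complexity of a single-machine variance-reduced method (such as SVRG or SAGA) run on the 1/n fraction of the data held locally.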
Main file
DVR_camera_ready_supp.pdf (898.38 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-02974237, version 1 (21-10-2020)

Identifiers

  • HAL Id: hal-02974237, version 1

Cite

Hadrien Hendrikx, Francis Bach, Laurent Massoulié. Dual-Free Stochastic Decentralized Optimization with Variance Reduction. NeurIPS 2020 - 34th Conference on Neural Information Processing Systems, Dec 2020, Vancouver / Virtual, Canada. ⟨hal-02974237⟩
