Research Report, 2017

Fast and Differentially Private Algorithms for Decentralized Collaborative Machine Learning

Abstract

Consider a set of agents in a peer-to-peer communication network, where each agent has a personal dataset and a personal learning objective. The main question addressed in this paper is: how can agents collaborate to improve upon their locally learned model without leaking sensitive information about their data? Our first contribution is to reformulate this problem so that it can be solved by a block coordinate descent algorithm. We obtain an efficient and fully decentralized protocol working in an asynchronous fashion. Our second contribution is to make our algorithm differentially private to protect against the disclosure of any information about personal datasets. We prove convergence rates and exhibit the trade-off between utility and privacy. Our experiments show that our approach dramatically outperforms previous work in the non-private case, and that under privacy constraints we significantly improve over purely local models.
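To make the abstract's description a little more concrete, below is a minimal toy sketch of the general idea: each agent holds a personal dataset and its own model block, a weighted graph couples the agents' models, and at each step one randomly woken agent updates its own block with a gradient step and publishes a clipped, Laplace-noised copy to its neighbors. The objective (linear least squares plus a graph-smoothing term), the parameter names (mu, epsilon, clip, step), and the noise mechanism are illustrative assumptions; this is not the paper's exact protocol or its privacy analysis.

```python
# Toy sketch: decentralized block coordinate descent with noisy model exchange.
# All names and parameters here are hypothetical; see the paper for the actual
# algorithm, convergence rates, and differential-privacy guarantees.
import numpy as np

rng = np.random.default_rng(0)
n_agents, d = 5, 3

# Symmetric communication graph with zero diagonal (random weights for the demo).
W = rng.random((n_agents, n_agents))
W = (W + W.T) / 2
np.fill_diagonal(W, 0.0)

# Each agent holds a small personal linear-regression dataset.
data = []
for _ in range(n_agents):
    X = rng.normal(size=(20, d))
    w_true = rng.normal(size=d)
    y = X @ w_true + 0.1 * rng.normal(size=20)
    data.append((X, y))

theta = np.zeros((n_agents, d))   # one model block per agent
mu = 1.0                          # strength of coupling to neighbors (assumed)
step = 0.05                       # gradient step size (assumed)
epsilon = 1.0                     # per-update privacy budget (illustrative)
clip = 1.0                        # clip published blocks to bound sensitivity

def local_grad(i, theta_i):
    """Gradient of agent i's block: local squared loss + graph smoothing."""
    X, y = data[i]
    grad_loss = X.T @ (X @ theta_i - y) / len(y)
    grad_smooth = mu * np.sum(W[i][:, None] * (theta_i - theta), axis=0)
    return grad_loss + grad_smooth

for t in range(500):
    i = rng.integers(n_agents)            # asynchrony: one agent wakes up
    theta[i] -= step * local_grad(i, theta[i])
    # Publish a clipped, Laplace-noised copy of the updated block
    # (a rough stand-in for a differentially private exchange).
    published = np.clip(theta[i], -clip, clip)
    noisy = published + rng.laplace(scale=2 * clip / epsilon, size=d)
    if t % 100 == 0:
        print(f"step {t}: agent {i} published noisy block norm "
              f"{np.linalg.norm(noisy):.3f}")
```

In this sketch each coordinate-descent step only touches one agent's block and only communicates with that agent's neighbors, which is what makes the fully decentralized, asynchronous operation described in the abstract possible; the clip-then-noise step is the usual way to bound the sensitivity of what is shared, though the paper's mechanism and budget accounting may differ.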
Main file: Decentralized_Personalized_CD_Privacy.pdf (673.76 KB). Origin: files produced by the author(s).

Dates and versions

hal-01665410, version 1 (15-12-2017)

Cite

Aurélien Bellet, Rachid Guerraoui, Mahsa Taziki, Marc Tommasi. Fast and Differentially Private Algorithms for Decentralized Collaborative Machine Learning. [Research Report] INRIA Lille, 2017, pp. 1-18. ⟨hal-01665410⟩