
FAST DECENTRALIZED LEARNING VIA HYBRID CONSENSUS ADMM

Citation Author(s):
Meng Ma, Athanasios N. Nikolakopoulos, Georgios B. Giannakis
Submitted by:
Meng Ma
Last updated:
13 April 2018 - 4:18pm
Document Type:
Poster
Document Year:
2018
Event:
Presenters:
Meng Ma
Paper Code:
3414

The present work introduces the hybrid consensus alternating direction method of multipliers (H-CADMM), a novel framework for optimization over networks that unifies existing distributed optimization approaches, including the centralized consensus ADMM (C-CADMM) and the decentralized consensus ADMM (D-CADMM). H-CADMM provides a flexible tool that leverages the underlying graph topology to strike a desirable balance between node-to-node communication overhead and convergence rate, thereby alleviating known limitations of both C-CADMM and D-CADMM. A rigorous analysis of the method establishes a linear convergence rate and guides the choice of parameters that optimize this rate. The hybrid update rules of H-CADMM also lend themselves to "in-network acceleration," which is shown to yield a considerable, and essentially "free-of-charge," performance boost over the fully decentralized ADMM. Comprehensive numerical tests validate the analysis and showcase the potential of the method in efficiently tackling widely useful learning tasks.
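To make the decentralized baseline concrete, the sketch below implements a generic fully decentralized consensus ADMM (the D-CADMM setting referenced above) on a toy consensus-averaging problem, where each node i holds a scalar a_i and the network agrees on the average. This is not the paper's H-CADMM update; it is a minimal illustration under standard assumptions (quadratic local costs, symmetric adjacency matrix, zero dual initialization), and the function name, penalty parameter c, and iteration count are illustrative choices, not quantities from the paper.

```python
import numpy as np

def decentralized_consensus_admm(adjacency, a, c=1.0, num_iters=200):
    """Fully decentralized consensus ADMM (D-CADMM) sketch for the toy problem
    minimize sum_i 0.5 * (x_i - a_i)^2  subject to  x_i = x_j on every edge,
    whose solution is the network-wide average of the a_i."""
    n = len(a)
    deg = adjacency.sum(axis=1)        # node degrees d_i
    x = np.zeros(n)                    # local primal estimates, one per node
    alpha = np.zeros(n)                # per-node aggregate of edge multipliers
    errors = []
    for _ in range(num_iters):
        neigh_sum = adjacency @ x      # each node sums its neighbors' estimates
        # Local primal update (closed form because the local cost is quadratic):
        # grad f_i(x_i) + alpha_i + c*d_i*x_i - (c/2)*(d_i*x_i_old + neigh_sum_i) = 0
        x_new = (a - alpha + 0.5 * c * (deg * x + neigh_sum)) / (1.0 + c * deg)
        # Dual update driven only by local disagreement with neighbors
        alpha = alpha + 0.5 * c * (deg * x_new - adjacency @ x_new)
        x = x_new
        errors.append(np.max(np.abs(x - a.mean())))  # distance to the consensus value
    return x, errors

if __name__ == "__main__":
    # Ring network with 6 nodes as a small example topology
    n = 6
    A = np.zeros((n, n))
    for i in range(n):
        A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
    a = np.random.default_rng(0).normal(size=n)
    x, errors = decentralized_consensus_admm(A, a)
    print("consensus estimates:", np.round(x, 4))
    print("true average:       ", round(a.mean(), 4))
```

In this purely decentralized scheme every update uses only one-hop exchanges, which is exactly the regime whose convergence speed H-CADMM aims to improve by selectively introducing local aggregation, trading a modest amount of extra coordination for a faster (still linear) rate.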
