
Federated Algorithm With Bayesian Approach: Omni-Fedge

Citation Author(s):
Sai Anuroop Kesanapalli, B N Bharath
Submitted by:
Sai Anuroop Kesanapalli
Last updated:
22 June 2021 - 12:02am
Document Type:
Poster
Document Year:
2021
Event:
Presenters:
Sai Anuroop Kesanapalli, B N Bharath
Paper Code:
MLSP-12.5
 

In this paper, we consider the problem of Federated Learning (FL) under a non-i.i.d. data setting. We provide an improved estimate of the empirical loss at each node by using a weighted average of losses across nodes with a penalty term. These uneven weights are assigned through a novel Bayesian approach in which learning at each device/node is cast as maximizing the likelihood of a joint distribution over the losses of all nodes, obtained by evaluating a given node's neural network on data across devices. We then provide a PAC learning guarantee on the objective function, which reveals that the true average risk is no more than the proposed objective plus an error term. We leverage this guarantee to propose an algorithm called Omni-Fedge. Using the MNIST and Fashion-MNIST datasets, we show that the proposed algorithm performs significantly better than existing algorithms.
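The sketch below is only meant to illustrate the kind of per-node objective the abstract describes: a weighted average of empirical losses across nodes plus a penalty term. The function name, the specific weight and penalty values, and the coefficient lam are illustrative assumptions, not the paper's actual formulation or the Omni-Fedge algorithm itself.

import numpy as np

def node_objective(losses, weights, penalty, lam=0.1):
    """Weighted average of per-node empirical losses with a penalty term.

    losses  : empirical losses of each node, evaluated with the given node's model
    weights : uneven (Bayesian-style) weights assigned to the nodes (assumed)
    penalty : scalar penalty term (assumed form)
    lam     : penalty coefficient, an assumed hyperparameter
    """
    weights = weights / weights.sum()              # normalize the weights
    return float(np.dot(weights, losses) + lam * penalty)

# Example with 5 nodes, where the node weights its own loss most heavily.
losses = np.array([0.42, 0.55, 0.61, 0.48, 0.70])
weights = np.array([0.5, 0.15, 0.1, 0.15, 0.1])
print(node_objective(losses, weights, penalty=0.8))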
