
A RANDOM GOSSIP BMUF PROCESS FOR NEURAL LANGUAGE MODELING

Abstract: 

A neural network language model (NNLM) is an essential component of industrial ASR systems. One important challenge in training an NNLM is balancing the scaling of the learning process against the handling of big data. Conventional approaches such as block momentum provide a blockwise model update filtering (BMUF) process and achieve almost linear speedup with no performance degradation for speech recognition. However, BMUF needs to compute the model average over all computing nodes (e.g., GPUs), and when the number of nodes is large, learning suffers from severe communication latency. As a consequence, BMUF is not suitable under restricted network conditions. In this paper, we present a decentralized BMUF process in which the model is split into components, each of which is updated by communicating with a few randomly chosen neighbor nodes holding the same component, followed by a BMUF-like process. We apply this method to several LSTM language modeling tasks. Experimental results show that our approach consistently outperforms conventional BMUF. In particular, we obtain lower perplexity than the single-GPU baseline on the WikiText-103 benchmark using 4 GPUs, and no performance degradation is observed when scaling to 8 and 16 GPUs.
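The two ingredients of the abstract — a random-gossip averaging round among a few neighbor nodes, followed by a BMUF-like filtered update — can be sketched as below. This is a minimal illustration with scalar per-node parameters; the function names, the neighbor-sampling scheme, and the momentum/learning-rate values are assumptions for illustration, not the authors' implementation (which gossips each model component separately across GPUs).

```python
import random

def gossip_average(node_params, num_peers=2, rng=None):
    """One random-gossip round: every node averages its parameter with a
    few randomly chosen neighbor nodes, instead of a global all-reduce.
    A sketch only; the paper's exact peer-selection scheme may differ."""
    rng = rng or random.Random(0)
    n = len(node_params)
    new_params = []
    for i in range(n):
        peers = rng.sample([j for j in range(n) if j != i], num_peers)
        group = [node_params[i]] + [node_params[j] for j in peers]
        new_params.append(sum(group) / len(group))
    return new_params

def bmuf_update(global_param, avg_param, prev_delta, eta=0.9, zeta=1.0):
    """Blockwise model update filtering (BMUF, Chen & Huo style):
    treat the aggregated change as a block-level gradient, smooth it
    with block momentum eta, and scale by block learning rate zeta."""
    g = avg_param - global_param            # block-level gradient
    delta = eta * prev_delta + zeta * g     # filtered (momentum) update
    return global_param + delta, delta
```

Each gossip round only contacts `num_peers` nodes, so the communication cost per node stays constant as the cluster grows, which is the motivation for replacing the global model average under restricted network conditions.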


Paper Details

Authors:
Yiheng Huang; Jinchuan Tian; Lei Han; Guangsen Wang; Xingchen Song; Dan Su; Dong Yu
Submitted On:
12 March 2020 - 9:03am
Short Link:
http://sigport.org/4996
Type:
Presentation Slides
Presenter's Name:
Yiheng Huang, Jinchuan Tian
Paper Code:
1097
Document Year:
2020

Document Files

gossip-BMUF


[1] Y. Huang, J. Tian, L. Han, G. Wang, X. Song, D. Su, and D. Yu, "A Random Gossip BMUF Process for Neural Language Modeling," IEEE SigPort, 2020. [Online]. Available: http://sigport.org/4996. Accessed: Jul. 08, 2020.