Learning Cross-lingual Knowledge with Multilingual BLSTM for Emphasis Detection with Limited Training Data
- Submitted by:
- Yishuang Ning
- Last updated:
- 4 March 2017 - 10:26am
- Document Type:
- Poster
- Document Year:
- 2017
- Presenters:
- Yishuang Ning
- Paper Code:
- 2781
Bidirectional long short-term memory (BLSTM) recurrent neural networks (RNNs) have achieved state-of-the-art performance in many sequence processing problems owing to their capability of capturing contextual information. However, for languages with a limited amount of training data, it is still difficult to obtain a high-quality BLSTM model for emphasis detection, whose aim is to recognize the emphasized speech segments in natural speech. To address this problem, we propose a multilingual BLSTM (MTL-BLSTM) model in which the hidden layers are shared across different languages while the softmax output layer is language-dependent. The MTL-BLSTM learns cross-lingual knowledge and transfers this knowledge to both languages to improve emphasis detection performance. Experimental results demonstrate that our method outperforms the comparison methods by 2-15.6% and 2.9-15.4% in terms of relative F1-measure on the English and Mandarin corpora, respectively.
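
The abstract describes the architecture only at a high level; below is a minimal sketch of the shared-hidden-layer/language-dependent-head idea in PyTorch. The feature dimension, layer sizes, language tags ("en"/"zh"), and the alternating-batch training hint are all illustrative assumptions, not the authors' actual configuration.

```python
import torch
import torch.nn as nn

class MTLBLSTM(nn.Module):
    """Multilingual BLSTM: shared BLSTM hidden layers, per-language output heads."""

    def __init__(self, input_dim, hidden_dim, num_layers, num_classes, languages):
        super().__init__()
        # Hidden layers shared across all languages; this is where
        # cross-lingual knowledge is learned.
        self.shared_blstm = nn.LSTM(
            input_dim, hidden_dim, num_layers=num_layers,
            batch_first=True, bidirectional=True,
        )
        # Language-dependent output layers (softmax is applied implicitly
        # by CrossEntropyLoss during training).
        self.heads = nn.ModuleDict({
            lang: nn.Linear(2 * hidden_dim, num_classes) for lang in languages
        })

    def forward(self, x, lang):
        # x: (batch, frames, input_dim) acoustic features
        h, _ = self.shared_blstm(x)   # (batch, frames, 2 * hidden_dim)
        return self.heads[lang](h)    # per-frame emphasis logits


# Hypothetical usage: alternate mini-batches between the two corpora so the
# shared layers see both languages while each head trains on its own data.
model = MTLBLSTM(input_dim=40, hidden_dim=128, num_layers=2,
                 num_classes=2, languages=["en", "zh"])
criterion = nn.CrossEntropyLoss()
features = torch.randn(8, 100, 40)       # dummy batch: 8 utterances, 100 frames
labels = torch.randint(0, 2, (8, 100))   # frame-level emphasis labels (0/1)
logits = model(features, lang="en")
loss = criterion(logits.reshape(-1, 2), labels.reshape(-1))
loss.backward()
```

One design point the abstract implies: because only the heads are language-specific, gradients from both corpora update the same BLSTM parameters, which is how the low-resource language benefits from the other language's data.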