Dialog Context Language Modeling with Recurrent Neural Networks

Citation Author(s):
Bing Liu, Ian Lane
Submitted by:
Bing Liu
Last updated:
9 March 2017 - 4:59pm
Document Type:
Poster
Document Year:
2017
Event:
ICASSP 2017
Presenters:
Bing Liu
Paper Code:
HLT-P1.3
We propose contextual language models that incorporate dialog-level discourse information into language modeling. Previous work on contextual language modeling treats preceding utterances as a flat sequence of inputs, without considering dialog interactions. We design recurrent neural network (RNN) based contextual language models that specifically track the interactions between speakers in a dialog. Experimental results on the Switchboard Dialog Act Corpus show that the proposed model outperforms a conventional single-turn RNN language model by 3.3% in perplexity. The proposed models also outperform other competitive contextual language models.
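To make the general idea concrete, below is a minimal sketch of one simple way to condition an RNN language model on dialog context: the preceding speaker's utterance is encoded by an LSTM, and its final hidden state is fed into the language model over the current utterance. This is an illustrative assumption, not the exact architecture proposed in the poster; all class names, dimensions, and the PyTorch framing are hypothetical.

```python
# Sketch of a dialog-context RNN language model (illustrative only).
import torch
import torch.nn as nn


class ContextualRNNLM(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Encoder for the preceding utterance (the dialog context).
        self.context_rnn = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # Language model over the current utterance, conditioned on context.
        self.lm_rnn = nn.LSTM(embed_dim + hidden_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, prev_utt, curr_utt):
        # prev_utt, curr_utt: LongTensors of shape (batch, seq_len).
        _, (context_h, _) = self.context_rnn(self.embed(prev_utt))
        context = context_h[-1]                           # (batch, hidden_dim)
        curr_emb = self.embed(curr_utt)                   # (batch, T, embed_dim)
        # Concatenate the context vector to every input word embedding.
        context_rep = context.unsqueeze(1).expand(-1, curr_emb.size(1), -1)
        hidden, _ = self.lm_rnn(torch.cat([curr_emb, context_rep], dim=-1))
        return self.out(hidden)                           # (batch, T, vocab_size)


if __name__ == "__main__":
    model = ContextualRNNLM(vocab_size=1000)
    prev_utt = torch.randint(0, 1000, (2, 12))  # preceding speaker's turn
    curr_utt = torch.randint(0, 1000, (2, 9))   # current speaker's turn
    logits = model(prev_utt, curr_utt)
    print(logits.shape)  # torch.Size([2, 9, 1000])
```

Tracking speaker interactions, as the abstract describes, would go beyond this sketch, for example by maintaining separate recurrent states per speaker or updating the context turn by turn rather than using a single encoded utterance.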
