
Recurrent Neural Network-based Language Models with Variation in Net Topology, Language, and Granularity

Citation Author(s): Tzu-Hsuan Yang, Tzu-Hsuan Tseng, Chia-Ping Chen
Submitted by: Tzu-Hsuan Yang
Last updated: 21 November 2016 - 10:21am
Document Type: Presentation Slides
Document Year: 2016
Presenters: Tzu-Hsuan Yang
Paper Code: 40
 

In this paper, we study language models based on recurrent neural networks on three databases in two languages. We implement basic recurrent neural networks (RNN) and refined RNNs with long short-term memory (LSTM) cells. We use the corpora of the Penn Treebank (PTB) and AMI in English, and the Academia Sinica Balanced Corpus (ASBC) in Chinese. On ASBC, we investigate word-based and character-based language models. For character-based language models, we look into the cases where the inter-word space is or is not treated as a token. In summary, we report and comment on the performance of RNN language models across different databases, network topologies, languages, and granularities.
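To make the granularity distinction in the abstract concrete, the following is a minimal sketch, not taken from the paper, contrasting word-based and character-based tokenization of a pre-segmented Chinese sentence, with and without an explicit inter-word space token. The example sentence and the "<space>" token name are illustrative assumptions; ASBC-style corpora are distributed word-segmented, which is what makes the space-as-token choice meaningful.

    def word_tokens(segmented):
        """Word-based granularity: each segmented word is one token."""
        return list(segmented)

    def char_tokens(segmented, space_as_token=False):
        """Character-based granularity: each character is one token.
        If space_as_token is True, insert an explicit inter-word
        space token between consecutive words."""
        tokens = []
        for i, word in enumerate(segmented):
            tokens.extend(word)  # a string iterates over its characters
            if space_as_token and i < len(segmented) - 1:
                tokens.append("<space>")
        return tokens

    # A pre-segmented sentence, as it would appear in a segmented corpus.
    sentence = ["中央", "研究院", "平衡", "語料庫"]

    print(word_tokens(sentence))                        # 4 word tokens
    print(char_tokens(sentence))                        # 10 character tokens
    print(char_tokens(sentence, space_as_token=True))   # 13 tokens, incl. 3 <space>

The same text thus yields three different token sequences, so the vocabulary size, sequence length, and perplexity normalization all differ across the three settings compared in the paper.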
