Recurrent Neural Network-based Language Models with Variation in Net Topology, Language, and Granularity
- Submitted by:
- Tzu-Hsuan Yang
- Last updated:
- 21 November 2016 - 10:21am
- Document Type:
- Presentation Slides
- Document Year:
- 2016
- Presenters:
- Tzu-Hsuan Yang
- Paper Code:
- 40
In this paper, we study language models based on recurrent neural networks on three databases in two languages. We implement basic recurrent neural networks (RNN) and refined RNNs with long short-term memory (LSTM) cells. We use the corpora of Penn Tree Bank (PTB) and AMI in English, and the Academia Sinica Balanced Corpus (ASBC) in Chinese. On ASBC, we investigate word-based and character-based language models. For character-based language models, we examine the cases where the inter-word space is or is not treated as a token. In summary, we report and comment on the performance of RNN language models across databases, network topologies, languages, and granularities.
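For reference, the sketch below shows the kind of LSTM language model the abstract describes. This is not the authors' implementation: the PyTorch framing, the class name `LSTMLanguageModel`, and all layer sizes are illustrative assumptions. The word-based versus character-based granularity studied on ASBC (with or without a space token) affects only how text is mapped to integer ids before it reaches the model; the printed value uses the standard identity that perplexity is the exponential of the average cross-entropy.

```python
import torch
import torch.nn as nn

class LSTMLanguageModel(nn.Module):
    """Illustrative next-token predictor; names and sizes are assumptions."""

    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, num_layers, batch_first=True)
        self.proj = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens, state=None):
        # tokens: (batch, seq_len) integer ids; word or character ids both work,
        # including a character vocabulary that reserves an id for the space token.
        emb = self.embed(tokens)            # (batch, seq_len, embed_dim)
        out, state = self.lstm(emb, state)  # (batch, seq_len, hidden_dim)
        return self.proj(out), state        # logits over the vocabulary

# Toy usage with a hypothetical vocabulary of 100 token types.
vocab_size = 100
model = LSTMLanguageModel(vocab_size)
batch = torch.randint(0, vocab_size, (4, 20))  # (batch=4, seq_len=20)
logits, _ = model(batch[:, :-1])               # predict each following token
loss = nn.functional.cross_entropy(
    logits.reshape(-1, vocab_size), batch[:, 1:].reshape(-1)
)
print(f"perplexity ~ {loss.exp().item():.1f}")  # exp(cross-entropy) = perplexity
```

Swapping `nn.LSTM` for `nn.RNN` in this sketch gives the basic RNN variant; the two share the same embedding and output projection, so the comparison in the paper reduces to the choice of recurrent cell.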
40_RNN.pdf