MEMORY VISUALIZATION FOR GATED RECURRENT NEURAL NETWORKS IN SPEECH RECOGNITION
- Submitted by: Zhiyuan Tang
- Last updated: 4 March 2017 - 3:11am
- Document Type: Poster
- Document Year: 2017
- Presenters: Zhiyuan Tang
- Paper Code: 4020
Recurrent neural networks (RNNs) have shown clear superiority in sequence modeling, particularly those with gated units, such as the long short-term memory (LSTM) and the gated recurrent unit (GRU). However, the dynamic properties behind their remarkable performance remain unclear in many applications, e.g., automatic speech recognition (ASR). This paper employs visualization techniques to study the behavior of LSTM and GRU when performing speech recognition tasks. Our experiments reveal some interesting patterns in the gated memory, and some of them have inspired simple yet effective modifications to the network structure. We report two such modifications: (1) lazy cell update in LSTM, and (2) shortcut connections for residual learning. Both modifications lead to more comprehensible and powerful networks.
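The abstract names the two modifications but does not define them, so the following PyTorch sketch is only one illustrative reading, not the paper's method. The class `LazyResidualLSTM`, the `update_period` parameter, and the interpretation of "lazy cell update" as refreshing the memory cell only every k-th frame are all assumptions; the shortcut connection is read as a plain residual addition of each layer's input to its output.

```python
# A minimal sketch, assuming "lazy cell update" means the LSTM cell state
# is written only every `update_period` frames, and "shortcut connections"
# means residual additions between stacked recurrent layers. Neither detail
# is specified in the abstract; this is an illustration, not the paper's model.
import torch
import torch.nn as nn


class LazyResidualLSTM(nn.Module):
    """Stack of LSTMCells with residual shortcuts and a lazy cell update."""

    def __init__(self, input_size: int, hidden_size: int,
                 num_layers: int = 2, update_period: int = 2):
        super().__init__()
        self.cells = nn.ModuleList(
            nn.LSTMCell(input_size if i == 0 else hidden_size, hidden_size)
            for i in range(num_layers)
        )
        self.update_period = update_period  # hypothetical laziness schedule

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (seq_len, batch, input_size)
        seq_len, batch, _ = x.shape
        hidden = self.cells[0].hidden_size
        h = [x.new_zeros(batch, hidden) for _ in self.cells]
        c = [x.new_zeros(batch, hidden) for _ in self.cells]
        outputs = []
        for t in range(seq_len):
            inp = x[t]
            for i, cell in enumerate(self.cells):
                h_new, c_new = cell(inp, (h[i], c[i]))
                # Lazy cell update (assumed form): commit the new cell state
                # only on every `update_period`-th frame, otherwise keep it.
                if t % self.update_period == 0:
                    c[i] = c_new
                h[i] = h_new
                # Residual shortcut: add the layer input back to the layer
                # output whenever the dimensions agree (all layers but the first).
                inp = h[i] + inp if inp.shape == h[i].shape else h[i]
            outputs.append(inp)
        return torch.stack(outputs)  # (seq_len, batch, hidden_size)
```

With `update_period=1` this reduces to an ordinary residual LSTM stack; larger values freeze the memory cells between updates, which is one plausible reading of "lazy". A faithful reproduction would follow the update rule actually given in the paper.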