Language Modeling, for Speech and SLP (SLP-LANG)

NEURAL NETWORK LANGUAGE MODELING WITH LETTER-BASED FEATURES AND IMPORTANCE SAMPLING


In this paper we describe an extension of the Kaldi software toolkit to support neural network language modeling, intended for use in automatic speech recognition (ASR) and related tasks. We combine the use of subword features (letter n-grams) and one-hot encoding of frequent words so that the models can handle large vocabularies containing infrequent words. We propose a new objective function that allows for the training of unnormalized probabilities, and an importance-sampling-based method is supported to speed up training when the vocabulary is large.
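
As a rough illustration of the subword-feature idea (a hedged sketch in plain Python, not Kaldi's actual code; the function names and the boundary-marking scheme are my own assumptions), a word can be mapped to a sparse feature vector built from its letter n-grams, with an extra one-hot feature reserved for frequent words, so that rare or unseen words still receive a usable representation through their spellings:

    from collections import Counter

    def letter_ngrams(word, n_min=1, n_max=3):
        """Letter n-grams of a word, with '<' and '>' marking word boundaries."""
        padded = "<" + word + ">"
        return [padded[i:i + n]
                for n in range(n_min, n_max + 1)
                for i in range(len(padded) - n + 1)]

    def word_features(word, frequent_words):
        """Sparse features: a one-hot word id only for frequent words,
        plus letter n-gram counts for every word (frequent or not)."""
        feats = Counter()
        if word in frequent_words:
            feats[("word", word)] = 1.0
        for g in letter_ngrams(word):
            feats[("ngram", g)] += 1.0
        return feats

    # A word outside the frequent list still gets active n-gram features.
    print(word_features("unfrequented", frequent_words={"the", "of", "and"}))

The toolkit configures the actual feature set and embedding matrix differently; the point here is only that infrequent words share parameters through their letter n-grams.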

Paper Details

Authors: Hainan Xu, Ke Li, Yiming Wang, Jian Wang, Shiyin Kang, Xie Chen, Daniel Povey, Sanjeev Khudanpur
Submitted On: 19 April 2018 - 11:52pm

Document Files

kaldi-rnnlm-poster.pdf

Cite: Hainan Xu, Ke Li, Yiming Wang, Jian Wang, Shiyin Kang, Xie Chen, Daniel Povey, Sanjeev Khudanpur, "Neural Network Language Modeling with Letter-Based Features and Importance Sampling," IEEE SigPort, 2018. [Online]. Available: http://sigport.org/3064.

ENTROPY BASED PRUNING OF BACKOFF MAXENT LANGUAGE MODELS WITH CONTEXTUAL FEATURES


In this paper, we present a pruning technique for maximum entropy (MaxEnt) language models. It is based on computing the exact entropy loss when removing each feature from the model, and it explicitly supports backoff features by replacing each removed feature with its backoff. The algorithm computes the loss on the training data, so it is not restricted to models with n-gram-like features, allowing models with any features, including long-range skips, triggers, and contextual features such as device location.
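
As a hedged restatement of the quantity involved (the notation below is mine, not copied from the paper): if p is the current MaxEnt model and p_{-f} is the model obtained by removing feature f and replacing it with its backoff, the entropy loss measured on the N training events (h_i, w_i) can be written as

    \Delta H(f) \;=\; \frac{1}{N} \sum_{i=1}^{N} \log \frac{p(w_i \mid h_i)}{p_{-f}(w_i \mid h_i)}

Features with the smallest loss \Delta H(f) are the cheapest to prune; because the sum runs over training events rather than over an n-gram context tree, any feature type can be scored this way.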

Paper Details

Authors: Tongzhou Chen, Diamantino Caseiro, Pat Rondon
Submitted On: 19 April 2018 - 2:12pm

Document Files

poster.pdf

Cite: Tongzhou Chen, Diamantino Caseiro, Pat Rondon, "Entropy Based Pruning of Backoff MaxEnt Language Models with Contextual Features," IEEE SigPort, 2018. [Online]. Available: http://sigport.org/2762.

LIMITED-MEMORY BFGS OPTIMIZATION OF RECURRENT NEURAL NETWORK LANGUAGE MODELS FOR SPEECH RECOGNITION

Paper Details

Authors: Xunying Liu, Shansong Liu, Jinze Sha, Jianwei Yu, Zhiyuan Xu, Xie Chen, Helen Meng
Submitted On: 13 April 2018 - 12:02am

Document Files

ICASSP2018_ShansongLIU_5th.pdf

Cite: Xunying Liu, Shansong Liu, Jinze Sha, Jianwei Yu, Zhiyuan Xu, Xie Chen, Helen Meng, "Limited-Memory BFGS Optimization of Recurrent Neural Network Language Models for Speech Recognition," IEEE SigPort, 2018. [Online]. Available: http://sigport.org/2579.

Dialog Context Language Modeling with Recurrent Neural Networks


We propose contextual language models that incorporate dialog-level discourse information into language modeling. Previous work on contextual language models treats preceding utterances as a flat sequence of inputs, without considering dialog interactions. We design recurrent neural network (RNN) based contextual language models that specifically track the interactions between speakers in a dialog. Experimental results on the Switchboard Dialog Act Corpus show that the proposed model outperforms a conventional single-turn RNN language model by 3.3% in perplexity.
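
A minimal sketch of the cross-turn idea (a vanilla RNN in NumPy; the way the speaker identity and previous-turn context enter the model is my simplification, not the paper's exact architecture): the hidden state that ends one utterance, together with a speaker embedding, initializes the state for the next utterance, so discourse context flows across turns rather than being reset.

    import numpy as np

    rng = np.random.default_rng(0)
    V, H = 1000, 64                          # vocabulary and hidden sizes (illustrative)
    E = rng.normal(0, 0.1, (V, H))           # word embeddings
    W = rng.normal(0, 0.1, (H, H))           # input-to-hidden weights
    U = rng.normal(0, 0.1, (H, H))           # hidden-to-hidden weights
    S = rng.normal(0, 0.1, (2, H))           # speaker-role embeddings (A / B)

    def run_turn(word_ids, h_prev_turn, speaker):
        """One utterance; dialog context enters through the initial state."""
        h = np.tanh(h_prev_turn + S[speaker])
        for w in word_ids:
            h = np.tanh(E[w] @ W + h @ U)    # standard RNN step
        return h

    h = np.zeros(H)
    for speaker, turn in [(0, [1, 5, 7]), (1, [2, 9]), (0, [4, 4, 8])]:
        h = run_turn(turn, h, speaker)       # state carries across turns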

Paper Details

Authors: Bing Liu, Ian Lane
Submitted On: 9 March 2017 - 4:59pm

Document Files

Dialog Context Language Modeling with Recurrent Neural Networks

Cite: Bing Liu, Ian Lane, "Dialog Context Language Modeling with Recurrent Neural Networks," IEEE SigPort, 2017. [Online]. Available: http://sigport.org/1730.

CHARACTER-LEVEL LANGUAGE MODELING WITH HIERARCHICAL RECURRENT NEURAL NETWORKS


Recurrent neural network (RNN) based character-level language models (CLMs) are by nature extremely useful for modeling out-of-vocabulary words. However, their performance is generally much worse than that of word-level language models (WLMs), since CLMs need to consider a longer history of tokens to properly predict the next one. We address this problem by proposing hierarchical RNN architectures, which consist of multiple modules operating at different timescales.
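
A hedged sketch of the multi-timescale idea (the module wiring and sizes are my assumptions, not the paper's exact architecture): a fast character-level module updates at every character, while a slow word-level module updates only at word boundaries and feeds its state back to the character module as long-range context.

    import numpy as np

    rng = np.random.default_rng(0)
    H = 32
    Wc = rng.normal(0, 0.1, (2 * H + 1, H))  # character-module weights
    Ww = rng.normal(0, 0.1, (2 * H, H))      # word-module weights

    def step(h_char, h_word, ch):
        x = np.concatenate([h_char, [ch / 255.0], h_word])  # char input + slow context
        h_char = np.tanh(x @ Wc)                            # fast module: every character
        if ch == ord(" "):                                  # slow module: word boundaries
            h_word = np.tanh(np.concatenate([h_word, h_char]) @ Ww)
        return h_char, h_word

    h_char, h_word = np.zeros(H), np.zeros(H)
    for ch in b"hello world ":
        h_char, h_word = step(h_char, h_word, ch)

Because the word-level state changes only once per word, it can carry WLM-like long-range history while the character-level state handles spelling.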

Paper Details

Authors: Kyuyeon Hwang, Wonyong Sung
Submitted On: 6 March 2017 - 3:05am

Document Files

poster.pdf

Cite: Kyuyeon Hwang, Wonyong Sung, "Character-Level Language Modeling with Hierarchical Recurrent Neural Networks," IEEE SigPort, 2017. [Online]. Available: http://sigport.org/1645.

Directed Automatic Speech Transcription Error Correction Using Bidirectional LSTM


In automatic speech recognition (ASR), error correction after the initial search stage is a commonly used technique to improve performance. While fully automatic error correction, such as full second-pass rescoring using complex language models, is widely used, directed error correction, where the error locations are manually given, is of great interest in many scenarios. Previous work on directed error correction usually uses the error-location information to modify the search space of the original ASR models.
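
A minimal sketch of the directed setting (a vanilla bidirectional RNN stands in here for the paper's BiLSTM, and the prediction head is illustrative): the hypothesis is encoded left-to-right and right-to-left, and a replacement word is predicted only at the manually marked error position, using context from both sides while excluding the unreliable word itself.

    import numpy as np

    rng = np.random.default_rng(0)
    V, H = 50, 16
    E = rng.normal(0, 0.1, (V, H))                # word embeddings
    Wf = rng.normal(0, 0.1, (2 * H, H))           # forward RNN weights
    Wb = rng.normal(0, 0.1, (2 * H, H))           # backward RNN weights
    Wout = rng.normal(0, 0.1, (2 * H, V))         # illustrative prediction head

    def bi_states(words):
        fwd, bwd = [np.zeros(H)], [np.zeros(H)]
        for w in words:                           # left-to-right pass
            fwd.append(np.tanh(np.concatenate([fwd[-1], E[w]]) @ Wf))
        for w in reversed(words):                 # right-to-left pass
            bwd.append(np.tanh(np.concatenate([bwd[-1], E[w]]) @ Wb))
        return fwd, bwd[::-1]

    def correct(words, err_pos):
        fwd, bwd = bi_states(words)
        # context from both sides, excluding the marked (unreliable) word itself
        ctx = np.concatenate([fwd[err_pos], bwd[err_pos + 1]])
        return int(np.argmax(ctx @ Wout))         # id of the proposed replacement

    print(correct([3, 17, 8, 21], err_pos=2))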

Paper Details

Authors: Da Zheng, Zhehuai Chen, Yue Wu, Kai Yu
Submitted On: 18 October 2016 - 1:03pm

Document Files

poster.pdf

Cite: Da Zheng, Zhehuai Chen, Yue Wu, Kai Yu, "Directed Automatic Speech Transcription Error Correction Using Bidirectional LSTM," IEEE SigPort, 2016. [Online]. Available: http://sigport.org/1259.

Exploiting noisy web data by OOV ranking for low-resource keyword search


Spoken keyword search in low-resource conditions suffers from the out-of-vocabulary (OOV) problem and from insufficient text data for language model (LM) training. Web-crawled text data can be used to expand the vocabulary and to augment the language model. However, the mismatch between web text and the target speech data makes effective utilization difficult. New words from web data need to be evaluated to exclude noisy words or to introduce proper probabilities. In this paper, several criteria for ranking new words from web data are investigated and used as features
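
A hedged sketch of the ranking step (the two criteria used here, web-text frequency and a character-bigram well-formedness score, are illustrative stand-ins for the paper's features, and the interpolation weights are invented): each OOV candidate from the web data is scored, and only the top-ranked words would be added to the lexicon and language model.

    import math
    from collections import Counter

    def char_bigram_scorer(vocab_words):
        """Average log-probability of a word's character bigrams, trained
        on the in-vocabulary word list (add-one smoothed)."""
        counts = Counter(b for w in vocab_words for b in zip("<" + w, w + ">"))
        total = sum(counts.values())
        return lambda w: sum(
            math.log((counts[b] + 1) / (total + 27 * 27))
            for b in zip("<" + w, w + ">")
        ) / (len(w) + 1)

    def rank_oov(candidates, web_freq, vocab_words, w_freq=1.0, w_lm=1.0):
        lm = char_bigram_scorer(vocab_words)
        return sorted(candidates, reverse=True,
                      key=lambda w: w_freq * math.log(web_freq[w] + 1) + w_lm * lm(w))

    vocab = ["speech", "search", "keyword", "model", "data"]
    freqs = {"dataset": 40, "searching": 25, "xqzv": 3}
    print(rank_oov(list(freqs), freqs, vocab))    # the noisy string 'xqzv' ranks last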

Paper Details

Authors: Ji Wu
Submitted On: 15 October 2016 - 7:55am

Document Files

ISCSLP2016_Poster_Exploiting.pdf

Cite: Ji Wu, "Exploiting Noisy Web Data by OOV Ranking for Low-Resource Keyword Search," IEEE SigPort, 2016. [Online]. Available: http://sigport.org/1232.

Learning FOFE based FNN-LMs with noise contrastive estimation and part-of-speech features


A simple but powerful language model, the fixed-size ordinally-forgetting encoding (FOFE) based feedforward neural network language model (FNN-LM), has been proposed recently. Experimental results have shown that FOFE-based FNN-LMs can outperform not only standard FNN-LMs but also the popular recurrent neural network language models (RNN-LMs). In this paper, we extend FOFE-based FNN-LMs in several aspects. Firstly, we propose a new method to further improve the performance of FOFE-based FNN-LMs by

Paper Details

Authors: Junfeng Hou, Shiliang Zhang, Lirong Dai
Submitted On: 14 October 2016 - 4:49am

Document Files

iscslp2016_poster_jfhou_.pdf

Cite: Junfeng Hou, Shiliang Zhang, Lirong Dai, "Learning FOFE Based FNN-LMs with Noise Contrastive Estimation and Part-of-Speech Features," IEEE SigPort, 2016. [Online]. Available: http://sigport.org/1185.

Language Model Adaptation for ASR of Spoken Translations Using Phrase-based Translation Models and Named Entity Models

Paper Details

Authors: Joris Pelemans, Tom Vanallemeersch, Kris Demuynck, Lyan Verwimp, Hugo Van hamme, Patrick Wambacq
Submitted On: 4 April 2016 - 10:11am

Document Files

mtlm2.pdf

Cite: Joris Pelemans, Tom Vanallemeersch, Kris Demuynck, Lyan Verwimp, Hugo Van hamme, Patrick Wambacq, "Language Model Adaptation for ASR of Spoken Translations Using Phrase-based Translation Models and Named Entity Models," IEEE SigPort, 2016. [Online]. Available: http://sigport.org/1083.

CUED-RNNLM – An Open-Source Toolkit for Efficient Training and Evaluation of Recurrent Neural Network Language Models


In recent years, recurrent neural network language models (RNNLMs) have become increasingly popular for a range of applications including speech recognition. However, the training of RNNLMs is computationally expensive, which limits the quantity of data and the size of network that can be used. In order to fully exploit the power of RNNLMs, efficient training implementations are required. This paper introduces an open-source toolkit, the CUED-RNNLM toolkit, which supports efficient GPU-based training of RNNLMs.

Paper Details

Authors: Xie Chen, Yanmin Qian, Xunying Liu, Mark Gales, Phil Woodland
Submitted On: 1 April 2016 - 6:35am

Document Files

slides.pdf

Cite: Xie Chen, Yanmin Qian, Xunying Liu, Mark Gales, Phil Woodland, "CUED-RNNLM – An Open-Source Toolkit for Efficient Training and Evaluation of Recurrent Neural Network Language Models," IEEE SigPort, 2016. [Online]. Available: http://sigport.org/1065.