
Vanishing long-term gradients are a major issue in training standard recurrent neural networks (RNNs), which can be alleviated by long short-term memory (LSTM) models with memory cells. However, the extra parameters associated with the memory cells mean an LSTM layer has four times as many parameters as an RNN with the same hidden vector size. This paper addresses the vanishing gradient problem using a high order RNN (HORNN), which has additional connections from multiple previous time steps.
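The abstract does not reproduce the paper's exact recurrence, but the general HORNN idea can be sketched in a few lines: alongside the usual connection from h_{t-1}, the hidden state also receives a direct connection from an earlier step h_{t-lag}, shortening the gradient path through time. The function name, shapes, activation, and the single extra lag below are illustrative assumptions, not the paper's formulation.

import numpy as np

def hornn_forward(x_seq, W, U1, U2, b, lag=2):
    # Hidden state h_t receives the usual connection from h_{t-1} plus an
    # extra direct connection from h_{t-lag}; the lag value and the tanh
    # activation are illustrative choices only.
    T = x_seq.shape[0]
    h = np.zeros((T + 1, b.shape[0]))      # h[0] is the zero initial state
    for t in range(1, T + 1):
        h[t] = np.tanh(W @ x_seq[t - 1]
                       + U1 @ h[t - 1]            # standard recurrence
                       + U2 @ h[max(t - lag, 0)]  # high-order connection
                       + b)
    return h[1:]

# Toy usage: a 10-step sequence with 4-dim inputs and an 8-dim hidden state.
rng = np.random.default_rng(0)
W = 0.1 * rng.normal(size=(8, 4))
U1 = 0.1 * rng.normal(size=(8, 8))
U2 = 0.1 * rng.normal(size=(8, 8))
states = hornn_forward(rng.normal(size=(10, 4)), W, U1, U2, np.zeros(8))

Note that the extra connection adds only one more hidden-to-hidden matrix per lag, far fewer parameters than the four-fold increase an LSTM layer incurs.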

Accurate prosodic phrase prediction can improve the naturalness of speech synthesis. Prosodic phrase prediction can be regarded as a sequence labeling problem, and the Conditional Random Field (CRF) is typically used to solve it. Mongolian is an agglutinative language in which a vast number of words can be formed by concatenating stems and suffixes. This characteristic makes it difficult to build a high-performance CRF-based Mongolian prosodic phrase prediction system. We introduce a new
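Although the abstract is cut off above, the CRF sequence-labeling setup it describes can be sketched. The snippet below uses the third-party sklearn-crfsuite toolkit as a stand-in (the paper does not name its implementation), with toy word sequences, B/NB boundary tags, and suffix-based features chosen only to illustrate why suffix information matters in an agglutinative language.

import sklearn_crfsuite  # third-party: pip install sklearn-crfsuite

# Toy stand-in data: each sentence is a word list; each tag marks whether
# a prosodic phrase boundary follows the word (B) or not (NB).
sentences = [["ta", "sain", "baina", "uu"], ["bi", "nom", "unshina"]]
tags = [["NB", "NB", "B", "B"], ["NB", "NB", "B"]]

def word_features(sent, i):
    # Character suffixes are a crude proxy for Mongolian suffix morphemes;
    # a real system would use proper stem/suffix segmentation.
    w = sent[i]
    feats = {"word": w, "suffix2": w[-2:], "suffix3": w[-3:],
             "is_first": i == 0, "is_last": i == len(sent) - 1}
    if i > 0:
        feats["prev_word"] = sent[i - 1]
    return feats

X = [[word_features(s, i) for i in range(len(s))] for s in sentences]
crf = sklearn_crfsuite.CRF(algorithm="lbfgs", max_iterations=50)
crf.fit(X, tags)
print(crf.predict(X))  # per-word B/NB boundary predictions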
