Extensions of recurrent neural network language model
ICASSP 2011, pp. 5528–5531
Abstract
We present several modifications of the original recurrent neural network language model (RNN LM). While this model has been shown to significantly outperform many competitive language modeling techniques in terms of accuracy, the remaining problem is its computational complexity. In this work, we show approaches that lead to more than a 15-fold speedup in both the training and testing phases. Next, we show the importance of using the backpropagation through time algorithm. An empirical comparison with feedforward networks is also provided. Finally, we discuss possibilities for reducing the number of parameters in the model. The resulting RNN model can thus be smaller, faster in both training and testing, and more accurate than the basic one.
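Since the abstract highlights backpropagation through time (BPTT) as a key ingredient, the sketch below illustrates the idea at toy scale: an Elman-style RNN language model whose gradients are propagated back through a fixed number of time steps. This is a minimal illustration, not the authors' implementation; the vocabulary size, hidden size, truncation depth `STEPS`, learning rate, and toy corpus are all assumptions invented for the example.

```python
# Minimal sketch of an Elman-style RNN LM trained with truncated BPTT.
# All hyperparameters and the toy corpus are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
V, H, STEPS = 10, 16, 4          # vocab size, hidden units, BPTT truncation depth

U = rng.normal(0, 0.1, (H, V))       # input -> hidden
W = rng.normal(0, 0.1, (H, H))       # hidden -> hidden (recurrence)
V_out = rng.normal(0, 0.1, (V, H))   # hidden -> output

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def train_step(words, lr=0.1):
    """One pass over a word-index sequence; errors flow back <= STEPS steps."""
    T = len(words) - 1
    hs = [np.zeros(H)]               # hs[0] is the initial hidden state
    ps = []
    loss = 0.0
    # Forward: h(t) = sigmoid(U x(t) + W h(t-1)),  y(t) = softmax(V_out h(t))
    for t in range(T):
        x = np.zeros(V); x[words[t]] = 1.0
        h = 1.0 / (1.0 + np.exp(-(U @ x + W @ hs[-1])))
        p = softmax(V_out @ h)
        hs.append(h); ps.append(p)
        loss -= np.log(p[words[t + 1]])
    # Backward with truncation: each output error is unrolled through at
    # most STEPS recurrent steps (full BPTT would unroll all the way back).
    dU = np.zeros_like(U); dW = np.zeros_like(W); dV = np.zeros_like(V_out)
    for t in range(T):
        dy = ps[t].copy(); dy[words[t + 1]] -= 1.0   # d(loss)/d(logits)
        dV += np.outer(dy, hs[t + 1])
        dh = V_out.T @ dy
        for k in range(t, max(t - STEPS, -1), -1):   # unroll back in time
            dz = dh * hs[k + 1] * (1.0 - hs[k + 1])  # sigmoid derivative
            x = np.zeros(V); x[words[k]] = 1.0
            dU += np.outer(dz, x)
            dW += np.outer(dz, hs[k])
            dh = W.T @ dz
    for P, dP in ((U, dU), (W, dW), (V_out, dV)):
        P -= lr * dP / T             # in-place SGD update, averaged over time
    return loss / T

# Toy usage: fit a short repeating token sequence.
seq = [1, 2, 3, 4, 1, 2, 3, 4, 1, 2, 3, 4]
for epoch in range(200):
    nll = train_step(seq)
print(f"final per-word NLL: {nll:.3f}")
```

The forward pass also makes the complexity problem visible: the `V_out @ h` product and the softmax over the full vocabulary dominate the per-word cost. Speedups of the magnitude the abstract reports are commonly obtained by factorizing this output layer, e.g. predicting a word class first and then a word within that class, so that only on the order of |C| + |V|/|C| outputs are evaluated instead of |V|.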