Bidirectional recurrent neural networks
Abstract
In the first part of this paper, a regular recurrent neural network (RNN) is extended to a bidirectional recurrent neural network (BRNN). The BRNN can be trained without the limitation of using input information only up to a preset future frame. This is accomplished by training it simultaneously in the positive and negative time directions. The structure and training procedure of the proposed network are explained. In regression and classification experiments on artificial data, the proposed structure gives better results than other approaches. For real data, classification experiments on phonemes from the TIMIT database show the same tendency. In the second part of this paper, it is shown how the proposed bidirectional structure can easily be modified to allow efficient estimation of the conditional posterior probability of complete symbol sequences without making any explicit assumption about the shape of the distribution. For this part, experiments on real data are reported.
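The core idea of the BRNN, as described in the abstract, is to run two separate recurrent hidden layers over the same input sequence, one in the positive time direction and one in the negative time direction, and to combine both hidden states at each frame so that the output can use the full input context. The following is a minimal numpy sketch of that forward pass; all function names, weight shapes, and the tanh nonlinearity are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def rnn_forward(x, W_in, W_rec, b):
    """Simple tanh RNN run over time; x has shape (T, d_in)."""
    T = x.shape[0]
    h = np.zeros((T, W_rec.shape[0]))
    h_prev = np.zeros(W_rec.shape[0])
    for t in range(T):
        h_prev = np.tanh(x[t] @ W_in + h_prev @ W_rec + b)
        h[t] = h_prev
    return h

def brnn_forward(x, fwd_params, bwd_params, W_out, b_out):
    """Bidirectional pass: one RNN over t = 1..T, one over t = T..1.

    The backward hidden states are re-reversed so that h_f[t] and
    h_b[t] both refer to frame t; the output layer sees both, i.e.
    past and future context for every frame.
    """
    h_f = rnn_forward(x, *fwd_params)
    h_b = rnn_forward(x[::-1], *bwd_params)[::-1]
    return np.concatenate([h_f, h_b], axis=1) @ W_out + b_out

# Hypothetical dimensions for a small demo sequence.
rng = np.random.default_rng(0)
T, d_in, d_h, d_out = 5, 3, 4, 2
fwd = (rng.standard_normal((d_in, d_h)),
       rng.standard_normal((d_h, d_h)), np.zeros(d_h))
bwd = (rng.standard_normal((d_in, d_h)),
       rng.standard_normal((d_h, d_h)), np.zeros(d_h))
W_out = rng.standard_normal((2 * d_h, d_out))
b_out = np.zeros(d_out)

y = brnn_forward(rng.standard_normal((T, d_in)), fwd, bwd, W_out, b_out)
print(y.shape)  # one d_out-dimensional output per input frame
```

Note the contrast with a unidirectional RNN delayed by a fixed number of frames: here no preset future window is chosen, since the backward recurrence carries information from the entire remaining sequence into every frame.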