Memory-enhanced Decoder for Neural Machine Translation
Top 1% of 2016 papers by citations
Abstract
We propose to enhance the RNN decoder in a neural machine translation (NMT) system with external memory, as a natural but powerful extension of the state in the decoding RNN. This memory-enhanced RNN decoder is called MEMDEC. At each time step during decoding, MEMDEC reads from and writes to this memory once, both with content-based addressing. Unlike the unbounded memory in previous work (Bahdanau et al., 2014), which stores the representation of the source sentence, the memory in MEMDEC is a matrix of predetermined size, designed to better capture the information important for the decoding process at each time step. Our empirical study on Chinese-English translation shows that it improves BLEU by 4.8 over Groundhog and 5.3 over Moses, yielding the best performance achieved with the same training set.
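To make the abstract's description concrete, the following is a minimal NumPy sketch of one MEMDEC-style decoding step: a fixed-size memory matrix is read from and written to once per step via content-based addressing. The function names (`address`, `read_memory`, `write_memory`), the erase-gate form, and all dimensions are illustrative assumptions, not the paper's exact equations.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a vector of scores.
    e = np.exp(x - x.max())
    return e / e.sum()

def address(memory, key):
    # Content-based addressing: score each memory slot against the key,
    # then normalize into attention weights (sums to 1).
    scores = memory @ key            # (n_slots,)
    return softmax(scores)

def read_memory(memory, key):
    # Read once: a weighted sum of the memory slots.
    w = address(memory, key)
    return w @ memory                # (d,)

def write_memory(memory, key, new_content, erase=0.5):
    # Write once: each slot is partially erased and updated in
    # proportion to its addressing weight (an assumed gating form).
    w = address(memory, key)[:, None]   # (n_slots, 1)
    return memory * (1.0 - erase * w) + w * new_content

rng = np.random.default_rng(0)
n_slots, d = 8, 4                       # predetermined memory size
M = rng.standard_normal((n_slots, d))   # fixed-size memory matrix
key = rng.standard_normal(d)            # query from the decoder state

r = read_memory(M, key)                 # one read per decoding step
M = write_memory(M, key, new_content=r) # one write per decoding step
print(r.shape, M.shape)
```

Note that, unlike attention over an unbounded source representation, the memory here has a fixed number of slots regardless of sentence length, matching the abstract's contrast with Bahdanau et al. (2014).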