Document-Level Neural Machine Translation with Hierarchical Attention Networks
2018, pp. 2947–2954
Abstract
Neural Machine Translation (NMT) can be improved by including document-level contextual information. For this purpose, we propose a hierarchical attention model that captures context in a structured and dynamic manner. The model is integrated into the original NMT architecture as an additional level of abstraction, conditioning on the NMT model's own previous hidden states. Experiments show that hierarchical attention significantly improves the BLEU score over a strong NMT baseline as well as over state-of-the-art context-aware methods, and that the encoder and decoder benefit from context in complementary ways.
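The core idea, attending first over the words of each previous sentence and then over the resulting sentence summaries, can be sketched as follows. This is a minimal NumPy illustration of two-level dot-product attention, not the paper's exact parameterized model (the actual HAN uses learned projections and is trained jointly with the NMT system); all function names here are illustrative.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(query, keys):
    """Dot-product attention: weight each key row by its similarity
    to the query and return the weighted sum."""
    scores = keys @ query          # (n,)
    weights = softmax(scores)      # (n,)
    return weights @ keys          # (d,)

def hierarchical_context(query, prev_sentences):
    """Two-level (hierarchical) attention:
    1) word level: summarize each previous sentence into one vector,
    2) sentence level: attend over those summaries.
    `query` stands in for the NMT model's current hidden state."""
    summaries = np.stack([attend(query, s) for s in prev_sentences])
    return attend(query, summaries)

rng = np.random.default_rng(0)
d = 8
query = rng.standard_normal(d)                            # current state
prev = [rng.standard_normal((n, d)) for n in (5, 7, 4)]   # 3 prior sentences
ctx = hierarchical_context(query, prev)
print(ctx.shape)  # (8,)
```

The returned context vector has the same dimensionality as the hidden state, so it can be combined with the current state (e.g., via gating) on either the encoder or decoder side.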