Neural Machine Translation with the Transformer and Multi-Source Romance Languages for the Biomedical WMT 2018 task
2018, pp. 667–670
Top 10% of 2018 papers
Abstract
The Transformer architecture has become the state of the art in Machine Translation. This model, which relies on attention-based mechanisms, has outperformed previous neural machine translation architectures on several tasks. In this system description paper, we report details of training neural machine translation with multi-source Romance languages using the Transformer model, within the evaluation framework of the WMT 2018 biomedical task. Using multiple source languages from the same family yields improvements of over 6 BLEU points.
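The abstract does not spell out how the multi-source corpora are combined, but a common recipe in multilingual NMT is to concatenate parallel data from the related source languages into one training corpus, prefixing each source sentence with a language tag. The sketch below illustrates that recipe only; the function names, tags, and toy sentences are illustrative assumptions, not the paper's actual pipeline.

```python
# Hedged sketch of multi-source corpus preparation via concatenation with
# language tags (a standard multilingual-NMT recipe, assumed here; the
# paper's exact setup is not described in the abstract).

def tag_corpus(pairs, lang_tag):
    """Prefix each source sentence with its language tag."""
    return [(f"<{lang_tag}> {src}", tgt) for src, tgt in pairs]

def build_multisource_corpus(corpora):
    """corpora: dict mapping language tag -> list of (src, tgt) pairs.

    Returns a single combined list of tagged training pairs.
    """
    combined = []
    for tag, pairs in corpora.items():
        combined.extend(tag_corpus(pairs, tag))
    return combined

# Toy biomedical-flavored example with hypothetical Romance-language data
# (Spanish and French sources, English target).
corpora = {
    "es": [("la proteína se une al receptor", "the protein binds the receptor")],
    "fr": [("la protéine se lie au récepteur", "the protein binds the receptor")],
}
training_data = build_multisource_corpus(corpora)
print(training_data[0][0])  # "<es> la proteína se une al receptor"
```

The tagged corpus can then be fed to any standard Transformer training pipeline; the tag lets a single model learn from, and disambiguate between, the related source languages.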
Related Papers
- → Tied Transformers: Neural Machine Translation with Shared Encoder and Decoder (2019), 65 citations
- → Searching Better Architectures for Neural Machine Translation (2020), 27 citations
- → Incorporating Pre-trained Model into Neural Machine Translation (2021), 2 citations
- → On The Alignment Problem In Multi-Head Attention-Based Neural Machine Translation (2018)