Decoding with Large-Scale Neural Language Models Improves Translation
2013, pp. 1387–1392
Top 1% of 2013 papers by citation count
Abstract
We explore the application of neural language models to machine translation. We develop a new model that combines the neural probabilistic language model of Bengio et al., rectified linear units, and noise-contrastive estimation, and we incorporate it into a machine translation system both by reranking k-best lists and by direct integration into the decoder. Our large-scale, large-vocabulary experiments across four language pairs show that our neural language model improves translation quality by up to 1.1 BLEU.
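The k-best reranking integration described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the single interpolation weight, and the toy language-model scorer are all assumptions made for the example.

```python
# Hypothetical sketch of k-best reranking with a neural LM feature.
# Each hypothesis carries a base decoder score; we add a weighted
# neural-LM log-probability and re-sort by the combined score.

def rerank_kbest(kbest, nlm_log_prob, weight=0.5):
    """Rerank (hypothesis, base_score) pairs by combined score.

    kbest        -- list of (hypothesis_string, base_decoder_score)
    nlm_log_prob -- callable returning a log-probability for a hypothesis
    weight       -- illustrative interpolation weight (in practice tuned,
                    e.g. with MERT, alongside the other decoder features)
    """
    rescored = [(hyp, base + weight * nlm_log_prob(hyp))
                for hyp, base in kbest]
    rescored.sort(key=lambda pair: pair[1], reverse=True)  # best first
    return rescored

# Toy stand-in for a neural LM: assigns a fixed cost per word,
# so it simply prefers shorter hypotheses in this example.
def toy_nlm(hyp):
    return -0.1 * len(hyp.split())

kbest = [("the house is small", -2.0), ("the house small", -1.9)]
best_hyp, best_score = rerank_kbest(kbest, toy_nlm)[0]
print(best_hyp)
```

In a real system the neural LM score would be one feature among many in the log-linear model, rather than a single hand-set weight as above.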
Related Papers
- Better Evaluation Metrics Lead to Better Machine Translation (2011)
- ParFDA for Instance Selection for Statistical Machine Translation (2016)
- Statistical Machine Translation with Rule-based Machine Translation (2011)
- On Using Monolingual Corpora in Neural Machine Translation (2015)
- Factored Statistical Machine Translation for German-English (2018)