Survey of Low-Resource Machine Translation
Computational Linguistics, 2022, Vol. 48(3), pp. 673–732
Top 1% of 2022 papers by citations
Abstract
We present a survey covering the state of the art in low-resource machine translation (MT) research. There are currently around 7,000 languages spoken in the world, and almost all language pairs lack significant resources for training machine translation models. There has been increasing interest in research addressing the challenge of producing useful translation models when very little translated training data is available. We present a summary of this topical research field and provide a description of the techniques evaluated by researchers in several recent shared tasks in low-resource MT.
Related Papers
- Improving Statistical Machine Translation with Word Class Models (2013), 41 citations
- Towards State-of-the-art English-Vietnamese Neural Machine Translation (2017), 8 citations
- Machine Translation Using Deep Learning: A Comparison (2020), 4 citations
- English-Japanese Neural Machine Translation with Encoder-Decoder-Reconstructor (2017), 2 citations
- Recurrent Stacking of Layers for Compact Neural Machine Translation Models (2018), 2 citations