Unsupervised Cross-lingual Representation Learning at Scale
2020, pp. 8440–8451
Alexis Conneau, Kartikay Khandelwal, Naman Goyal, Vishrav Chaudhary, Guillaume Wenzek, Francisco Guzmán, Édouard Grave, Myle Ott, Luke Zettlemoyer, Veselin Stoyanov
Published in Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, 2020.
Related Papers
- HISTORIAE, History of Socio-Cultural Transformation as Linguistic Data Science. A Humanities Use Case (2019), 17,204 citations
- DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter (2019), 4,569 citations
- Beto, Bentz, Becas: The Surprising Cross-Lingual Effectiveness of BERT (2019), 585 citations
- ALBERT: A Lite BERT for Self-supervised Learning of Language Representations (2019), 4,061 citations