mT5: A Massively Multilingual Pre-trained Text-to-Text Transformer
2021, pp. 483–498
Top 1% of 2021 papers by citations
Linting Xue, Noah Constant, Adam Roberts, Mihir Kale, Rami Al-Rfou, Aditya Siddhant, Aditya Barua, Colin Raffel
Citation
Linting Xue, Noah Constant, Adam Roberts, Mihir Kale, Rami Al-Rfou, Aditya Siddhant, Aditya Barua, Colin Raffel. Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. 2021.