BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension
ACL 2020, pp. 7871–7880
Top 1% of 2020 papers
Mike Lewis, Yinhan Liu, Naman Goyal, Marjan Ghazvininejad, Abdelrahman Mohamed, Omer Levy, Veselin Stoyanov, Luke Zettlemoyer
Mike Lewis, Yinhan Liu, Naman Goyal, Marjan Ghazvininejad, Abdelrahman Mohamed, Omer Levy, Veselin Stoyanov, Luke Zettlemoyer. Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, 2020, pp. 7871–7880.
Related Papers
- PEGASUS: Pre-training with Extracted Gap-sentences for Abstractive Summarization (2020)
- Language Models are Few-Shot Learners (2020), 3,027 citations
- Unified Language Model Pre-training for Natural Language Understanding and Generation (2019), 539 citations