Transformer-Based Natural Language Generation for Question-Answering
2020
Top 19% of 2020 papers by citations
Abstract
This paper explores Natural Language Generation in the context of the Question-Answering task. Prior work on this task has focused on generating either a short answer or a long text span containing the answer, while reasoning over a Web page or processing structured data. Such answer lengths are usually inappropriate, as the answer tends to be perceived as either too brief or too long to be read aloud by an intelligent assistant. In this work, we aim to generate a concise answer for a given question using an unsupervised approach that requires no annotated data. Evaluated on English and French datasets, the proposed approach shows very promising results.
Related Papers
- A Comparative Analysis of Transformer-Based Models for Document Visual Question Answering (2023), 2 citations
- Similarity Detection of Natural-Language Questions and Answers using the VANiLLa dataset (2021), 2 citations
- UMass Complex Interactive Question Answering (ciQA) 2007: Human Performance as Question Answerers (2007)
- Exploring Neural Net Augmentation to BERT for Question Answering on SQUAD 2.0 (2019), 1 citation
- UMass Complex Interactive Question Answering (ciQA) 2007: Human Performance as Question Answerers (Notebook Version) (2007)