Search-based Neural Structured Learning for Sequential Question Answering
Top 1% of 2017 papers by citations
Abstract
Recent work in semantic parsing for question answering has focused on long and complicated questions, many of which would seem unnatural if asked in a normal conversation between two humans. In an effort to explore a conversational QA setting, we present a more realistic task: answering sequences of simple but inter-related questions. We collect a dataset of 6,066 question sequences that inquire about semi-structured tables from Wikipedia, with 17,553 question-answer pairs in total. To solve this sequential question answering task, we propose a novel dynamic neural semantic parsing framework trained using a weakly supervised reward-guided search. Our model effectively leverages the sequential context to outperform state-of-the-art QA systems that are designed to answer highly complex questions.
Related Papers
- Systematic Processing of Long Sentences in Rule Based Portuguese-Chinese Machine Translation (2010), 9 citations
- A Hybrid Approach to Parsing Natural Languages (2016), 1 citation
- Morphological and Syntactic Processing for Text Retrieval (2004), 8 citations
- Syntactic Parsing based on Phrase Structure in Natural Language Processing (2009)