Weakly Supervised Learning of Semantic Parsers for Mapping Instructions to Actions
Top 1% of 2013 papers (by citations over time)
Abstract
The context in which language is used provides a strong signal for learning to recover its meaning. In this paper, we show it can be used within a grounded CCG semantic parsing approach that learns a joint model of meaning and context for interpreting and executing natural language instructions, using various types of weak supervision. The joint nature provides crucial benefits by allowing situated cues, such as the set of visible objects, to directly influence learning. It also enables algorithms that learn while executing instructions, for example by trying to replicate human actions. Experiments on a benchmark navigational dataset demonstrate strong performance under differing forms of supervision, including correctly executing 60% more instruction sets relative to the previous state of the art.
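To make the weak-supervision idea above concrete, here is a minimal, purely illustrative sketch: a toy parser maps a navigation instruction to primitive actions, and the supervision signal is a binary check of whether executing the parse replicates a human demonstration, rather than a labeled logical form. All names (`parse_instruction`, `ACTIONS`, the action vocabulary) are invented for this example and are not from the paper.

```python
# Illustrative sketch only: a toy instruction-to-action mapper with a
# demonstration-based weak-supervision signal, loosely in the spirit of
# learning from validated executions rather than annotated meanings.

ACTIONS = {"move": "FORWARD", "turn left": "LEFT", "turn right": "RIGHT"}

def parse_instruction(text):
    """Map a simple comma-separated instruction to primitive actions."""
    actions = []
    for clause in text.lower().split(","):
        clause = clause.strip()
        for phrase, action in ACTIONS.items():
            if clause.startswith(phrase):
                # treat "twice" as a crude repetition cue
                count = 2 if "twice" in clause else 1
                actions.extend([action] * count)
                break
    return actions

def weak_supervision_signal(predicted, demonstrated):
    """Binary validation: does execution replicate the human trace?"""
    return predicted == demonstrated

demo = ["FORWARD", "FORWARD", "LEFT"]
pred = parse_instruction("move twice, turn left")
print(pred, weak_supervision_signal(pred, demo))
```

A real system would score many candidate parses and use the validation signal to update model parameters; this sketch only shows the shape of the signal.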