Improved Transition-Based Parsing and Tagging with Neural Networks
Top 10% of 2015 papers
Abstract
We extend and improve upon recent work in structured training for neural network transition-based dependency parsing. We do this by experimenting with novel features, additional transition systems, and by testing on a wider array of languages. In particular, we introduce set-valued features to encode the predicted morphological properties and part-of-speech confusion sets of the words being parsed. We also investigate the use of joint parsing and part-of-speech tagging in the neural paradigm. Finally, we conduct a multi-lingual evaluation that demonstrates the robustness of the overall structured neural approach, as well as the benefits of the extensions proposed in this work. Our research further demonstrates the breadth of the applicability of neural network methods to dependency parsing, as well as the ease with which new features can be added to neural parsing models.
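One natural way to feed a set-valued feature (such as a part-of-speech confusion set) into a neural model is to pool the embeddings of the set's members into a single fixed-size vector. The sketch below illustrates this with mean pooling; the tag inventory, embedding dimension, and function names are assumptions for illustration, not the paper's actual implementation.

```python
import numpy as np

# Hypothetical POS tag inventory and embedding table (names/dims assumed).
POS_TAGS = ["NOUN", "VERB", "ADJ", "ADV", "PRON"]
EMB_DIM = 8
rng = np.random.default_rng(0)
pos_embeddings = {tag: rng.normal(size=EMB_DIM) for tag in POS_TAGS}

def encode_tag_set(confusion_set):
    """Encode a set-valued feature (e.g. a POS confusion set) as the
    mean of its members' embeddings; an empty set maps to zeros."""
    if not confusion_set:
        return np.zeros(EMB_DIM)
    vecs = [pos_embeddings[tag] for tag in confusion_set]
    return np.mean(vecs, axis=0)

# A word the tagger finds ambiguous between NOUN and VERB contributes
# one averaged vector, regardless of how many tags are in the set.
feature = encode_tag_set({"NOUN", "VERB"})
print(feature.shape)  # (8,)
```

Mean pooling keeps the input dimensionality fixed no matter how large the confusion set is, which is what lets such features slot into a standard feed-forward parsing model without architectural changes.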