Graph-based Dependency Parsing with Bidirectional LSTM
2016, pp. 2306–2315
Abstract
In this paper, we propose a neural network model for graph-based dependency parsing that uses a Bidirectional LSTM (BLSTM) to capture richer contextual information instead of relying on high-order factorization, which enables our model to use far fewer features than previous work. In addition, we propose an effective way to learn sentence segment embeddings at the sentence level using an extra forward LSTM network. Although our model uses only first-order factorization, experiments on the English Penn Treebank and the Chinese Penn Treebank show that it is competitive with previous higher-order graph-based dependency parsers and with state-of-the-art models.
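The core idea in the abstract, first-order factorization, means the score of a parse tree decomposes into a sum of independent head-modifier arc scores computed from contextual word vectors. The toy sketch below illustrates this with plain Python: the random vectors stand in for BLSTM outputs, the `arc_score` form s(h, m) = v · tanh(W[h; m]) and the greedy per-word head selection are simplifying assumptions for illustration, not the paper's actual architecture (real graph-based parsers decode with Eisner's algorithm or Chu-Liu/Edmonds to guarantee a well-formed tree).

```python
import math
import random

random.seed(0)

DIM = 4  # toy contextual-vector size (assumed; the paper's BLSTM dims differ)

def arc_score(head_vec, mod_vec, W, v):
    """First-order arc score: s(h, m) = v . tanh(W [h; m])."""
    x = head_vec + mod_vec  # concatenate head and modifier vectors
    hidden = [math.tanh(sum(W[i][j] * x[j] for j in range(len(x))))
              for i in range(len(W))]
    return sum(v[i] * hidden[i] for i in range(len(v)))

def parse(vectors, W, v):
    """Greedy first-order decoding: each word independently picks its
    highest-scoring head. Position 0 is the artificial ROOT token."""
    n = len(vectors)
    heads = [0] * n
    for m in range(1, n):
        best, best_score = 0, float("-inf")
        for h in range(n):
            if h == m:
                continue  # a word cannot head itself
            s = arc_score(vectors[h], vectors[m], W, v)
            if s > best_score:
                best, best_score = h, s
        heads[m] = best
    return heads

# Toy "contextual" vectors standing in for BLSTM outputs over ROOT + 3 words.
sent_len = 4
vectors = [[random.uniform(-1, 1) for _ in range(DIM)] for _ in range(sent_len)]
W = [[random.uniform(-0.5, 0.5) for _ in range(2 * DIM)] for _ in range(DIM)]
v = [random.uniform(-0.5, 0.5) for _ in range(DIM)]

heads = parse(vectors, W, v)
print(heads)
```

Because each arc is scored in isolation, richer context must come from the input vectors themselves, which is exactly why the paper feeds BLSTM states into the scorer rather than adding higher-order (sibling or grandparent) factors.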
Related Papers
- Easy-First Chinese POS Tagging and Dependency Parsing (2012)
- Factors influencing dependency parsing of coordinating structure (2009)
- Viable Dependency Parsing as Sequence Labeling (2019)
- A Simulated Shallow Dependency Parser Based on Weighted Hierarchical Structure Learning (2008)
- Concurrent Parsing of Constituency and Dependency (2019)