BERT Rediscovers the Classical NLP Pipeline
ACL 2019, pp. 4593–4601
Among the top 1% most-cited papers of 2019.
Abstract
Pre-trained text encoders have rapidly advanced the state of the art on many NLP tasks. We focus on one such model, BERT, and aim to quantify where linguistic information is captured within the network. We find that the model represents the steps of the traditional NLP pipeline in an interpretable and localizable way, and that the regions responsible for each step appear in the expected sequence: POS tagging, parsing, NER, semantic roles, then coreference. Qualitative analysis reveals that the model can and often does adjust this pipeline dynamically, revising lower-level decisions on the basis of disambiguating information from higher-level representations.
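The localization claim rests on layer-wise probing: train a simple classifier on each layer's representations and see where each linguistic task becomes most linearly decodable. The sketch below is a minimal illustration of that idea, not the paper's exact edge-probing setup (which pools span representations and learns scalar mixing weights over layers); the HuggingFace `transformers` and scikit-learn usage and the toy POS dataset are assumptions for illustration only.

```python
# Minimal layer-wise probing sketch (an illustrative simplification,
# not the paper's edge-probing method): extract BERT's hidden states
# at every layer and fit a linear probe per layer for a toy POS task.
import torch
from transformers import AutoModel, AutoTokenizer
from sklearn.linear_model import LogisticRegression

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")  # fast tokenizer assumed
model = AutoModel.from_pretrained("bert-base-uncased", output_hidden_states=True)
model.eval()

# Tiny made-up dataset: (sentence, word index, POS label).
data = [
    ("The dog barks", 1, "NOUN"),
    ("The dog barks", 2, "VERB"),
    ("A cat sleeps", 1, "NOUN"),
    ("A cat sleeps", 2, "VERB"),
]

def layer_features(sentence, word_idx):
    """Return one vector per layer for the word at word_idx (its first subword)."""
    enc = tokenizer(sentence.split(), is_split_into_words=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).hidden_states  # tuple: embeddings + one tensor per layer
    sub_pos = enc.word_ids(0).index(word_idx)  # first subword of the target word
    return [h[0, sub_pos].numpy() for h in hidden]

X_by_layer, y = None, []
for sent, idx, label in data:
    feats = layer_features(sent, idx)
    if X_by_layer is None:
        X_by_layer = [[] for _ in feats]
    for layer, f in enumerate(feats):
        X_by_layer[layer].append(f)
    y.append(label)

# One linear probe per layer; the layer where held-out probe accuracy
# peaks is where the task is most linearly accessible.
for layer, X in enumerate(X_by_layer):
    probe = LogisticRegression(max_iter=1000).fit(X, y)
    print(f"layer {layer:2d}  train acc = {probe.score(X, y):.2f}")
```

Comparing where each task's probe peaks across layers is what yields the paper's ordering: syntactic tasks such as POS tagging peak early, while semantic tasks such as coreference peak late.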