Noisy Or-based model for Relation Extraction using Distant Supervision
Top 10% of 2014 papers
Abstract
Distant supervision, a paradigm of relation extraction in which training data is created by aligning facts in a database with a large unannotated corpus, is an attractive approach for training relation extractors. Various models have been proposed in the recent literature to align the facts in the database to their mentions in the corpus. In this paper, we discuss and critically analyse a popular alignment strategy called the "at least one" heuristic. We provide a simple yet effective relaxation of this strategy. We formulate the inference procedures in training as integer linear programming (ILP) problems and implement the relaxation of the "at least one" heuristic via a soft constraint in this formulation. Empirically, we demonstrate that this simple strategy leads to better performance over existing approaches under certain settings.
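To make the relaxation concrete, the sketch below illustrates (under our own assumptions, not the paper's exact formulation) how a hard "at least one" constraint can be softened in an ILP: binary variables z_i mark which mentions of an entity pair express the database relation, and a penalized binary slack variable xi lets the solver select no mention when every mention score is sufficiently poor. The mention scores, penalty value, and brute-force solver are illustrative; a real system would use an ILP solver.

```python
from itertools import product

def align_mentions(scores, penalty=2.0):
    """Illustrative brute-force ILP for relaxed at-least-one alignment.

    Maximizes  sum_i(scores[i] * z_i) - penalty * xi
    subject to sum_i(z_i) + xi >= 1,  z_i, xi in {0, 1}.

    scores[i] is an (assumed) classifier score that mention i expresses
    the knowledge-base relation. Returns (selected mention indices, xi).
    A hard "at least one" heuristic is the special case penalty = infinity,
    which forces xi = 0 and hence at least one selected mention.
    """
    best = None
    for z in product([0, 1], repeat=len(scores)):
        for xi in (0, 1):
            if sum(z) + xi < 1:
                continue  # violates the (soft) at-least-one constraint
            obj = sum(s * zi for s, zi in zip(scores, z)) - penalty * xi
            if best is None or obj > best[0]:
                best = (obj, z, xi)
    _, z, xi = best
    return [i for i, zi in enumerate(z) if zi], xi

# Positive-score mentions are selected and the slack stays inactive:
print(align_mentions([0.9, -0.3, 0.4]))   # ([0, 2], 0)
# When every mention scores worse than the penalty, the slack fires
# and no mention is forced to express the relation:
print(align_mentions([-3.0, -4.0]))       # ([], 1)
```

The second call shows the point of the relaxation: a hard at-least-one constraint would force one of the two bad mentions to be selected, while the soft version pays the penalty instead.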