Enhanced Story Representation by ConceptNet for Predicting Story Endings
Abstract
Predicting endings for narrative stories is a grand challenge for machine commonsense reasoning. The task requires accurate representation of the story semantics and structured logic knowledge. Pre-trained language models, such as BERT, have recently made progress on this task, but partly by exploiting spurious statistical patterns in the test dataset rather than 'understanding' the stories per se. In this paper, we propose to improve the representation of stories by first simplifying the sentences to some key concepts and second modeling the latent relationships between the key concepts within the story. Such enhanced sentence representation, when used with pre-trained language models, yields substantial gains in prediction accuracy on the popular Story Cloze Test without utilizing the biased validation data.
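The two-step idea in the abstract — reduce sentences to key concepts, then relate story concepts to candidate-ending concepts through commonsense edges — can be sketched minimally as follows. This is not the paper's method: the relation table is a tiny hand-made stand-in for the ConceptNet graph, and all names (`extract_concepts`, `score_ending`, the example story) are illustrative assumptions.

```python
# Minimal sketch of concept-level ending scoring, assuming a local
# lookup table in place of the full ConceptNet knowledge graph.

STOPWORDS = {"the", "a", "an", "was", "to", "and", "her", "his", "she", "he", "it"}

# Hand-made stand-in for ConceptNet edges: (head, tail) -> relation.
RELATIONS = {
    ("rain", "umbrella"): "UsedFor",
    ("rain", "wet"): "Causes",
    ("umbrella", "dry"): "Causes",
}

def extract_concepts(sentence):
    """Crude key-concept extraction: keep lowercase content words."""
    words = [w.strip(".,").lower() for w in sentence.split()]
    return [w for w in words if w and w not in STOPWORDS]

def score_ending(story_concepts, ending):
    """Count commonsense links between story and ending concepts."""
    ending_concepts = extract_concepts(ending)
    return sum(
        1
        for s in story_concepts
        for e in ending_concepts
        if (s, e) in RELATIONS or (e, s) in RELATIONS
    )

story = "It started to rain. She grabbed an umbrella."
concepts = [c for sent in story.split(". ") for c in extract_concepts(sent)]

good = score_ending(concepts, "She stayed dry.")   # linked via umbrella-Causes-dry
bad = score_ending(concepts, "She bought a car.")  # no commonsense link
```

In the full approach this concept-link signal would augment a pre-trained language model's score rather than replace it; the sketch only illustrates why explicit concept relations can separate endings that surface statistics cannot.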