Enhancing BERT Representation With Context-Aware Embedding for Aspect-Based Sentiment Analysis
Abstract
Aspect-based sentiment analysis, which aims to predict the sentiment polarity for a given aspect or target, is a broad and challenging research area. Recently, pre-trained models such as BERT have been applied to aspect-based sentiment analysis. This fine-grained task needs auxiliary information to distinguish each aspect, but BERT's input is only a word sequence, which cannot provide such extra contextual information. To address this problem, we introduce a new method, named GBCN, which uses a gating mechanism with context-aware aspect embeddings to enhance and control the BERT representation for aspect-based sentiment analysis. First, the input text is fed into BERT and a context-aware embedding layer to generate the BERT representation and refined context-aware embeddings separately; these refined embeddings contain the most relevant information selected from the context. Then, we employ a gating mechanism to control the propagation of sentiment features from the BERT output using the context-aware embeddings. Our model obtains new state-of-the-art results on the SentiHood and SemEval-2014 datasets, achieving test F1 scores of 88.0 and 92.9, respectively.
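The fusion step the abstract describes can be sketched roughly as follows: a sigmoid gate computed from the context-aware embeddings decides, per dimension, how much of the BERT feature passes through. This is a minimal NumPy illustration, not the paper's implementation; the tensor shapes, random stand-in inputs, and the exact gating formula (a convex combination of the two representations) are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

seq_len, hidden = 4, 8  # toy sizes; the real model uses BERT's hidden size

# Stand-ins for the two encoder outputs described in the abstract
# (random here; in the model they come from BERT and the
# context-aware embedding layer).
bert_repr = rng.standard_normal((seq_len, hidden))
context_emb = rng.standard_normal((seq_len, hidden))

# Learnable gate parameters (randomly initialized for this sketch).
W_g = rng.standard_normal((hidden, hidden))
b_g = np.zeros(hidden)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Gate in (0, 1) computed from the context-aware embeddings controls
# how much of each BERT feature dimension propagates forward.
gate = sigmoid(context_emb @ W_g + b_g)
fused = gate * bert_repr + (1.0 - gate) * context_emb

print(fused.shape)  # (4, 8)
```

The fused representation keeps the same shape as the BERT output, so it can feed directly into a downstream sentiment classifier.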