BERT-ASC: Implicit Aspect Representation Learning through Auxiliary-Sentence Construction for Sentiment Analysis

22 Mar 2022 · Murtadha Ahmed, Shengfeng Pan, Jianlin Su, Xinxin Cao, Wenze Zhang, Bo Wen, Yunfeng Liu

The aspect-based sentiment analysis (ABSA) task aims to associate a piece of text with a set of aspects and to infer their respective sentiment polarities. State-of-the-art approaches are built on fine-tuning pre-trained language models and commonly attempt to learn aspect-specific representations from the corpus. Unfortunately, an aspect is often expressed implicitly through a set of representatives, which makes this implicit mapping unattainable unless sufficient labeled examples are available; however, high-quality labeled examples may not be readily available in real-world scenarios. In this paper, we propose to jointly address the aspect-categorization and aspect-based-sentiment subtasks in a unified framework. Specifically, we first introduce a simple but effective mechanism that constructs an auxiliary sentence for the implicit aspect based on the semantic information in the corpus. We then encourage BERT to learn the aspect-specific representation in response to this automatically constructed auxiliary sentence rather than the aspect itself. Finally, we empirically evaluate the proposed solution in a comparative study on real benchmark datasets for both the ABSA and Targeted-ABSA tasks. Extensive experiments show that it consistently achieves state-of-the-art performance in aspect categorization and aspect-based sentiment across all datasets, with considerable improvement margins. The code of BERT-ASC is available on GitHub: https://github.com/amurtadha/BERT-ASC.
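For illustration, here is a minimal sketch of the sentence-pair idea underlying BERT-ASC, written with the Hugging Face transformers library. The auxiliary sentence below is a hypothetical stand-in: the paper constructs it automatically from corpus-level semantic information, a step not reproduced here, and the untrained classification head means the outputs are only illustrative.

```python
# Sketch: pair a review with an (here hand-written) auxiliary sentence and
# encode the pair with BERT, so the model attends to aspect cues rather than
# an explicit aspect term. Not the paper's implementation; see the repo.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=3  # e.g. negative / neutral / positive
)

review = "The waiter ignored us for twenty minutes."
# In BERT-ASC the auxiliary sentence is built from semantically related words
# that act as representatives of the implicit aspect (here: service).
auxiliary = "service waiter staff attitude"

# Standard BERT sentence-pair encoding: [CLS] review [SEP] auxiliary [SEP]
inputs = tokenizer(review, auxiliary, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.softmax(dim=-1))  # untrained head: probabilities are illustrative
```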
