Adapting BERT for Continual Learning of a Sequence of Aspect Sentiment Classification Tasks

NAACL 2021 · Zixuan Ke, Hu Xu, Bing Liu

This paper studies continual learning (CL) of a sequence of aspect sentiment classification (ASC) tasks. Although some CL techniques have been proposed for document-level sentiment classification, we are not aware of any CL work on ASC. A CL system that incrementally learns a sequence of ASC tasks should address two issues: (1) transfer knowledge learned from previous tasks to the new task to help it learn a better model, and (2) maintain the performance of the models for previous tasks so that they are not forgotten. This paper proposes a novel capsule-network-based model called B-CL to address these issues. B-CL markedly improves ASC performance on both the new task and the old tasks via forward and backward knowledge transfer. The effectiveness of B-CL is demonstrated through extensive experiments.
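The two requirements above can be made concrete with a toy sketch of sequential task learning. This is not B-CL (which builds capsule-based adapter layers into BERT); it is a minimal synthetic-data illustration, assuming a single shared linear classifier fine-tuned task by task, where re-evaluating earlier tasks after each new one exposes catastrophic forgetting — the failure mode that issue (2) targets.

```python
import numpy as np

# Toy sketch of continual learning over a task sequence (NOT the B-CL model):
# one shared logistic-regression weight vector is fine-tuned on each task in
# turn; after each task we re-evaluate all tasks seen so far, so drops in the
# earlier entries of each row reveal catastrophic forgetting.
# All tasks here are synthetic binary "sentiment" problems.

def make_task(seed, n=200, d=16):
    r = np.random.default_rng(seed)
    w_true = r.normal(size=d)          # hidden ground-truth direction
    X = r.normal(size=(n, d))
    y = (X @ w_true > 0).astype(float)
    return X, y

def train(w, X, y, lr=0.1, epochs=50):
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))   # sigmoid predictions
        w = w - lr * X.T @ (p - y) / len(y)  # logistic-regression gradient step
    return w

def accuracy(w, X, y):
    return float((((X @ w) > 0).astype(float) == y).mean())

tasks = [make_task(s) for s in range(3)]     # a sequence of 3 tasks
w = np.zeros(16)                             # single shared parameter vector
history = []
for t, (X, y) in enumerate(tasks):
    w = train(w, X, y)
    # accuracy on every task seen so far (earlier columns show forgetting)
    history.append([accuracy(w, Xp, yp) for Xp, yp in tasks[: t + 1]])

print(history)
```

Approaches like B-CL avoid this degradation by isolating task-specific parameters (issue 2) while sharing components that enable knowledge transfer across tasks (issue 1), rather than overwriting one shared weight vector as above.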


Datasets


Introduced in the Paper:

ASC (TIL, 19 tasks)

Used in the Paper:

20Newsgroup (10 tasks)
DSC (10 tasks)
Task                Dataset                 Model  Metric Name  Metric Value  Global Rank
Continual Learning  20Newsgroup (10 tasks)  B-CL   F1 - macro   0.9504        #4
Continual Learning  ASC (19 tasks)          B-CL   F1 - macro   0.8140        #3
Continual Learning  DSC (10 tasks)          B-CL   F1 - macro   0.7651        #5
