Meta-Learning for Natural Language Understanding under Continual Learning Framework

3 Nov 2020  ·  Jiacheng Wang, Yong Fan, Duo Jiang, Shiqing Li

Neural networks have been widely recognized for their accomplishments in tackling various natural language understanding (NLU) tasks. Methods have been developed to train a single robust model across multiple tasks in order to obtain a general representation of text. In this paper, we implement the model-agnostic meta-learning (MAML) and online-aware meta-learning (OML) meta-objectives under a continual learning framework for NLU tasks. We validate our methods on selected SuperGLUE and GLUE benchmark tasks.
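
The paper's own code is not reproduced here. As a rough illustration of the MAML meta-objective it builds on (Finn et al., 2017), the following is a minimal PyTorch sketch on toy sinusoid regression tasks: an inner loop adapts shared parameters on each task's support set, and an outer loop updates those shared parameters against the adapted models' query-set loss. The toy task, model shape, and hyperparameters are illustrative assumptions, not the paper's NLU setup.

    # Minimal MAML sketch (toy regression), not the paper's implementation.
    import torch
    import torch.nn.functional as F

    def forward(params, x):
        # Two-layer MLP applied functionally, so adapted parameter
        # copies can be used in place of the meta-parameters.
        w1, b1, w2, b2 = params
        return torch.tanh(x @ w1 + b1) @ w2 + b2

    def inner_adapt(params, x_s, y_s, inner_lr=0.01):
        # MAML inner loop: one gradient step on the task's support set.
        # create_graph=True keeps the graph for second-order meta-gradients.
        loss = F.mse_loss(forward(params, x_s), y_s)
        grads = torch.autograd.grad(loss, params, create_graph=True)
        return [p - inner_lr * g for p, g in zip(params, grads)]

    # Meta-parameters shared across tasks.
    params = [torch.randn(1, 40, requires_grad=True),
              torch.zeros(40, requires_grad=True),
              0.1 * torch.randn(40, 1).requires_grad_(),
              torch.zeros(1, requires_grad=True)]
    meta_opt = torch.optim.Adam(params, lr=1e-3)

    for step in range(1000):
        meta_loss = 0.0
        for _ in range(4):  # a batch of randomly sampled sinusoid tasks
            amp, phase = torch.rand(1) * 4 + 1, torch.rand(1) * 3
            x_s, x_q = torch.rand(10, 1) * 10 - 5, torch.rand(10, 1) * 10 - 5
            y_s, y_q = amp * torch.sin(x_s + phase), amp * torch.sin(x_q + phase)
            adapted = inner_adapt(params, x_s, y_s)
            # Outer (meta) objective: query-set loss of the adapted parameters.
            meta_loss = meta_loss + F.mse_loss(forward(adapted, x_q), y_q)
        meta_opt.zero_grad()
        meta_loss.backward()
        meta_opt.step()

The OML objective used in the paper differs mainly in its inner loop, which adapts only a prediction network over a sequential stream of examples while a separate representation network is updated only by the outer loop; the meta-update structure above is otherwise analogous.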
