Continual Relation Extraction
8 papers with code • 0 benchmarks • 0 datasets
Compared with traditional relation extraction, continual relation extraction (CRE) aims to help the model learn new relations while maintaining accurate classification of old ones.
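To make the setting concrete, here is a minimal sketch of the CRE protocol with naive episodic replay. The encoder, the synthetic tasks, and names such as MEMORY_PER_RELATION are illustrative stand-ins, not any paper's method; the point is the evaluation, where after each new task the model is scored on all relations seen so far.

# Minimal sketch of the CRE protocol with naive episodic replay.
# All names (Encoder, make_task, MEMORY_PER_RELATION) are illustrative.
import random
import torch
import torch.nn as nn

torch.manual_seed(0)

class Encoder(nn.Module):
    """Stand-in for a sentence encoder (e.g., BERT) producing fixed-size features."""
    def __init__(self, dim=32):
        super().__init__()
        self.net = nn.Linear(16, dim)
    def forward(self, x):
        return self.net(x)

def make_task(relation_ids, n=64):
    """Synthetic task: random features labeled with the task's relation ids."""
    x = torch.randn(n, 16)
    y = torch.tensor([random.choice(relation_ids) for _ in range(n)])
    return x, y

encoder, head = Encoder(), nn.Linear(32, 6)   # 6 relations in total across all tasks
opt = torch.optim.Adam(list(encoder.parameters()) + list(head.parameters()), lr=1e-2)
memory = []                                   # replay buffer of (x, y) from past tasks
MEMORY_PER_RELATION = 8

tasks = [[0, 1], [2, 3], [4, 5]]              # relations arrive in disjoint groups
for relations in tasks:
    x, y = make_task(relations)
    if memory:                                # mix current data with replayed samples
        mx = torch.stack([m[0] for m in memory])
        my = torch.tensor([m[1] for m in memory])
        x, y = torch.cat([x, mx]), torch.cat([y, my])
    for _ in range(100):
        opt.zero_grad()
        loss = nn.functional.cross_entropy(head(encoder(x)), y)
        loss.backward()
        opt.step()
    # store a few exemplars per new relation for future replay
    for r in relations:
        idx = (y == r).nonzero().squeeze(1)[:MEMORY_PER_RELATION]
        memory.extend((x[i], int(y[i])) for i in idx)
    # CRE evaluation: accuracy over *all* relations seen so far, not just the new ones
    seen = [r for t in tasks[: tasks.index(relations) + 1] for r in t]
    with torch.no_grad():
        ex, ey = make_task(seen, n=128)
        acc = (head(encoder(ex)).argmax(1) == ey).float().mean().item()
    print(f"after task {relations}: accuracy on seen relations = {acc:.2f}")

Without the replay buffer, accuracy on earlier relations typically collapses as new tasks arrive; the papers below differ mainly in how they replace or refine this naive memory mechanism.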
Most implemented papers
Curriculum-Meta Learning for Order-Robust Continual Relation Extraction
We propose a novel curriculum-meta learning method to tackle two challenges in continual relation extraction: catastrophic forgetting and sensitivity to the order in which tasks arrive.
Refining Sample Embeddings with Relation Prototypes to Enhance Continual Relation Extraction
As a typical task of continual learning, continual relation extraction (CRE) aims to extract relations between entities from text, where samples of different relations are presented to the model sequentially.
Consistent Representation Learning for Continual Relation Extraction
Specifically, supervised contrastive learning based on a memory bank is first used to train on each new task so that the model can effectively learn relation representations.
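As a rough illustration of that component, here is a minimal sketch of a supervised contrastive loss computed against a memory bank of stored embeddings and labels. It follows the generic SupCon formulation rather than this paper's exact objective, and all names (supcon_memory_loss, bank_feats, tau) are assumptions for the sketch.

# Minimal sketch: supervised contrastive loss against a memory bank.
# Generic SupCon formulation; not the paper's exact method.
import torch
import torch.nn.functional as F

def supcon_memory_loss(feats, labels, bank_feats, bank_labels, tau=0.1):
    """feats: (B, d) current-batch embeddings; bank_*: stored embeddings and labels."""
    feats = F.normalize(feats, dim=1)
    bank_feats = F.normalize(bank_feats, dim=1)
    logits = feats @ bank_feats.t() / tau                             # (B, M) similarities
    pos = (labels.unsqueeze(1) == bank_labels.unsqueeze(0)).float()   # same-relation mask
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)
    # average log-probability over each anchor's positives in the bank
    loss = -(pos * log_prob).sum(1) / pos.sum(1).clamp(min=1)
    return loss.mean()

# toy usage
feats = torch.randn(4, 32)
labels = torch.tensor([0, 1, 0, 2])
bank_feats, bank_labels = torch.randn(20, 32), torch.randint(0, 3, (20,))
print(supcon_memory_loss(feats, labels, bank_feats, bank_labels))

The loss pulls each embedding toward stored embeddings of the same relation and pushes it away from the rest of the bank, which is what lets the bank double as both a replay store and a source of contrastive negatives.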
Learning Robust Representations for Continual Relation Extraction via Adversarial Class Augmentation
Through empirical studies, this paper argues that the common assumption about why CRE models forget may not hold: an important reason for catastrophic forgetting is that the learned representations are not robust to analogous relations that appear in the subsequent learning process.
Enhancing Continual Relation Extraction via Classifier Decomposition
In this work, we point out that two typical biases exist after training with this vanilla strategy: classifier bias and representation bias, which cause the knowledge the model previously learned to be overshadowed.
Improving Continual Relation Extraction by Distinguishing Analogous Semantics
To address the difficulty of distinguishing analogous semantics, we propose a novel continual extraction model for analogous relations.
Rationale-Enhanced Language Models are Better Continual Relation Learners
Continual relation extraction (CRE) aims to solve the problem of catastrophic forgetting when learning a sequence of newly emerging relations.
Preserving Generalization of Language Models in Few-shot Continual Relation Extraction
Few-shot Continual Relation Extraction (FCRE) is an emerging and dynamic area of study in which models sequentially integrate knowledge of new relations from limited labeled data, while avoiding catastrophic forgetting and preserving the prior knowledge of pre-trained backbones.