Search Results for author: Junyi Du

Found 8 papers, 7 papers with code

Gradient-based Editing of Memory Examples for Online Task-free Continual Learning

1 code implementation · NeurIPS 2021 · Xisen Jin, Arka Sadhu, Junyi Du, Xiang Ren

We explore task-free continual learning (CL), in which a model is trained to avoid catastrophic forgetting in the absence of explicit task boundaries or identities.

Continual Learning
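For readers skimming the method: the paper edits stored replay examples with gradient updates so they stay hard for the current model. The snippet below is a minimal, simplified sketch of that idea, assuming a standard PyTorch classifier; the function name and the `stride` step size are illustrative choices, not the authors' released code, and the real method uses a look-ahead update rather than plain ascent on the example's own loss.

```python
import torch
import torch.nn.functional as F

def edit_memory_example(model, x, y, stride=0.1):
    """One gradient-ascent step on a stored example's own loss: a
    simplified caricature of editing replay memory toward examples
    the model is about to forget."""
    x = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x), y)
    grad, = torch.autograd.grad(loss, x)
    # Move the input in the direction that increases its loss,
    # then detach it before storing it back in the memory buffer.
    return (x + stride * grad.sign()).detach()
```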

Visually Grounded Continual Learning of Compositional Phrases

2 code implementations · EMNLP 2020 · Xisen Jin, Junyi Du, Arka Sadhu, Ram Nevatia, Xiang Ren

To study this human-like language acquisition ability, we present VisCOLL, a visually grounded language learning task, which simulates the continual acquisition of compositional phrases from streaming visual scenes.

Continual Learning · Grounded language learning +1
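To make the streaming setup concrete, here is a toy, purely illustrative generator of a VisCOLL-like stream: the set of active object nouns slides gradually, so the distribution over (image, phrase) pairs drifts with no explicit task boundaries. The sliding-window scheme and all names are hypothetical, not the benchmark's actual protocol.

```python
import random

def viscoll_like_stream(examples_by_noun, window=2, draws_per_step=50):
    """Yield (image, phrase) pairs while the active set of nouns slides,
    so the data distribution shifts without task boundaries."""
    nouns = list(examples_by_noun)
    for start in range(len(nouns) - window + 1):
        active = nouns[start:start + window]
        for _ in range(draws_per_step):
            # Sample only from the currently active nouns.
            yield random.choice(examples_by_noun[random.choice(active)])

# e.g. examples_by_noun = {"apple": [("img1.jpg", "a red apple")], ...}
```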

Improving BERT Fine-tuning with Embedding Normalization

no code implementations · 10 Nov 2019 · Wenxuan Zhou, Junyi Du, Xiang Ren

Large pre-trained sentence encoders like BERT start a new chapter in natural language processing.

General Classification · Sentence +2
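The listing gives no method details, so the sketch below shows one plausible reading of "embedding normalization", assuming it means unit-normalizing the sentence ([CLS]) embedding before the classification head during fine-tuning; the module name and the encoder's output convention are our assumptions, not the paper's released code.

```python
import torch.nn as nn
import torch.nn.functional as F

class NormalizedClassifier(nn.Module):
    """Fine-tuning head that L2-normalizes the [CLS] embedding first."""
    def __init__(self, encoder, hidden_size, num_labels):
        super().__init__()
        self.encoder = encoder  # assumed: returns last hidden states first
        self.classifier = nn.Linear(hidden_size, num_labels)

    def forward(self, input_ids, attention_mask):
        hidden = self.encoder(input_ids, attention_mask=attention_mask)[0]
        cls = F.normalize(hidden[:, 0], dim=-1)  # unit-norm [CLS] vector
        return self.classifier(cls)
```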

Towards Hierarchical Importance Attribution: Explaining Compositional Semantics for Neural Sequence Models

2 code implementations ICLR 2020 Xisen Jin, Zhongyu Wei, Junyi Du, xiangyang xue, Xiang Ren

Human and automatic-metric evaluations of both LSTM and BERT Transformer models on multiple datasets show that our algorithms outperform prior hierarchical explanation algorithms.

Semantic Composition
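As rough intuition for hierarchical importance attribution, the sketch below scores each span by occlusion (how much masking it changes the predicted-class logit) and recurses down a binary hierarchy of spans. This is a generic baseline flavor of the idea, not the paper's specific algorithm, and the `model(tokens) -> logits` interface is an assumption.

```python
import torch

def span_score(model, tokens, span, mask_id, label):
    """Occlusion: drop in the label logit when the span is masked out."""
    masked = tokens.clone()
    masked[:, span[0]:span[1]] = mask_id
    with torch.no_grad():
        return (model(tokens)[0, label] - model(masked)[0, label]).item()

def hierarchy(model, tokens, mask_id, label, lo=0, hi=None):
    """Binary tree of span importances, from the whole sequence to words."""
    hi = tokens.size(1) if hi is None else hi
    node = {"span": (lo, hi),
            "score": span_score(model, tokens, (lo, hi), mask_id, label)}
    if hi - lo > 1:
        mid = (lo + hi) // 2  # split the span and score each half
        node["children"] = [hierarchy(model, tokens, mask_id, label, lo, mid),
                            hierarchy(model, tokens, mask_id, label, mid, hi)]
    return node
```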

NERO: A Neural Rule Grounding Framework for Label-Efficient Relation Extraction

2 code implementations · 5 Sep 2019 · Wenxuan Zhou, Hongtao Lin, Bill Yuchen Lin, Ziqi Wang, Junyi Du, Leonardo Neves, Xiang Ren

The soft matching module learns to match rules with semantically similar sentences, so that raw corpora can be automatically labeled (with much broader coverage) and leveraged by the RE module as augmented supervision, in addition to the exactly matched sentences.

Relation · Relation Extraction +1
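To illustrate the soft matching idea in isolation: assuming rules and sentences are embedded in a shared space, a sentence whose cosine similarity to some rule clears a threshold inherits that rule's relation label as weak supervision. The function and threshold below are a hedged sketch, not NERO's actual implementation.

```python
import torch.nn.functional as F

def soft_match(rule_emb, sent_emb, rule_labels, threshold=0.8):
    """rule_emb: [R, d]; sent_emb: [S, d]; rule_labels: [R] relation ids.
    Returns indices, pseudo-labels, and scores for sentences that softly
    match some rule above the threshold."""
    # Pairwise cosine similarity via broadcasting: [S, 1, d] vs [1, R, d].
    sim = F.cosine_similarity(sent_emb.unsqueeze(1), rule_emb.unsqueeze(0), dim=-1)
    score, best = sim.max(dim=1)   # best-matching rule per sentence
    keep = score >= threshold      # keep only confident soft matches
    idx = keep.nonzero(as_tuple=True)[0]
    return idx, rule_labels[best[idx]], score[idx]
```

The pseudo-labeled sentences would then be added to the relation-extraction training set alongside the exactly matched ones.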
