1 code implementation • NeurIPS 2021 • Xisen Jin, Arka Sadhu, Junyi Du, Xiang Ren
We explore task-free continual learning (CL), in which a model is trained to avoid catastrophic forgetting in the absence of explicit task boundaries or identities.
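For illustration, here is a minimal sketch of the task-free CL setting: training examples arrive as a non-stationary stream with no task boundaries or identities, and a small replay buffer maintained by reservoir sampling is one common way to limit forgetting. The model, stream format, and hyperparameters below are assumptions, and this is not the paper's specific method.

```python
# Sketch only: generic experience replay in a task-free stream (NOT the paper's method).
import random
import torch
import torch.nn.functional as F

def train_on_stream(model, optimizer, stream, buffer_size=500, replay_k=8):
    buffer, seen = [], 0
    for x, y in stream:                                   # no task ids are available
        # mix the incoming example with a few replayed ones
        replay = random.sample(buffer, min(replay_k, len(buffer)))
        xs = torch.stack([x] + [bx for bx, _ in replay])
        ys = torch.tensor([y] + [by for _, by in replay])
        loss = F.cross_entropy(model(xs), ys)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        # reservoir sampling keeps a uniform sample of the stream in the buffer
        seen += 1
        if len(buffer) < buffer_size:
            buffer.append((x, y))
        elif random.random() < buffer_size / seen:
            buffer[random.randrange(buffer_size)] = (x, y)
```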
2 code implementations • EMNLP 2020 • Xisen Jin, Junyi Du, Arka Sadhu, Ram Nevatia, Xiang Ren
To study this human-like language acquisition ability, we present VisCOLL, a visually grounded language learning task, which simulates the continual acquisition of compositional phrases from streaming visual scenes.
1 code implementation • EMNLP (nlpbt) 2020 • Frank F. Xu, Lei Ji, Botian Shi, Junyi Du, Graham Neubig, Yonatan Bisk, Nan Duan
Instructional videos are often used to learn about procedures.
no code implementations • 10 Nov 2019 • Wenxuan Zhou, Junyi Du, Xiang Ren
Large pre-trained sentence encoders like BERT start a new chapter in natural language processing.
2 code implementations • ICLR 2020 • Xisen Jin, Zhongyu Wei, Junyi Du, Xiangyang Xue, Xiang Ren
Human and automatic-metric evaluations of both LSTM and BERT Transformer models on multiple datasets show that our algorithms outperform prior hierarchical explanation algorithms.
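As a rough illustration of what a hierarchical (phrase-level) explanation produces, the sketch below scores every contiguous span by the drop in the predicted class probability when that span is masked. This is a generic occlusion baseline, not the algorithm proposed in the paper; the model, tokenization, and `mask_id` are assumptions.

```python
# Sketch: phrase-level importance by occlusion (generic baseline, not the paper's algorithm).
import torch

def span_importance(model, input_ids, target, span, mask_id):
    """span = (start, end) token indices; returns score drop when the span is masked."""
    with torch.no_grad():
        base = model(input_ids.unsqueeze(0)).softmax(-1)[0, target]
        masked = input_ids.clone()
        masked[span[0]:span[1]] = mask_id
        occl = model(masked.unsqueeze(0)).softmax(-1)[0, target]
    return float(base - occl)

def hierarchical_explanation(model, input_ids, target, mask_id):
    """Score all contiguous spans, from single tokens up to the full sequence."""
    n = input_ids.size(0)
    return {(i, j): span_importance(model, input_ids, target, (i, j), mask_id)
            for i in range(n) for j in range(i + 1, n + 1)}
```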
2 code implementations • 5 Sep 2019 • Wenxuan Zhou, Hongtao Lin, Bill Yuchen Lin, Ziqi Wang, Junyi Du, Leonardo Neves, Xiang Ren
The soft matching module learns to match rules with semantically similar sentences, so that raw corpora can be automatically labeled and leveraged by the RE module as augmented supervision (with much better coverage), in addition to the exactly matched sentences.
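A simplified sketch of the soft-matching idea: sentences that do not exactly match any labeling rule are pseudo-labeled with the relation of their most similar rule, using embedding similarity. The `encode` function and the threshold are hypothetical stand-ins; this is not the paper's exact model, only an illustration of the weak-supervision mechanism.

```python
# Sketch: soft rule matching for weakly supervised relation extraction (illustrative only).
import numpy as np

def soft_label(sentences, rules, encode, threshold=0.8):
    """rules: list of (pattern_text, relation); encode: text -> unit-norm vector."""
    rule_vecs = np.stack([encode(pattern) for pattern, _ in rules])
    labeled = []
    for sent in sentences:
        v = encode(sent)
        sims = rule_vecs @ v                      # cosine similarity (unit vectors)
        best = int(np.argmax(sims))
        if sims[best] >= threshold:               # soft match -> pseudo-label
            labeled.append((sent, rules[best][1], float(sims[best])))
    return labeled                                # used as augmented supervision for the RE module
```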
1 code implementation • ACL 2019 • Junyi Du, He Jiang, Jiaming Shen, Xiang Ren
To reduce human efforts and scale the process, automated CTA transcript parsing is desirable.