no code implementations • 20 Jun 2023 • Sidi Lu, Wenbo Zhao, Chenyang Tao, Arpit Gupta, Shanchan Wu, Tagyoung Chung, Nanyun Peng
NeurAlly-Decomposed Oracle (NADO) is a powerful approach for controllable generation with large language models.
no code implementations • 28 Apr 2020 • Shanchan Wu, Kai Fan
The transition is parameterized by a non-linear transformation between hidden layers that implicitly models the conversion between true and noisy labels, and it can be optimized jointly with the other model parameters.
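A toy numpy sketch of this idea, not the paper's implementation: the dimensions, the softmax-normalized transition matrix, and all variable names are assumptions for illustration. The base classifier predicts a distribution over true labels, and a trainable transition component maps it to a distribution over the noisy labels that are actually observed.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
num_classes = 4
hidden = rng.normal(size=(2, 8))  # batch of hidden representations

# Base classifier: hidden state -> distribution over *true* labels
W = rng.normal(size=(8, num_classes))
p_true = softmax(hidden @ W)

# Transition component: a trainable matrix pushed through a row-wise
# softmax, so row i is P(noisy label | true label i). In the paper this
# role is played by a non-linear transformation trained jointly with the
# rest of the model; a single matrix is the simplest stand-in.
T_logits = rng.normal(size=(num_classes, num_classes))
T = softmax(T_logits, axis=1)

# Predicted distribution over *noisy* labels: marginalize over true labels
p_noisy = p_true @ T
```

Training would fit the cross-entropy between `p_noisy` and the observed noisy labels, letting gradients flow through both `T_logits` and `W`.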
6 code implementations • 20 May 2019 • Shanchan Wu, Yifan He
In this paper, we propose a model that both leverages the pre-trained BERT language model and incorporates information from the target entities to tackle the relation classification task.
Ranked #15 on Relation Extraction on SemEval-2010 Task-8
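A minimal numpy sketch of combining a pre-trained encoder's sentence representation with target-entity information (the entity spans, dimensions, and the single linear head here are illustrative assumptions; the paper additionally passes each component through activation and dense layers before classification):

```python
import numpy as np

rng = np.random.default_rng(1)
d, num_relations = 16, 5

# Token-level hidden states from a pre-trained encoder (random stand-ins
# here; in the paper these come from BERT run over a sentence in which
# the two target entities are surrounded by special marker tokens).
seq_len = 10
H = rng.normal(size=(seq_len, d))
e1_span, e2_span = slice(2, 4), slice(6, 9)  # assumed entity token positions

def entity_vector(H, span):
    # Summarize an entity by averaging the hidden states of its tokens
    return H[span].mean(axis=0)

# Combine the sentence-level [CLS] vector (position 0) with both
# averaged entity vectors into one feature vector
features = np.concatenate(
    [H[0], entity_vector(H, e1_span), entity_vector(H, e2_span)]
)

# Relation classifier head over the combined features
W = rng.normal(size=(3 * d, num_relations))
logits = features @ W
predicted = int(np.argmax(logits))
```

The design point is that the relation label depends on the specific entity pair, so span-pooled entity vectors are concatenated with the sentence vector rather than relying on the sentence representation alone.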
no code implementations • 14 Nov 2018 • Shanchan Wu, Kai Fan, Qiong Zhang
Distantly supervised relation extraction has been successfully applied to large corpora with thousands of relations.