Search Results for author: Shanchan Wu

Found 4 papers, 1 paper with code

On Compositionality and Improved Training of NADO

no code implementations20 Jun 2023 Sidi Lu, Wenbo Zhao, Chenyang Tao, Arpit Gupta, Shanchan Wu, Tagyoung Chung, Nanyun Peng

NeurAlly-Decomposed Oracle (NADO) is a powerful approach for controllable generation with large language models.

A Practical Framework for Relation Extraction with Noisy Labels Based on Doubly Transitional Loss

no code implementations28 Apr 2020 Shanchan Wu, Kai Fan

One transition is parameterized by a non-linear transformation between hidden layers that implicitly represents the conversion between the true and the noisy labels; it can be optimized jointly with the other model parameters.

Relation Extraction
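The transition described above can be illustrated with a minimal PyTorch sketch. This is a hypothetical reconstruction, not the authors' code: a small non-linear layer maps the model's predicted distribution over true labels to a distribution over the observed noisy labels, and its parameters are trained jointly with the base classifier.

```python
import torch
import torch.nn as nn


class NoisyTransition(nn.Module):
    """Hypothetical sketch of one transition: a learned non-linear map
    from the distribution over true labels to the distribution over
    observed (noisy) labels. Class and argument names are illustrative."""

    def __init__(self, num_labels: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(num_labels, hidden),
            nn.Tanh(),
            nn.Linear(hidden, num_labels),
        )

    def forward(self, clean_logits: torch.Tensor) -> torch.Tensor:
        # clean_logits: (batch, num_labels) scores for the true labels.
        # Convert to probabilities, pass through the non-linear map,
        # and return log-probabilities over the noisy labels, so the
        # usual NLL loss against the noisy annotations applies directly.
        noisy_scores = self.net(torch.softmax(clean_logits, dim=-1))
        return torch.log_softmax(noisy_scores, dim=-1)
```

Because the layer is differentiable end-to-end, the noisy-label loss backpropagates through it into the underlying relation extractor, which is what allows the conversion to be "readily optimized together with other model parameters."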

Enriching Pre-trained Language Model with Entity Information for Relation Classification

6 code implementations20 May 2019 Shanchan Wu, Yifan He

In this paper, we propose a model that both leverages the pre-trained BERT language model and incorporates information from the target entities to tackle the relation classification task.

General Classification · Language Modelling +3
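The idea of combining the pre-trained language model's sentence representation with target-entity information can be sketched as follows. This is an illustrative head only (all names are hypothetical); in the actual model the hidden states would come from a pre-trained BERT encoder rather than being passed in directly.

```python
import torch
import torch.nn as nn


class EntityAwareRelationHead(nn.Module):
    """Illustrative classification head: fuses the sentence-level [CLS]
    vector with the averaged hidden states of the two target-entity
    spans, then predicts the relation. Names are hypothetical."""

    def __init__(self, hidden_size: int, num_relations: int):
        super().__init__()
        self.cls_fc = nn.Linear(hidden_size, hidden_size)
        self.ent_fc = nn.Linear(hidden_size, hidden_size)
        self.classifier = nn.Linear(hidden_size * 3, num_relations)
        self.act = nn.Tanh()

    def forward(self, hidden: torch.Tensor,
                e1_mask: torch.Tensor, e2_mask: torch.Tensor) -> torch.Tensor:
        # hidden: (batch, seq_len, hidden_size) encoder outputs.
        # e1_mask / e2_mask: (batch, seq_len) 0/1 masks over entity tokens.
        def span_average(mask: torch.Tensor) -> torch.Tensor:
            m = mask.unsqueeze(-1).float()
            return (hidden * m).sum(dim=1) / m.sum(dim=1).clamp(min=1.0)

        h_cls = self.act(self.cls_fc(hidden[:, 0]))      # sentence vector
        h_e1 = self.act(self.ent_fc(span_average(e1_mask)))
        h_e2 = self.act(self.ent_fc(span_average(e2_mask)))
        return self.classifier(torch.cat([h_cls, h_e1, h_e2], dim=-1))
```

Concatenating the sentence vector with per-entity vectors is one simple way to let the classifier attend to both the overall context and the two target entities.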
