Search Results for author: Ningyu Zhang

Found 35 papers, 19 papers with code

SentiPrompt: Sentiment Knowledge Enhanced Prompt-Tuning for Aspect-Based Sentiment Analysis

no code implementations • 17 Sep 2021 • Chengxi Li, Feiyu Gao, Jiajun Bu, Lu Xu, Xiang Chen, Yu Gu, Zirui Shao, Qi Zheng, Ningyu Zhang, Yongpan Wang, Zhi Yu

We inject sentiment knowledge regarding aspects, opinions, and polarities into the prompt and explicitly model term relations by constructing consistency and polarity judgment templates from the ground-truth triplets.
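A minimal illustrative sketch of this template construction, with hypothetical wording (the verbalizations below are assumptions, not SentiPrompt's exact templates):

```python
# Hypothetical sketch: build consistency and polarity judgment prompts from a
# ground-truth (aspect, opinion, polarity) triplet. Wording is an assumption.
from typing import List, Tuple

def build_prompts(sentence: str, triplet: Tuple[str, str, str]) -> List[Tuple[str, str]]:
    aspect, opinion, polarity = triplet
    # Consistency judgment: does this opinion term actually describe this aspect?
    consistency = (f"{sentence} Is the {aspect} {opinion}? [MASK]", "yes")
    # Polarity judgment: which sentiment does the (aspect, opinion) pair express?
    polarity_judgment = (f"{sentence} The {aspect} is {opinion}. It was [MASK].", polarity)
    return [consistency, polarity_judgment]  # (template, expected [MASK] answer) pairs

# Example with one review sentence and one gold triplet.
for template, answer in build_prompts(
    "The pasta was great but the service was slow.",
    ("pasta", "great", "positive"),
):
    print(template, "->", answer)
```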

Aspect-Based Sentiment Analysis Language Modelling

LightNER: A Lightweight Generative Framework with Prompt-guided Attention for Low-resource NER

no code implementations • 31 Aug 2021 • Xiang Chen, Ningyu Zhang, Lei LI, Xin Xie, Shumin Deng, Chuanqi Tan, Fei Huang, Luo Si, Huajun Chen

Most existing NER methods rely on extensive labeled data for model training and therefore struggle in low-resource scenarios with limited training data.

Few-Shot Learning Language Modelling +2

Differentiable Prompt Makes Pre-trained Language Models Better Few-shot Learners

no code implementations • 30 Aug 2021 • Ningyu Zhang, Luoqiu Li, Xiang Chen, Shumin Deng, Zhen Bi, Chuanqi Tan, Fei Huang, Huajun Chen

Large-scale pre-trained language models have contributed significantly to natural language processing by demonstrating remarkable abilities as few-shot learners.

Language Modelling

Document-level Relation Extraction as Semantic Segmentation

2 code implementations • 7 Jun 2021 • Ningyu Zhang, Xiang Chen, Xin Xie, Shumin Deng, Chuanqi Tan, Mosha Chen, Fei Huang, Luo Si, Huajun Chen

Specifically, we leverage an encoder module to capture the contextual information of entities and a U-shaped segmentation module over the image-style feature map to capture global interdependencies among triples.
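A rough PyTorch sketch of that U-shaped idea, under assumed layer sizes and a pre-built entity-pair feature map (not the paper's exact architecture):

```python
# Sketch only: treat the N x N entity-pair matrix as an image and run a small
# U-shaped encoder-decoder over it to capture global interdependency among
# triples. Channel sizes and the feature-map construction are assumptions.
import torch
import torch.nn as nn

class UShapedPairModule(nn.Module):
    def __init__(self, pair_dim: int, hidden: int, num_relations: int):
        super().__init__()
        self.enc1 = nn.Sequential(nn.Conv2d(pair_dim, hidden, 3, padding=1), nn.ReLU())
        self.pool = nn.MaxPool2d(2)
        self.enc2 = nn.Sequential(nn.Conv2d(hidden, hidden * 2, 3, padding=1), nn.ReLU())
        self.up = nn.ConvTranspose2d(hidden * 2, hidden, kernel_size=2, stride=2)
        self.dec = nn.Sequential(nn.Conv2d(hidden * 2, hidden, 3, padding=1), nn.ReLU())
        self.head = nn.Conv2d(hidden, num_relations, 1)  # per-pair relation logits

    def forward(self, pair_map: torch.Tensor) -> torch.Tensor:
        # pair_map: (batch, pair_dim, N, N) features for every entity pair
        skip = self.enc1(pair_map)                      # (B, hidden, N, N)
        deep = self.enc2(self.pool(skip))               # (B, 2*hidden, N/2, N/2)
        up = self.up(deep)                              # back to (B, hidden, N, N)
        fused = self.dec(torch.cat([up, skip], dim=1))  # U-Net-style skip connection
        return self.head(fused)

# Toy usage: 6 entities (N kept even for the pool/upsample round trip),
# 128-dim pair features, 4 relation types -> logits of shape (1, 4, 6, 6).
model = UShapedPairModule(pair_dim=128, hidden=64, num_relations=4)
logits = model(torch.randn(1, 128, 6, 6))
```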

Document-level Relation Extraction +1

Field Embedding: A Unified Grain-Based Framework for Word Representation

no code implementations • NAACL 2021 • Junjie Luo, Xi Chen, Jichao Sun, Yuejia Xiang, Ningyu Zhang, Xiang Wan

Word representations empowered with additional linguistic information have been widely studied and shown to outperform traditional embeddings.

Word Embeddings

OntoED: Low-resource Event Detection with Ontology Embedding

1 code implementation • ACL 2021 • Shumin Deng, Ningyu Zhang, Luoqiu Li, Hui Chen, Huaixiao Tou, Mosha Chen, Fei Huang, Huajun Chen

Most current event detection (ED) methods rely heavily on training instances and almost ignore the correlation of event types.

Event Detection

Unsupervised Knowledge Graph Alignment by Probabilistic Reasoning and Semantic Embedding

1 code implementation • 12 May 2021 • Zhiyuan Qi, Ziheng Zhang, Jiaoyan Chen, Xi Chen, Yuejia Xiang, Ningyu Zhang, Yefeng Zheng

Knowledge Graph (KG) alignment aims to discover the mappings (i.e., equivalent entities, relations, and others) between two KGs.

Interventional Aspect-Based Sentiment Analysis

no code implementations • 20 Apr 2021 • Zhen Bi, Ningyu Zhang, Ganqiang Ye, Haiyang Yu, Xi Chen, Huajun Chen

Recent neural aspect-based sentiment analysis approaches, though achieving promising improvements on benchmark datasets, have been reported to suffer from poor robustness when encountering confounders such as non-target aspects.

Aspect-Based Sentiment Analysis

Disentangled Contrastive Learning for Learning Robust Textual Representations

1 code implementation • 11 Apr 2021 • Xiang Chen, Xin Xie, Zhen Bi, Hongbin Ye, Shumin Deng, Ningyu Zhang, Huajun Chen

Although the self-supervised pre-training of transformer models has revolutionized natural language processing (NLP) applications and achieved state-of-the-art results on various benchmarks, this process is still vulnerable to small and imperceptible perturbations originating from legitimate inputs.

Contrastive Learning

Normal vs. Adversarial: Salience-based Analysis of Adversarial Samples for Relation Extraction

1 code implementation • 1 Apr 2021 • Luoqiu Li, Xiang Chen, Ningyu Zhang, Shumin Deng, Xin Xie, Chuanqi Tan, Mosha Chen, Fei Huang, Huajun Chen

Recent neural relation extraction approaches, though achieving promising improvements on benchmark datasets, have been reported to be vulnerable to adversarial attacks.

Relation Extraction

Towards Robust Textual Representations with Disentangled Contrastive Learning

no code implementations • 1 Jan 2021 • Ningyu Zhang, Xiang Chen, Xin Xie, Shumin Deng, Yantao Jia, Zonggang Yuan, Huajun Chen

Although the self-supervised pre-training of transformer models has revolutionized natural language processing (NLP) applications and achieved state-of-the-art results on various benchmarks, this process is still vulnerable to small and imperceptible perturbations originating from legitimate inputs.

Contrastive Learning

Bridging Text and Knowledge with Multi-Prototype Embedding for Few-Shot Relational Triple Extraction

no code implementations • COLING 2020 • Haiyang Yu, Ningyu Zhang, Shumin Deng, Hongbin Ye, Wei Zhang, Huajun Chen

Current supervised relational triple extraction approaches require huge amounts of labeled data and thus suffer from poor performance in few-shot settings.

Finding Influential Instances for Distantly Supervised Relation Extraction

no code implementations • 17 Sep 2020 • Zifeng Wang, Rui Wen, Xi Chen, Shao-Lun Huang, Ningyu Zhang, Yefeng Zheng

Distant supervision has been demonstrated to be highly beneficial for enhancing relation extraction models, but it often suffers from high label noise.

Relation Extraction

The Devil is the Classifier: Investigating Long Tail Relation Classification with Decoupling Analysis

1 code implementation • 15 Sep 2020 • Haiyang Yu, Ningyu Zhang, Shumin Deng, Zonggang Yuan, Yantao Jia, Huajun Chen

Long-tailed relation classification is a challenging problem because the head classes may dominate the training phase, thereby degrading tail performance.

General Classification Relation Classification

On Robustness and Bias Analysis of BERT-based Relation Extraction

1 code implementation • 14 Sep 2020 • Luoqiu Li, Xiang Chen, Hongbin Ye, Zhen Bi, Shumin Deng, Ningyu Zhang, Huajun Chen

Fine-tuning pre-trained models has achieved impressive performance on standard natural language processing benchmarks.

Relation Extraction

Conceptualized Representation Learning for Chinese Biomedical Text Mining

1 code implementation • 25 Aug 2020 • Ningyu Zhang, Qianghuai Jia, Kangping Yin, Liang Dong, Feng Gao, Nengwei Hua

In this paper, we investigate how the recently introduced pre-trained language model BERT can be adapted for Chinese biomedical corpora and propose a novel conceptualized representation learning approach.

Language Modelling Representation Learning

Relation Adversarial Network for Low Resource Knowledge Graph Completion

1 code implementation • 8 Nov 2019 • Ningyu Zhang, Shumin Deng, Zhanlin Sun, Jiaoyan Chen, Wei Zhang, Huajun Chen

Specifically, the framework takes advantage of a relation discriminator to distinguish between samples from different relations, and helps learn relation-invariant features that are more transferable from source relations to target relations.
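A hedged sketch of how such a relation discriminator could be wired to a gradient reversal layer (an illustration under assumptions; the paper's exact adversarial setup may differ):

```python
# Sketch only: a gradient reversal layer plus a relation discriminator.
# Training the discriminator to tell relations apart while reversing its
# gradients pushes the shared encoder toward relation-invariant features.
import torch
import torch.nn as nn

class GradReverse(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        # Flip (and scale) gradients flowing back into the feature extractor.
        return -ctx.lambd * grad_output, None

class RelationDiscriminator(nn.Module):
    def __init__(self, feat_dim: int, num_relations: int, lambd: float = 1.0):
        super().__init__()
        self.lambd = lambd
        self.net = nn.Sequential(
            nn.Linear(feat_dim, feat_dim), nn.ReLU(),
            nn.Linear(feat_dim, num_relations),  # which relation produced this sample?
        )

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        return self.net(GradReverse.apply(features, self.lambd))

# Toy usage: 256-dim sentence features, 10 source relations. Minimizing
# cross-entropy on these logits trains the adversary, while the reversed
# gradients discourage relation-specific features in the shared encoder.
disc = RelationDiscriminator(feat_dim=256, num_relations=10)
logits = disc(torch.randn(8, 256))
```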

Knowledge Graph Completion Link Prediction +2

Meta-Learning with Dynamic-Memory-Based Prototypical Network for Few-Shot Event Detection

1 code implementation • 25 Oct 2019 • Shumin Deng, Ningyu Zhang, Jiaojian Kang, Yichi Zhang, Wei Zhang, Huajun Chen

Unlike vanilla prototypical networks, which compute event prototypes by simple averaging and consume each event mention only once, our model is more robust and can distill contextual information from event mentions multiple times thanks to the multi-hop mechanism of DMNs.
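An illustrative sketch of that contrast (not the paper's exact dynamic memory network): start from the averaged prototype, then re-read the mention encodings over several hops.

```python
# Sketch only: refine an event prototype over several attention hops so each
# support mention is consumed more than once. Hop count and the dot-product
# scoring are assumptions for illustration.
import torch
import torch.nn.functional as F

def multi_hop_prototype(mentions: torch.Tensor, hops: int = 3) -> torch.Tensor:
    # mentions: (num_mentions, dim) encodings of support event mentions
    prototype = mentions.mean(dim=0)          # hop 0: vanilla averaging
    for _ in range(hops):
        scores = mentions @ prototype         # attend to mentions with current memory
        weights = F.softmax(scores, dim=0)
        prototype = weights @ mentions        # updated, context-aware prototype
    return prototype

proto = multi_hop_prototype(torch.randn(5, 64))  # -> (64,)
```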

Event Detection Event Extraction +1

Context-aware Deep Model for Entity Recommendation in Search Engine at Alibaba

no code implementations • 6 Sep 2019 • Qianghuai Jia, Ningyu Zhang, Nengwei Hua

Entity recommendation, which improves the search experience by helping users find related entities for a given query, has become an indispensable feature of today's search engines.

Transfer Learning for Relation Extraction via Relation-Gated Adversarial Learning

no code implementations • 22 Aug 2019 • Ningyu Zhang, Shumin Deng, Zhanlin Sun, Jiaoyan Chen, Wei Zhang, Huajun Chen

However, human annotation is expensive, human-crafted patterns suffer from semantic drift, and distantly supervised samples are usually noisy.

Partial Domain Adaptation Relation Extraction +1
