Search Results for author: Heyan Huang

Found 43 papers, 22 papers with code

Enlivening Redundant Heads in Multi-head Self-attention for Machine Translation

no code implementations EMNLP 2021 Tianfu Zhang, Heyan Huang, Chong Feng, Longbing Cao

Multi-head self-attention has recently attracted enormous interest owing to its specialized functions, significant parallelizable computation, and flexible extensibility.

Machine Translation Translation
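For context on the mechanism this entry analyzes, here is a minimal PyTorch sketch of standard multi-head self-attention, showing the per-head parallel computation the abstract refers to; it is background only, not the authors' head-enlivening method, and the weight shapes are illustrative.

```python
import torch
import torch.nn.functional as F

def multi_head_self_attention(x, w_q, w_k, w_v, w_o, num_heads):
    """Standard multi-head self-attention over x of shape (batch, seq, d_model)."""
    batch, seq, d_model = x.shape
    d_head = d_model // num_heads

    def split_heads(t):
        # (batch, seq, d_model) -> (batch, heads, seq, d_head)
        return t.view(batch, seq, num_heads, d_head).transpose(1, 2)

    q, k, v = split_heads(x @ w_q), split_heads(x @ w_k), split_heads(x @ w_v)

    # Scaled dot-product attention, computed for all heads in parallel
    scores = q @ k.transpose(-2, -1) / d_head ** 0.5
    out = F.softmax(scores, dim=-1) @ v            # (batch, heads, seq, d_head)

    # Concatenate the heads and apply the output projection
    out = out.transpose(1, 2).reshape(batch, seq, d_model)
    return out @ w_o

# Toy usage: 2 sentences, 5 tokens, model dim 64, 8 heads, random (untrained) weights
d_model, heads = 64, 8
x = torch.randn(2, 5, d_model)
w_q, w_k, w_v, w_o = (torch.randn(d_model, d_model) * d_model ** -0.5 for _ in range(4))
print(multi_head_self_attention(x, w_q, w_k, w_v, w_o, heads).shape)  # torch.Size([2, 5, 64])
```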

Boosting Event Extraction with Denoised Structure-to-Text Augmentation

no code implementations 16 May 2023 Bo Wang, Heyan Huang, Xiaochi Wei, Ge Shi, Xiao Liu, Chong Feng, Tong Zhou, Shuaiqiang Wang, Dawei Yin

Event extraction aims to recognize pre-defined event triggers and arguments from texts, a task that suffers from the lack of high-quality annotations.

Event Extraction Text Augmentation +1

Measuring Cross-Lingual Transferability of Multilingual Transformers on Sentence Classification

no code implementations 15 May 2023 Zewen Chi, Heyan Huang, Xian-Ling Mao

Recent studies have exhibited remarkable capabilities of pre-trained multilingual Transformers, especially cross-lingual transferability.

Cross-Lingual Transfer Sentence Classification

AttenWalker: Unsupervised Long-Document Question Answering via Attention-based Graph Walking

1 code implementation 3 May 2023 Yuxiang Nie, Heyan Huang, Wei Wei, Xian-Ling Mao

To alleviate the problem, it might be possible to generate long-document QA pairs via unsupervised question answering (UQA) methods.

Few-Shot Learning Question Answering

Momentum Decoding: Open-ended Text Generation As Graph Exploration

1 code implementation 5 Dec 2022 Tian Lan, Yixuan Su, Shuhang Liu, Heyan Huang, Xian-Ling Mao

In this study, we formulate open-ended text generation from a new perspective, i.e., we view it as an exploration process within a directed graph.

Text Generation
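To make the graph view concrete, here is a toy sketch that records each generated transition as a directed edge and penalizes candidates that would re-traverse an existing edge (i.e., re-enter a visited region of the graph). It only illustrates the perspective described in the abstract; the paper's actual momentum decoding algorithm is not reproduced here.

```python
from collections import defaultdict
import torch

def graph_aware_greedy_step(logits, generated, penalty=2.0):
    """One greedy decoding step that treats generation as walking a directed token graph.

    Toy illustration only: nodes are token ids and edges are transitions already
    taken in `generated`; re-traversing an edge corresponds to repetition, so
    such candidates have their logits reduced before the argmax.
    """
    edges = defaultdict(set)
    for prev, nxt in zip(generated, generated[1:]):
        edges[prev].add(nxt)

    logits = logits.clone()
    for candidate in edges.get(generated[-1], ()):  # edges leaving the current node
        logits[candidate] -= penalty
    return int(torch.argmax(logits))
```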

ConsPrompt: Easily Exploiting Contrastive Samples for Few-shot Prompt Learning

no code implementations 8 Nov 2022 Jinta Weng, Yue Hu, Zhihong Tian, Heyan Huang

The effectiveness of the proposed ConsPrompt is demonstrated on five different few-shot learning tasks, and the similarity-based sampling strategy is shown to be more effective than the label-based one when combined with contrastive learning.

Contrastive Learning Few-Shot Learning
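A rough sketch of what similarity-based sampling of contrastive examples can look like is below; the function name and the use of cosine similarity over sentence embeddings are illustrative assumptions, not ConsPrompt's exact procedure.

```python
import numpy as np

def similarity_based_samples(anchor_vec, pool_vecs, k=2):
    """Pick the k pool examples most similar to the anchor as contrastive samples.

    Hypothetical helper: unlike label-based sampling, the labels of the pool
    examples are never consulted, only their embedding similarity to the anchor.
    """
    a = anchor_vec / np.linalg.norm(anchor_vec)
    p = pool_vecs / np.linalg.norm(pool_vecs, axis=1, keepdims=True)
    return np.argsort(-(p @ a))[:k]   # indices of the k nearest examples
```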

Revisiting Grammatical Error Correction Evaluation and Beyond

1 code implementation 3 Nov 2022 Peiyuan Gong, Xuebo Liu, Heyan Huang, Min Zhang

Pretraining-based (PT-based) automatic evaluation metrics (e.g., BERTScore and BARTScore) have been widely used in several sentence generation tasks (e.g., machine translation and text summarization) due to their better correlation with human judgments over traditional overlap-based methods.

Grammatical Error Correction Machine Translation +1
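As background on the PT-based metrics the abstract mentions, a minimal usage sketch of the bert-score package is shown below; it demonstrates the metric itself, not the paper's GEC-specific evaluation protocol.

```python
# pip install bert-score
from bert_score import score

candidates = ["The cat sit on the mat ."]   # system output
references = ["The cat sat on the mat ."]   # human reference

# Precision, recall, and F1 computed by matching contextual embeddings
P, R, F1 = score(candidates, references, lang="en", verbose=False)
print(f"BERTScore F1: {F1.mean().item():.4f}")
```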

ET5: A Novel End-to-end Framework for Conversational Machine Reading Comprehension

1 code implementation COLING 2022 Xiao Zhang, Heyan Huang, Zewen Chi, Xian-Ling Mao

Conversational machine reading comprehension (CMRC) aims to assist computers in understanding a natural language text and thereafter engaging in a multi-turn conversation to answer questions related to the text.

Decision Making Machine Reading Comprehension

Unsupervised Hashing with Semantic Concept Mining

1 code implementation 23 Sep 2022 Rong-Cheng Tu, Xian-Ling Mao, Kevin Qinghong Lin, Chengfei Cai, Weize Qin, Hongfa Wang, Wei Wei, Heyan Huang

Recently, to improve unsupervised image retrieval performance, many unsupervised hashing methods have been proposed that design a semantic similarity matrix based on the similarities between image features extracted by a pre-trained CNN model.

Image Retrieval Prompt Engineering +4
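The common recipe the abstract refers to, building a similarity matrix from pre-trained CNN features, can be sketched as follows; the cosine threshold and the feature dimensionality are illustrative assumptions rather than values from the paper.

```python
import numpy as np

def semantic_similarity_matrix(features, threshold=0.6):
    """Threshold pairwise cosine similarities of CNN features into a +1/-1 matrix."""
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    cosine = f @ f.T                              # pairwise cosine similarity
    return np.where(cosine >= threshold, 1.0, -1.0)

# e.g. 1000 images described by 2048-d features from a pre-trained ResNet
sim = semantic_similarity_matrix(np.random.randn(1000, 2048).astype(np.float32))
```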

Unsupervised Question Answering via Answer Diversifying

1 code implementation COLING 2022 Yuxiang Nie, Heyan Huang, Zewen Chi, Xian-Ling Mao

Previous works usually make use of heuristic rules as well as pre-trained models to construct data and train QA models.

Data Augmentation Denoising +3

Relational Triple Extraction: One Step is Enough

no code implementations 11 May 2022 Yu-Ming Shang, Heyan Huang, Xin Sun, Wei Wei, Xian-Ling Mao

Extracting relational triples from unstructured text is an essential task in natural language processing and knowledge graph construction.

graph construction

$G^2$: Enhance Knowledge Grounded Dialogue via Ground Graph

no code implementations 27 Apr 2022 Yizhe Yang, Yang Gao, Jiawei Li, Heyan Huang

In addition, a Ground Graph Aware Transformer ($G^2AT$) is proposed to enhance knowledge-grounded response generation.

Response Generation

Cross-Lingual Phrase Retrieval

1 code implementation ACL 2022 Heqi Zheng, Xiao Zhang, Zewen Chi, Heyan Huang, Tan Yan, Tian Lan, Wei Wei, Xian-Ling Mao

In this paper, we propose XPR, a cross-lingual phrase retriever that extracts phrase representations from unlabeled example sentences.

Retrieval
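A generic sketch of extracting a phrase representation from contextual token embeddings and retrieving by cosine similarity is given below; mean-pooling over the span is an assumption for illustration and may not match XPR's actual architecture.

```python
import torch

def phrase_representation(token_embeddings, span):
    """Mean-pool contextual token embeddings over an inclusive (start, end) token span."""
    start, end = span
    return token_embeddings[start : end + 1].mean(dim=0)

def retrieve(query_vec, index_vecs, k=5):
    """Nearest-neighbour search over L2-normalized phrase vectors."""
    q = query_vec / query_vec.norm()
    idx = index_vecs / index_vecs.norm(dim=1, keepdim=True)
    return torch.topk(idx @ q, k).indices
```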

Hammer PDF: An Intelligent PDF Reader for Scientific Papers

no code implementations 6 Apr 2022 Sheng-Fu Wang, Shu-Hang Liu, Tian-Yi Che, Yi-Fan Lu, Song-Xiao Yang, Heyan Huang, Xian-Ling Mao

Specifically, because they treat a paper as a basic and separate unit, existing PDF readers cannot access extended information about the paper, such as corresponding videos, blogs, and code.

Efficient Non-Autoregressive GAN Voice Conversion using VQWav2vec Features and Dynamic Convolution

1 code implementation 31 Mar 2022 Mingjie Chen, Yanghao Zhou, Heyan Huang, Thomas Hain

It was recently shown that a combination of ASR and TTS models yields highly competitive performance on standard voice conversion tasks such as the Voice Conversion Challenge 2020 (VCC2020).

Voice Conversion

TCM-SD: A Benchmark for Probing Syndrome Differentiation via Natural Language Processing

1 code implementation CCL 2022 Mucheng Ren, Heyan Huang, Yuxiang Zhou, Qianwen Cao, Yuan Bu, Yang Gao

Therefore, in this paper, we focus on the core task of the TCM diagnosis and treatment system -- syndrome differentiation (SD) -- and we introduce the first public large-scale dataset for SD, called TCM-SD.

Language Modelling

OneRel: Joint Entity and Relation Extraction with One Module in One Step

no code implementations 10 Mar 2022 Yu-Ming Shang, Heyan Huang, Xian-Ling Mao

Joint entity and relation extraction is an essential task in natural language processing and knowledge graph construction.

graph construction Joint Entity and Relation Extraction +1

Unifying Cross-lingual Summarization and Machine Translation with Compression Rate

1 code implementation 15 Oct 2021 Yu Bai, Heyan Huang, Kai Fan, Yang Gao, Yiming Zhu, Jiaao Zhan, Zewen Chi, Boxing Chen

By introducing the compression rate, i.e., the information ratio between the source and the target text, we regard the MT task as a special CLS task with a compression rate of 100%.

Data Augmentation Machine Translation +1
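A toy illustration of the compression-rate idea follows, assuming a simple token-length ratio between target and source; the paper's exact definition may differ.

```python
def compression_rate(source_tokens, target_tokens):
    """Information ratio between the target and the source text (token-length version)."""
    return len(target_tokens) / len(source_tokens)

# Summarization compresses the source ...
print(compression_rate("the quick brown fox jumps over the lazy dog".split(),
                       "fox jumps over dog".split()))                  # ~0.44
# ... while translation keeps roughly all of it, i.e. a rate near 100%
print(compression_rate("machine translation is fun".split(),
                       "maschinelle Übersetzung macht Spaß".split()))  # 1.0
```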

Exploring Dense Retrieval for Dialogue Response Selection

1 code implementation 13 Oct 2021 Tian Lan, Deng Cai, Yan Wang, Yixuan Su, Heyan Huang, Xian-Ling Mao

In this study, we present a solution to directly select proper responses from a large corpus or even a nonparallel corpus that only consists of unpaired sentences, using a dense retrieval model.

Conversational Response Selection Retrieval
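The retrieval step itself can be sketched with plain NumPy as below; the dual encoder that produces the context and response vectors is assumed to exist already, and in practice an approximate-nearest-neighbour library (e.g., FAISS) would replace the brute-force matrix product.

```python
import numpy as np

def build_index(response_vecs):
    """L2-normalize candidate response vectors so inner product equals cosine similarity."""
    return response_vecs / np.linalg.norm(response_vecs, axis=1, keepdims=True)

def select_responses(context_vec, index, responses, k=3):
    """Return the k responses whose dense vectors best match the dialogue context."""
    q = context_vec / np.linalg.norm(context_vec)
    top = np.argsort(-(index @ q))[:k]
    return [responses[i] for i in top]
```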

Cross-Lingual Language Model Meta-Pretraining

no code implementations 23 Sep 2021 Zewen Chi, Heyan Huang, Luyang Liu, Yu Bai, Xian-Ling Mao

The success of pretrained cross-lingual language models relies on two essential abilities, i.e., generalization ability for learning downstream tasks in a source language, and cross-lingual transferability for transferring the task knowledge to other languages.

Cross-Lingual Transfer Language Modelling

Prediction or Comparison: Toward Interpretable Qualitative Reasoning

no code implementations Findings (ACL) 2021 Mucheng Ren, Heyan Huang, Yang Gao

Qualitative relationships illustrate how changing one property (e.g., moving velocity) affects another (e.g., kinetic energy) and constitute a considerable portion of textual knowledge.

Question Answering

Cross-Lingual Abstractive Summarization with Limited Parallel Resources

1 code implementation ACL 2021 Yu Bai, Yang Gao, Heyan Huang

Employing one unified decoder to generate the sequential concatenation of monolingual and cross-lingual summaries, MCLAS makes the monolingual summarization task a prerequisite of the cross-lingual summarization (CLS) task.

Abstractive Text Summarization Cross-Lingual Abstractive Summarization +1
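The unified-decoder idea amounts to training the decoder on one concatenated target; a minimal sketch of building such a target is below, with the separator token and the ordering being illustrative assumptions.

```python
def build_unified_target(mono_summary_ids, cross_summary_ids, sep_id):
    """Concatenate monolingual and cross-lingual summary token ids into one decoder target.

    Sketch only: the monolingual summary is generated first, so decoding the
    cross-lingual part is conditioned on it; the actual special tokens may differ.
    """
    return mono_summary_ids + [sep_id] + cross_summary_ids
```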

MT6: Multilingual Pretrained Text-to-Text Transformer with Translation Pairs

1 code implementation EMNLP 2021 Zewen Chi, Li Dong, Shuming Ma, Shaohan Huang, Xian-Ling Mao, Heyan Huang, Furu Wei

Multilingual T5 (mT5) pretrains a sequence-to-sequence model on massive monolingual texts, which has shown promising results on many cross-lingual tasks.

Abstractive Text Summarization Machine Translation +6

A Robust and Domain-Adaptive Approach for Low-Resource Named Entity Recognition

1 code implementation 2 Jan 2021 Houjin Yu, Xian-Ling Mao, Zewen Chi, Wei Wei, Heyan Huang

Recently, it has attracted much attention to build reliable named entity recognition (NER) systems using limited annotated data.

Ranked #3 on Named Entity Recognition (NER) on SciERC (using extra training data)

Low Resource Named Entity Recognition named-entity-recognition +2

News-Driven Stock Prediction Using Noisy Equity State Representation

no code implementations 1 Jan 2021 Xiao Liu, Heyan Huang, Yue Zhang

News-driven stock prediction investigates the correlation between news events and stock price movements.

Stock Prediction

Self-attention Comparison Module for Boosting Performance on Retrieval-based Open-Domain Dialog Systems

no code implementations 21 Dec 2020 Tian Lan, Xian-Ling Mao, Zhipeng Zhao, Wei Wei, Heyan Huang

Since pre-trained language models are widely used, retrieval-based open-domain dialog systems have recently attracted considerable attention from researchers.

Open-Domain Dialog Retrieval

Ultra-Fast, Low-Storage, Highly Effective Coarse-grained Selection in Retrieval-based Chatbot by Using Deep Semantic Hashing

1 code implementation 17 Dec 2020 Tian Lan, Xian-Ling Mao, Xiaoyan Gao, Wei Wei, Heyan Huang

Specifically, in our proposed DSHC model, a hashing optimizing module that consists of two autoencoder models is stacked on a trained dense representation model, and three loss functions are designed to optimize it.

Chatbot Retrieval
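A minimal sketch of one such autoencoder stacked on top of a trained dense representation model is shown below; the layer sizes are illustrative assumptions and the three DSHC loss functions are not reproduced.

```python
import torch
import torch.nn as nn

class HashingAutoencoder(nn.Module):
    """Compress a dense response vector into k relaxed hash bits and reconstruct it."""

    def __init__(self, dense_dim=768, n_bits=128):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(dense_dim, n_bits), nn.Tanh())
        self.decoder = nn.Linear(n_bits, dense_dim)

    def forward(self, dense_vec):
        relaxed = self.encoder(dense_vec)            # values in (-1, 1) during training
        return self.decoder(relaxed), relaxed

    def hash_code(self, dense_vec):
        return torch.sign(self.encoder(dense_vec))   # binary codes at retrieval time
```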

Deep Cross-modal Hashing via Margin-dynamic-softmax Loss

no code implementations 6 Nov 2020 Rong-Cheng Tu, Xian-Ling Mao, Rongxin Tu, Binbin Bian, Wei Wei, Heyan Huang

Finally, by minimizing the novel margin-dynamic-softmax loss, the modality-specific hashing networks can be trained to generate hash codes that simultaneously preserve cross-modal similarity and abundant semantic information.

Cross-Modal Retrieval Retrieval

Deep Kernel Supervised Hashing for Node Classification in Structural Networks

no code implementations 26 Oct 2020 Jia-Nan Guo, Xian-Ling Mao, Shu-Yang Lin, Wei Wei, Heyan Huang

However, nearly all existing network-embedding-based methods struggle to capture the actual category features of a node because of the linear-inseparability problem in low-dimensional space; meanwhile, they cannot simultaneously incorporate network structure information and node label information into the network embedding.

Classification General Classification +2

STN4DST: A Scalable Dialogue State Tracking based on Slot Tagging Navigation

no code implementations 21 Oct 2020 Puhai Yang, Heyan Huang, Xianling Mao

Scalability for handling unknown slot values is an important problem in dialogue state tracking (DST).

Dialogue State Tracking

Towards Interpretable Reasoning over Paragraph Effects in Situation

1 code implementation EMNLP 2020 Mucheng Ren, Xiubo Geng, Tao Qin, Heyan Huang, Daxin Jiang

We focus on the task of reasoning over paragraph effects in situation, which requires a model to understand the cause and effect described in a background paragraph, and apply the knowledge to a novel situation.

Learning Relation Ties with a Force-Directed Graph in Distant Supervised Relation Extraction

no code implementations 21 Apr 2020 Yuming Shang, Heyan Huang, Xin Sun, Xian-Ling Mao

Then, we borrow the idea of Coulomb's Law from physics and introduce the concepts of attractive and repulsive forces into this graph to learn the correlation and mutual exclusion between relations.

Relation Extraction
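A toy illustration of the attractive/repulsive idea is sketched below as one update step of a force-directed graph over relation embeddings; the inverse-square repulsion is a nod to Coulomb's Law and is not the paper's actual formulation.

```python
import numpy as np

def force_step(pos, attract_pairs, repel_pairs, lr=0.01):
    """One force-directed update: correlated relations attract, exclusive ones repel."""
    forces = np.zeros_like(pos)
    for i, j in attract_pairs:                  # correlation -> attraction
        diff = pos[j] - pos[i]
        forces[i] += diff
        forces[j] -= diff
    for i, j in repel_pairs:                    # mutual exclusion -> Coulomb-like repulsion
        diff = pos[i] - pos[j]
        dist2 = diff @ diff + 1e-8
        forces[i] += diff / dist2
        forces[j] -= diff / dist2
    return pos + lr * forces
```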
