2 code implementations • EMNLP 2021 • Chengyu Wang, Jianing Wang, Minghui Qiu, Jun Huang, Ming Gao
Based on continuous prompt embeddings, we propose TransPrompt, a transferable prompting framework for few-shot learning across similar tasks.
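A minimal sketch of the general idea behind continuous prompt embeddings (illustrative only, not TransPrompt's actual implementation; the class name and sizes below are hypothetical): a small set of trainable vectors is prepended to the token embeddings and tuned on the few-shot task.

```python
import torch
import torch.nn as nn

class ContinuousPrompt(nn.Module):
    """Prepend trainable prompt vectors to a PLM's token embeddings.

    Illustrative sketch only: num_prompts and the backbone interface
    are hypothetical, not TransPrompt's actual code.
    """
    def __init__(self, num_prompts: int, hidden_size: int):
        super().__init__()
        # Small random init, as is common for embedding parameters.
        self.prompts = nn.Parameter(torch.randn(num_prompts, hidden_size) * 0.02)

    def forward(self, token_embeds: torch.Tensor) -> torch.Tensor:
        # token_embeds: (batch, seq_len, hidden) from the PLM embedding layer.
        batch = token_embeds.size(0)
        prompts = self.prompts.unsqueeze(0).expand(batch, -1, -1)
        # Output: (batch, num_prompts + seq_len, hidden), fed to the encoder.
        return torch.cat([prompts, token_embeds], dim=1)
```

Because the prompts live in embedding space rather than in the vocabulary, they can plausibly be shared or transferred across similar tasks, which is the transferability the entry refers to.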
no code implementations • Findings (EMNLP) 2021 • Zhi Li, Yuchen Zhai, Chengyu Wang, Minghui Qiu, Kailiang Li, Yin Zhang
Inspired by the fact that words with similar semantics can share part of their weights, we divide the embedding of each word into two parts: a unique embedding and a class embedding.
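A minimal sketch of such a two-part embedding (assuming a precomputed word-to-class mapping, e.g. from clustering; all names and dimensions are illustrative, not the paper's code):

```python
import torch
import torch.nn as nn

class TwoPartEmbedding(nn.Module):
    """Word embedding split into a per-word 'unique' part and a shared
    'class' part, so semantically similar words share class weights.
    """
    def __init__(self, vocab_size: int, num_classes: int,
                 unique_dim: int, class_dim: int, word_to_class: torch.Tensor):
        super().__init__()
        self.unique = nn.Embedding(vocab_size, unique_dim)
        self.classes = nn.Embedding(num_classes, class_dim)
        # word_to_class: (vocab_size,) long tensor mapping word id -> class id.
        self.register_buffer("word_to_class", word_to_class)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        u = self.unique(token_ids)                       # word-specific part
        c = self.classes(self.word_to_class[token_ids])  # shared class part
        return torch.cat([u, c], dim=-1)                 # (..., unique_dim + class_dim)
```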
no code implementations • EMNLP 2021 • Chengyu Wang, Haojie Pan, Minghui Qiu, Jun Huang, Fei Yang, Yin Zhang
For tasks related to distant domains with different class label sets, PLMs may memorize non-transferable knowledge for the target domain and suffer from negative transfer.
no code implementations • 5 Dec 2022 • Mingyuan Fan, Cen Chen, Chengyu Wang, Wenmeng Zhou, Jun Huang, Ximeng Liu, Wenzhong Guo
To craft robust data, Refiner promotes the gradients of critical parameters associated with robust data to be close to the ground-truth ones, while leaving the gradients of trivial parameters to safeguard privacy.
no code implementations • 31 Oct 2022 • Zihao Tang, Xinyi Wang, Lihaowen Zhu, Mariano Cabezas, Dongnan Liu, Michael Barnett, Weidong Cai, Chengyu Wang
Diffusion Weighted Imaging (DWI) is an advanced imaging technique commonly used in neuroscience and neurological clinical research through a Diffusion Tensor Imaging (DTI) model.
no code implementations • 17 Oct 2022 • Jianing Wang, Chengcheng Han, Chengyu Wang, Chuanqi Tan, Minghui Qiu, Songfang Huang, Jun Huang, Ming Gao
Few-shot Named Entity Recognition (NER) aims to identify named entities with very little annotated data.
1 code implementation • 11 Oct 2022 • Taolin Zhang, Junwei Dong, Jianing Wang, Chengyu Wang, Ang Wang, Yinghui Liu, Jun Huang, Yong Li, Xiaofeng He
Recently, knowledge-enhanced pre-trained language models (KEPLMs) improve context-aware representations via learning from structured relations in knowledge graphs, and/or linguistic knowledge from syntactic or dependency analysis.
1 code implementation • 27 May 2022 • Tingting Liu, Chengyu Wang, Cen Chen, Ming Gao, Aoying Zhou
With top-$k$ sparse attention, the most crucial attention relations can be obtained at a lower computational cost.
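A generic sketch of top-$k$ sparse attention as described (assuming PyTorch; this is the standard top-$k$ masking trick, not necessarily the paper's exact formulation):

```python
import torch
import torch.nn.functional as F

def topk_sparse_attention(q, k, v, top_k: int):
    """Keep only the top-k attention scores per query and mask the rest.

    q, k, v: (batch, heads, seq_len, head_dim). Generic sketch of the
    top-k masking trick, not necessarily the paper's exact formulation.
    """
    scores = q @ k.transpose(-2, -1) / (q.size(-1) ** 0.5)  # (b, h, L, L)
    # Threshold at the k-th largest score per query row; everything
    # below it is masked out before the softmax.
    kth = scores.topk(top_k, dim=-1).values[..., -1:]
    scores = scores.masked_fill(scores < kth, float("-inf"))
    return F.softmax(scores, dim=-1) @ v
```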
1 code implementation • 11 May 2022 • Jianing Wang, Chengyu Wang, Fuli Luo, Chuanqi Tan, Minghui Qiu, Fei Yang, Qiuhui Shi, Songfang Huang, Ming Gao
Prompt-based fine-tuning has boosted the performance of Pre-trained Language Models (PLMs) on few-shot text classification by employing task-specific prompts.
1 code implementation • 6 May 2022 • Jianing Wang, Chengyu Wang, Minghui Qiu, Qiuhui Shi, Hongbin Wang, Jun Huang, Ming Gao
Extractive Question Answering (EQA) is one of the most important tasks in Machine Reading Comprehension (MRC), which can be solved by fine-tuning the span-selection heads of Pre-trained Language Models (PLMs).
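For reference, the span-selection head mentioned here is conventionally a single linear layer producing per-token start and end logits; a minimal sketch of that standard head (common practice, not this paper's contribution):

```python
import torch
import torch.nn as nn

class SpanSelectionHead(nn.Module):
    """Project each token's hidden state to start/end logits, the usual
    head placed on top of a PLM for extractive QA (illustrative sketch).
    """
    def __init__(self, hidden_size: int):
        super().__init__()
        self.qa_outputs = nn.Linear(hidden_size, 2)

    def forward(self, hidden_states: torch.Tensor):
        # hidden_states: (batch, seq_len, hidden) from the PLM encoder.
        logits = self.qa_outputs(hidden_states)            # (batch, seq_len, 2)
        start_logits, end_logits = logits.split(1, dim=-1)
        # At inference, the answer span is read off the argmax start/end pair.
        return start_logits.squeeze(-1), end_logits.squeeze(-1)
```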
1 code implementation • 30 Apr 2022 • Chengyu Wang, Minghui Qiu, Taolin Zhang, Tingting Liu, Lei Li, Jianing Wang, Ming Wang, Jun Huang, Wei Lin
The success of Pre-Trained Models (PTMs) has reshaped the development of Natural Language Processing (NLP).
1 code implementation • 1 Apr 2022 • Ziyun Xu, Chengyu Wang, Minghui Qiu, Fuli Luo, Runxin Xu, Songfang Huang, Jun Huang
Pre-trained Language Models (PLMs) have achieved remarkable performance on various language understanding tasks in IR systems, which require fine-tuning on labeled training data.
1 code implementation • Findings (ACL) 2022 • Dongyang Li, Taolin Zhang, Nan Hu, Chengyu Wang, Xiaofeng He
In this paper, we propose a hierarchical contrastive learning Framework for Distantly Supervised relation extraction (HiCLRE) to reduce noisy sentences, which integrates global structural information and local fine-grained interactions.
2 code implementations • 14 Dec 2021 • Runxin Xu, Fuli Luo, Chengyu Wang, Baobao Chang, Jun Huang, Songfang Huang, Fei Huang
Unified in contrastive learning, CAP enables the pruned model to learn task-agnostic knowledge from the pre-trained model and task-specific knowledge from the fine-tuned model.
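A rough sketch of how a pruned model might learn contrastively from both teachers at once (an InfoNCE-style formulation assumed purely for illustration; not CAP's exact loss):

```python
import torch
import torch.nn.functional as F

def dual_teacher_contrastive_loss(student, pretrained_t, finetuned_t, temp=0.07):
    """Pull each student representation toward both teachers' views of the
    same example, against in-batch negatives (InfoNCE style).

    All three inputs are (batch, dim) sentence representations; this is a
    hypothetical formulation for illustration, not CAP's exact loss.
    """
    def info_nce(anchor, positive):
        a = F.normalize(anchor, dim=-1)
        p = F.normalize(positive, dim=-1)
        logits = a @ p.t() / temp                          # (batch, batch) similarities
        labels = torch.arange(a.size(0), device=a.device)  # diagonal = positives
        return F.cross_entropy(logits, labels)

    # Task-agnostic signal from the pre-trained teacher plus task-specific
    # signal from the fine-tuned teacher, as described above.
    return info_nce(student, pretrained_t) + info_nce(student, finetuned_t)
```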
1 code implementation • 2 Dec 2021 • Taolin Zhang, Chengyu Wang, Nan Hu, Minghui Qiu, Chengguang Tang, Xiaofeng He, Jun Huang
Knowledge-Enhanced Pre-trained Language Models (KEPLMs) are pre-trained models with relation triples injected from knowledge graphs to improve language understanding abilities.
no code implementations • 16 Nov 2021 • Jing Shao, Siyu Chen, Yangguang Li, Kun Wang, Zhenfei Yin, Yinan He, Jianing Teng, Qinghong Sun, Mengya Gao, Jihao Liu, Gengshi Huang, Guanglu Song, Yichao Wu, Yuming Huang, Fenggang Liu, Huan Peng, Shuo Qin, Chengyu Wang, Yujie Wang, Conghui He, Ding Liang, Yu Liu, Fengwei Yu, Junjie Yan, Dahua Lin, Xiaogang Wang, Yu Qiao
Enormous waves of technological innovations over the past several years, marked by the advances in AI technologies, are profoundly reshaping the industry and the society.
1 code implementation • 5 Nov 2021 • Chengyu Wang, Minghao Hu, Yuzuru Takashima, Timothy J. Schulz, David J. Brady
We use convolutional neural networks to recover images optically down-sampled by $6.7\times$ using coherent aperture synthesis over a 16-camera array.
no code implementations • 29 Oct 2021 • Guanglin Niu, Yang Li, Chengguang Tang, Zhongkai Hu, Shibin Yang, Peng Li, Chengyu Wang, Hao Wang, Jian Sun
The multi-relational Knowledge Base Question Answering (KBQA) system performs multi-hop reasoning over the knowledge graph (KG) to derive the answer.
2 code implementations • ACL 2021 • Taolin Zhang, Zerui Cai, Chengyu Wang, Minghui Qiu, Bite Yang, Xiaofeng He
Recently, the performance of Pre-trained Language Models (PLMs) has been significantly improved by injecting knowledge facts to enhance their abilities of language understanding.
1 code implementation • 9 Dec 2020 • David J. Brady, Timothy J. Schulz, Chengyu Wang
Phase-sensitive sensor planes using such devices could eliminate the need both for lenses and reference signals, creating a path to large aperture diffraction limited laser imaging.
1 code implementation • ACL 2021 • Haojie Pan, Chengyu Wang, Minghui Qiu, Yichang Zhang, Yaliang Li, Jun Huang
However, the large model sizes, together with the long inference time, limit the deployment of such models in real-time applications.
no code implementations • 25 Nov 2020 • Haojie Pan, Cen Chen, Chengyu Wang, Minghui Qiu, Liu Yang, Feng Ji, Jun Huang
More specifically, we propose a reinforced selector to extract useful PRF terms to enhance response candidates and a BERT-based response ranker to rank the PRF-enhanced responses.
2 code implementations • 18 Nov 2020 • Minghui Qiu, Peng Li, Chengyu Wang, Hanjie Pan, Ang Wang, Cen Chen, Xianyan Jia, Yaliang Li, Jun Huang, Deng Cai, Wei Lin
The literature has witnessed the success of applying Pre-trained Language Models (PLMs) and Transfer Learning (TL) algorithms to a wide range of Natural Language Processing (NLP) applications, yet it is not easy to build an easy-to-use and scalable TL toolkit for this purpose.
no code implementations • 28 Oct 2020 • Yiwu Yao, Yuchao Li, Chengyu Wang, Tianhang Yu, Houjiang Chen, Xiaotang Jiang, Jun Yang, Jun Huang, Wei Lin, Hui Shu, Chengfei Lv
The intensive computation of Automatic Speech Recognition (ASR) models obstructs them from being deployed on mobile devices.
no code implementations • 14 Sep 2020 • Chengyu Wang, Mengli Cheng, Xu Hu, Jun Huang
We present EasyASR, a distributed machine learning platform for training and serving large-scale Automatic Speech Recognition (ASR) models, as well as collecting and processing audio data at scale.
1 code implementation • Findings (ACL) 2021 • Taolin Zhang, Chengyu Wang, Minghui Qiu, Bite Yang, Xiaofeng He, Jun Huang
In this paper, we introduce a multi-target MRC task for the medical domain: predicting answers to medical questions and the corresponding support sentences from medical information sources simultaneously, in order to ensure the high reliability of medical knowledge services.
no code implementations • 4 Aug 2020 • Mengli Cheng, Chengyu Wang, Xu Hu, Jun Huang, Xiaobo Wang
Building Automatic Speech Recognition (ASR) systems from scratch is significantly challenging, mostly due to the time-consuming and financially expensive process of annotating a large amount of audio data with transcripts.
no code implementations • ACL 2020 • Chengyu Wang, Xiaofeng He
The hypernymy detection task has been addressed under various frameworks.
2 code implementations • EMNLP 2020 • Chengyu Wang, Minghui Qiu, Jun Huang, Xiaofeng He
In this paper, we propose an effective learning procedure named Meta Fine-Tuning (MFT), which serves as a meta-learner to solve a group of similar NLP tasks for neural language models.
no code implementations • 25 Feb 2020 • Chengyu Wang, Minghui Qiu, Jun Huang, Xiaofeng He
We further combine a meta-learning process over the auxiliary task distribution and supervised learning to train the neural lexical relation classifier.
no code implementations • 1 Feb 2020 • Xiaojun Cai, Huizhu Song, Zheng Jiao, Hang Yang, Min Zhu, Chengyu Wang, Dong Wei, Lingzhi Shi, Bo Wu, Jinyu Chen
Given the nonlinear kinetics of tacrolimus and its large variability, a population pharmacokinetic model should be combined with therapeutic drug monitoring to optimize individualized therapy.
no code implementations • ACL 2019 • Chengyu Wang, Xiaofeng He, Aoying Zhou
Lexical relations describe how meanings of terms relate to each other.
2 code implementations • CVPR 2019 • Junting Pan, Chengyu Wang, Xu Jia, Jing Shao, Lu Sheng, Junjie Yan, Xiaogang Wang
This paper proposes the novel task of video generation conditioned on a single semantic label map, which provides a good balance between flexibility and quality in the generation process.
no code implementations • COLING 2018 • Yan Fan, Chengyu Wang, Xiaofeng He
The goal is to learn a classifier on pre-defined relations and discover new relations expressed in texts.
no code implementations • EMNLP 2017 • Chengyu Wang, Yan Fan, Xiaofeng He, Aoying Zhou
User generated categories (UGCs) are short texts that reflect how people describe and organize entities, expressing rich semantic relations implicitly.
no code implementations • EMNLP 2017 • Chengyu Wang, Xiaofeng He, Aoying Zhou
A taxonomy is a semantic hierarchy, consisting of concepts linked by is-a relations.
no code implementations • ACL 2017 • Chengyu Wang, Junchi Yan, Aoying Zhou, Xiaofeng He
Finding the correct hypernyms for entities is essential for taxonomy learning, fine-grained entity categorization, query understanding, etc.
no code implementations • COLING 2016 • Chengyu Wang, Xiaofeng He
Hypernym-hyponym ("is-a") relations are key components in taxonomies, object hierarchies and knowledge graphs.