Search Results for author: Bing Qin

Found 76 papers, 24 papers with code

Neural Natural Logic Inference for Interpretable Question Answering

1 code implementation EMNLP 2021 Jihao Shi, Xiao Ding, Li Du, Ting Liu, Bing Qin

Many open-domain question answering problems can be cast as a textual entailment task, where a question and candidate answers are concatenated to form hypotheses.

Natural Language Inference Open-Domain Question Answering

Learning to Rewrite for Non-Autoregressive Neural Machine Translation

1 code implementation EMNLP 2021 Xinwei Geng, Xiaocheng Feng, Bing Qin

To keep the data distribution consistent under iterative decoding, an iterative training strategy is employed to further improve the rewriting capacity.

Machine Translation Translation

Less Is More: Domain Adaptation with Lottery Ticket for Reading Comprehension

1 code implementation Findings (EMNLP) 2021 Haichao Zhu, Zekun Wang, Heng Zhang, Ming Liu, Sendong Zhao, Bing Qin

Then, we only fine-tune the lottery subnetwork, a small fraction of the whole parameters, on the annotated target domain data for adaptation.

Domain Adaptation Reading Comprehension

Distilled Dual-Encoder Model for Vision-Language Understanding

no code implementations 16 Dec 2021 Zekun Wang, Wenhui Wang, Haichao Zhu, Ming Liu, Bing Qin, Furu Wei

We propose a cross-modal attention distillation framework to train a dual-encoder model for vision-language understanding tasks, such as visual reasoning and visual question answering.

Question Answering Visual Entailment +2

GEDIT: Geographic-Enhanced and Dependency-Guided Tagging for Joint POI and Accessibility Extraction at Baidu Maps

no code implementations 20 Aug 2021 Yibo Sun, Jizhou Huang, Chunyuan Yuan, Miao Fan, Haifeng Wang, Ming Liu, Bing Qin

We approach this task as a sequence tagging problem, where the goal is to produce <POI name, accessibility label> pairs from unstructured text.

Graph Convolutional Network
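The sequence-tagging formulation above ends by decoding predicted tags into <POI name, accessibility label> pairs. A minimal illustrative sketch of that decoding step, assuming a BIO-style scheme with hypothetical `POI` and `ACC` span types (not the paper's actual tag schema):

```python
def decode_pairs(tokens, tags):
    """Collect contiguous BIO spans, then pair each POI span with the
    accessibility (ACC) span that follows it."""
    spans = []
    cur_type, cur_toks = None, []
    for tok, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if cur_type:
                spans.append((cur_type, " ".join(cur_toks)))
            cur_type, cur_toks = tag[2:], [tok]
        elif tag.startswith("I-") and cur_type == tag[2:]:
            cur_toks.append(tok)
        else:  # "O" or an inconsistent I- tag closes the current span
            if cur_type:
                spans.append((cur_type, " ".join(cur_toks)))
            cur_type, cur_toks = None, []
    if cur_type:
        spans.append((cur_type, " ".join(cur_toks)))

    pairs, poi = [], None
    for span_type, text in spans:
        if span_type == "POI":
            poi = text
        elif span_type == "ACC" and poi is not None:
            pairs.append((poi, text))
    return pairs

tokens = ["Sunny", "Cafe", "has", "closed", "down"]
tags = ["B-POI", "I-POI", "O", "B-ACC", "I-ACC"]
print(decode_pairs(tokens, tags))  # [('Sunny Cafe', 'closed down')]
```

The example sentence and pairing heuristic are assumptions for illustration only; the paper's tagger is additionally geographic-enhanced and dependency-guided.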

Learning Event Graph Knowledge for Abductive Reasoning

1 code implementation ACL 2021 Li Du, Xiao Ding, Ting Liu, Bing Qin

Abductive reasoning aims at inferring the most plausible explanation for observed events, which would play critical roles in various NLP applications, such as reading comprehension and question answering.

Question Answering Reading Comprehension

ExCAR: Event Graph Knowledge Enhanced Explainable Causal Reasoning

no code implementations ACL 2021 Li Du, Xiao Ding, Kai Xiong, Ting Liu, Bing Qin

ExCAR first acquires additional evidence information from a large-scale causal event graph as logical rules for causal reasoning.

Representation Learning

CausalBERT: Injecting Causal Knowledge Into Pre-trained Models with Minimal Supervision

no code implementations 21 Jul 2021 Zhongyang Li, Xiao Ding, Kuo Liao, Bing Qin, Ting Liu

Recent work has shown success in incorporating pre-trained models like BERT to improve NLP systems.

Causal Inference

A Survey on Dialogue Summarization: Recent Advances and New Frontiers

no code implementations 7 Jul 2021 Xiachong Feng, Xiaocheng Feng, Bing Qin

We hope that this first survey of dialogue summarization can provide the community with quick access to and a general picture of this task, and motivate future research.

Text Generation

Language Model as an Annotator: Exploring DialoGPT for Dialogue Summarization

1 code implementation ACL 2021 Xiachong Feng, Xiaocheng Feng, Libo Qin, Bing Qin, Ting Liu

Current dialogue summarization systems usually encode the text with a number of general semantic features (e.g., keywords and topics) to gain more powerful dialogue modeling capabilities.

Conversational Response Generation Language Modelling

The Factual Inconsistency Problem in Abstractive Text Summarization: A Survey

1 code implementation 30 Apr 2021 Yichong Huang, Xiachong Feng, Xiaocheng Feng, Bing Qin

Recently, various neural encoder-decoder models pioneered by Seq2Seq framework have been proposed to achieve the goal of generating more abstractive summaries by learning to map input text to output text.

Abstractive Text Summarization

DADgraph: A Discourse-aware Dialogue Graph Neural Network for Multiparty Dialogue Machine Reading Comprehension

no code implementations 26 Apr 2021 Jiaqi Li, Ming Liu, Zihao Zheng, Heng Zhang, Bing Qin, Min-Yen Kan, Ting Liu

Multiparty Dialogue Machine Reading Comprehension (MRC) differs from traditional MRC as models must handle the complex dialogue discourse structure, previously unconsidered in traditional MRC.

Machine Reading Comprehension

Learning to Share by Masking the Non-shared for Multi-domain Sentiment Classification

no code implementations 17 Apr 2021 Jianhua Yuan, Yanyan Zhao, Bing Qin, Ting Liu

To this end, we propose the BertMasker network which explicitly masks domain-related words from texts, learns domain-invariant sentiment features from these domain-agnostic texts, and uses those masked words to form domain-aware sentence representations.

General Classification Multi-Domain Sentiment Classification +1

Dialogue Discourse-Aware Graph Model and Data Augmentation for Meeting Summarization

1 code implementation 7 Dec 2020 Xiachong Feng, Xiaocheng Feng, Bing Qin, Xinwei Geng

First, we present a Dialogue Discourse-Aware Meeting Summarizer (DDAMS) to explicitly model the interaction between utterances in a meeting by modeling different discourse relations.

Data Augmentation Meeting Summarization

Biomedical Knowledge Graph Refinement with Embedding and Logic Rules

no code implementations 2 Dec 2020 Sendong Zhao, Bing Qin, Ting Liu, Fei Wang

This paper proposes a method BioGRER to improve the BioKG's quality, which comprehensively combines the knowledge graph embedding and logic rules that support and negate triplets in the BioKG.

Knowledge Graph Embedding Knowledge Graphs

TableGPT: Few-shot Table-to-Text Generation with Table Structure Reconstruction and Content Matching

no code implementations COLING 2020 Heng Gong, Yawei Sun, Xiaocheng Feng, Bing Qin, Wei Bi, Xiaojiang Liu, Ting Liu

Although neural table-to-text models have achieved remarkable progress with the help of large-scale datasets, they suffer from an insufficient-learning problem when training data is limited.

Few-Shot Learning Language Modelling +2

An Iterative Emotion Interaction Network for Emotion Recognition in Conversations

no code implementations COLING 2020 Xin Lu, Yanyan Zhao, Yang Wu, Yijian Tian, Huipeng Chen, Bing Qin

We noticed that the gold emotion labels of the context utterances can provide explicit and accurate emotion interaction, but it is impossible to input gold labels at inference time.

Emotion Recognition in Conversation

Incorporating Commonsense Knowledge into Abstractive Dialogue Summarization via Heterogeneous Graph Networks

1 code implementation CCL 2021 Xiachong Feng, Xiaocheng Feng, Bing Qin, Ting Liu

In detail, we consider utterances and commonsense knowledge as two different types of data and design a Dialogue Heterogeneous Graph Network (D-HGN) for modeling both types of information.

Abstractive Dialogue Summarization dialogue summary +1

How Does Selective Mechanism Improve Self-Attention Networks?

1 code implementation ACL 2020 Xinwei Geng, Long-Yue Wang, Xing Wang, Bing Qin, Ting Liu, Zhaopeng Tu

Self-attention networks (SANs) with a selective mechanism have produced substantial improvements in various NLP tasks by concentrating on a subset of input words.

Machine Translation Natural Language Inference +1

Revisiting Pre-Trained Models for Chinese Natural Language Processing

6 code implementations Findings of the Association for Computational Linguistics 2020 Yiming Cui, Wanxiang Che, Ting Liu, Bing Qin, Shijin Wang, Guoping Hu

Bidirectional Encoder Representations from Transformers (BERT) has shown marvelous improvements across various NLP tasks, and consecutive variants have been proposed to further improve the performance of the pre-trained language models.

Language Modelling

Learning to Select Bi-Aspect Information for Document-Scale Text Content Manipulation

1 code implementation 24 Feb 2020 Xiaocheng Feng, Yawei Sun, Bing Qin, Heng Gong, Yibo Sun, Wei Bi, Xiaojiang Liu, Ting Liu

In this paper, we focus on a new practical task, document-scale text content manipulation, which is the opposite of text style transfer and aims to preserve text styles while altering the content.

Style Transfer Text Style Transfer +1

An Annotation Scheme of A Large-scale Multi-party Dialogues Dataset for Discourse Parsing and Machine Comprehension

no code implementations 8 Nov 2019 Jiaqi Li, Ming Liu, Bing Qin, Zihao Zheng, Ting Liu

In this paper, we propose a scheme for annotating large-scale multi-party chat dialogues for discourse parsing and machine comprehension.

Discourse Parsing Machine Reading Comprehension

Transforming Wikipedia into Augmented Data for Query-Focused Summarization

no code implementations 8 Nov 2019 Haichao Zhu, Li Dong, Furu Wei, Bing Qin, Ting Liu

The manual construction of a query-focused summarization corpus is costly and time-consuming.

Data Augmentation

Multi-Input Multi-Output Sequence Labeling for Joint Extraction of Fact and Condition Tuples from Scientific Text

no code implementations IJCNLP 2019 Tianwen Jiang, Tong Zhao, Bing Qin, Ting Liu, Nitesh Chawla, Meng Jiang

In this work, we propose a new sequence labeling framework (as well as a new tag schema) to jointly extract the fact and condition tuples from statement sentences.

TAG

Neural Semantic Parsing in Low-Resource Settings with Back-Translation and Meta-Learning

no code implementations 12 Sep 2019 Yibo Sun, Duyu Tang, Nan Duan, Yeyun Gong, Xiaocheng Feng, Bing Qin, Daxin Jiang

Neural semantic parsing has achieved impressive results in recent years, yet its success relies on the availability of large amounts of supervised data.

Meta-Learning Semantic Parsing +1

Table-to-Text Generation with Effective Hierarchical Encoder on Three Dimensions (Row, Column and Time)

1 code implementation IJCNLP 2019 Heng Gong, Xiaocheng Feng, Bing Qin, Ting Liu

To address the aforementioned problems, we not only model each table cell by considering other records in the same row, but also enrich the table's representation by modeling each cell in the context of other cells in the same column or of historical (time-dimension) data.

Table-to-Text Generation Time Series

Pre-Training with Whole Word Masking for Chinese BERT

2 code implementations 19 Jun 2019 Yiming Cui, Wanxiang Che, Ting Liu, Bing Qin, Ziqing Yang

To demonstrate the effectiveness of these models, we create a series of Chinese pre-trained language models as our baselines, including BERT, RoBERTa, ELECTRA, RBT, etc.

Document Classification General Classification +5

Learning to Ask Unanswerable Questions for Machine Reading Comprehension

no code implementations ACL 2019 Haichao Zhu, Li Dong, Furu Wei, Wenhui Wang, Bing Qin, Ting Liu

We also present a way to construct training data for our question generation models by leveraging the existing reading comprehension dataset.

Data Augmentation Machine Reading Comprehension +1

Attribute Acquisition in Ontology based on Representation Learning of Hierarchical Classes and Attributes

no code implementations 8 Mar 2019 Tianwen Jiang, Ming Liu, Bing Qin, Ting Liu

This paper investigates an attention-based automatic paradigm called TransATT for attribute acquisition, by learning the representation of hierarchical classes and attributes in Chinese ontology.

Representation Learning

Learning to Refine Source Representations for Neural Machine Translation

no code implementations 26 Dec 2018 Xinwei Geng, Long-Yue Wang, Xing Wang, Bing Qin, Ting Liu, Zhaopeng Tu

Neural machine translation (NMT) models generally adopt an encoder-decoder architecture for modeling the entire translation process.

Machine Translation Translation

An AMR Aligner Tuned by Transition-based Parser

1 code implementation EMNLP 2018 Yijia Liu, Wanxiang Che, Bo Zheng, Bing Qin, Ting Liu

In this paper, we propose a new rich resource enhanced AMR aligner which produces multiple alignments and a new transition system for AMR parsing along with its oracle parser.

AMR Parsing POS

Adaptive Multi-pass Decoder for Neural Machine Translation

no code implementations EMNLP 2018 Xinwei Geng, Xiaocheng Feng, Bing Qin, Ting Liu

Although end-to-end neural machine translation (NMT) has achieved remarkable progress in recent years, the idea of adopting a multi-pass decoding mechanism in conventional NMT is not well explored.

Machine Translation Translation

Knowledge Based Machine Reading Comprehension

no code implementations 12 Sep 2018 Yibo Sun, Daya Guo, Duyu Tang, Nan Duan, Zhao Yan, Xiaocheng Feng, Bing Qin

Machine reading comprehension (MRC) requires reasoning about both the knowledge involved in a document and knowledge about the world.

Machine Reading Comprehension Question Answering +1

Truth Discovery with Memory Network

no code implementations 7 Nov 2016 Luyang Li, Bing Qin, Wenjing Ren, Ting Liu

We use a feedforward memory network and a feedback memory network to learn representations of the credibility of statements about the same object.

Aspect Level Sentiment Classification with Deep Memory Network

7 code implementations EMNLP 2016 Duyu Tang, Bing Qin, Ting Liu

Such importance degree and text representation are calculated with multiple computational layers, each of which is a neural attention model over an external memory.

Aspect-Based Sentiment Analysis General Classification
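The multi-layer attention-over-memory computation described in the snippet above can be illustrated with a minimal NumPy sketch. The scoring function, dimensions, weights, and hop count here are assumptions for illustration, not the paper's actual implementation:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_layer(memory, aspect, w_att, b_att):
    """One hop: score each external-memory slot against the aspect vector
    with a linear layer, then return the attention-weighted sum of slots."""
    n, _ = memory.shape
    # concatenate each memory slot with the aspect vector before scoring
    feats = np.concatenate([memory, np.tile(aspect, (n, 1))], axis=1)  # (n, 2d)
    scores = feats @ w_att + b_att                                     # (n,)
    alpha = softmax(scores)                                            # attention weights
    return alpha @ memory                                              # (d,)

rng = np.random.default_rng(0)
d, n, hops = 4, 6, 3
memory = rng.normal(size=(n, d))   # context word embeddings = external memory
vec = rng.normal(size=d)           # initial aspect embedding
w_att = rng.normal(size=(2 * d,))  # hypothetical attention weights
b_att = 0.0
w_lin = rng.normal(size=(d, d))    # linear transform between hops

# multiple computational layers (hops), each an attention model over the memory
for _ in range(hops):
    vec = attention_layer(memory, vec, w_att, b_att) + vec @ w_lin

print(vec.shape)  # (4,)
```

In the actual model these weights are learned end to end and the final vector feeds a softmax classifier over sentiment labels; the sketch only shows the shape of the attention-and-hop computation.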

A Planning based Framework for Essay Generation

no code implementations 18 Dec 2015 Bing Qin, Duyu Tang, Xinwei Geng, Dandan Ning, Jiahao Liu, Ting Liu

Generating an article automatically with a computer program is a challenging task in artificial intelligence and natural language processing.

Emotion Analysis Platform on Chinese Microblog

no code implementations 28 Mar 2014 Duyu Tang, Bing Qin, Ting Liu, Qiuhui Shi

To analyze emotional changes across time and space, this paper presents an Emotion Analysis Platform (EAP) that explores the emotional distribution of each province, so that the emotional pulse of each province in China can be monitored.

Emotion Recognition
