Search Results for author: Shen Gao

Found 30 papers, 15 papers with code

Confucius: Iterative Tool Learning from Introspection Feedback by Easy-to-Difficult Curriculum

no code implementations 27 Aug 2023 Shen Gao, Zhengliang Shi, Minghang Zhu, Bowen Fang, Xin Xin, Pengjie Ren, Zhumin Chen, Jun Ma

Augmenting large language models (LLMs) with external tools has emerged as a promising approach to extending the capabilities of LLMs.

UMSE: Unified Multi-scenario Summarization Evaluation

1 code implementation 26 May 2023 Shen Gao, Zhitao Yao, Chongyang Tao, Xiuying Chen, Pengjie Ren, Zhaochun Ren, Zhumin Chen

Experimental results across three typical scenarios on the benchmark dataset SummEval indicate that our UMSE achieves performance comparable to several strong existing methods that are specifically designed for each scenario.

Text Summarization

A Topic-aware Summarization Framework with Different Modal Side Information

no code implementations 19 May 2023 Xiuying Chen, Mingzhe Li, Shen Gao, Xin Cheng, Qiang Yang, Qishen Zhang, Xin Gao, Xiangliang Zhang

To address these two challenges, we first propose a unified topic encoder, which jointly discovers latent topics from the document and various kinds of side information.

Contrastive Learning

Follow the Timeline! Generating Abstractive and Extractive Timeline Summary in Chronological Order

1 code implementation 2 Jan 2023 Xiuying Chen, Mingzhe Li, Shen Gao, Zhangming Chan, Dongyan Zhao, Xin Gao, Xiangliang Zhang, Rui Yan

Nowadays, time-stamped web documents related to a general news query flood the Internet, and timeline summarization aims to concisely summarize the evolution trajectory of events along the timeline.

Document Summarization · Timeline Summarization +1

Contrastive Learning Reduces Hallucination in Conversations

1 code implementation 20 Dec 2022 Weiwei Sun, Zhengliang Shi, Shen Gao, Pengjie Ren, Maarten de Rijke, Zhaochun Ren

MixCL effectively reduces the hallucination of LMs in conversations and achieves the highest performance among LM-based dialogue agents in terms of relevancy and factuality.
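The contrastive idea behind this line of work can be illustrated with a generic InfoNCE-style objective. This is a hedged sketch of that family of losses, not the paper's exact MixCL formulation; the `info_nce` helper and all vectors below are illustrative assumptions:

```python
import numpy as np

def info_nce(anchor, positive, negatives, tau=0.1):
    """Generic InfoNCE-style contrastive loss: pull the anchor
    representation toward the grounded (positive) span and push it
    away from hallucinated (negative) spans."""
    def cos(a, b):
        return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

    logits = np.array([cos(anchor, positive)] +
                      [cos(anchor, n) for n in negatives]) / tau
    logits -= logits.max()                      # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum()
    return -np.log(probs[0])                    # low when anchor ~ positive

# Toy vectors: the anchor is close to the grounded span.
anchor = np.array([1.0, 0.1, 0.0])
grounded = np.array([0.9, 0.2, 0.0])
hallucinated = [np.array([0.0, 1.0, 0.0]), np.array([0.0, 0.0, 1.0])]
loss = info_nce(anchor, grounded, hallucinated)
```

Minimizing such a loss makes grounded continuations more likely than hallucinated ones, which is the general mechanism behind contrastive de-hallucination objectives.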

Contrastive Learning

Scientific Paper Extractive Summarization Enhanced by Citation Graphs

no code implementations 8 Dec 2022 Xiuying Chen, Mingzhe Li, Shen Gao, Rui Yan, Xin Gao, Xiangliang Zhang

We first propose a Multi-granularity Unsupervised Summarization model (MUS) as a simple and low-cost solution to the task.

Extractive Summarization · Link Prediction

Neural Machine Translation with Contrastive Translation Memories

1 code implementation 6 Dec 2022 Xin Cheng, Shen Gao, Lemao Liu, Dongyan Zhao, Rui Yan

Retrieval-augmented Neural Machine Translation models have been successful in many translation scenarios.

Contrastive Learning · Machine Translation +3

Target-aware Abstractive Related Work Generation with Contrastive Learning

1 code implementation 26 May 2022 Xiuying Chen, Hind Alamro, Mingzhe Li, Shen Gao, Rui Yan, Xin Gao, Xiangliang Zhang

The related work section is an important component of a scientific paper, which highlights the contribution of the target paper in the context of the reference papers.

Contrastive Learning · TAG

Capturing Relations between Scientific Papers: An Abstractive Model for Related Work Section Generation

1 code implementation ACL 2021 Xiuying Chen, Hind Alamro, Mingzhe Li, Shen Gao, Xiangliang Zhang, Dongyan Zhao, Rui Yan

Hence, in this paper, we propose a Relation-aware Related work Generator (RRG), which generates an abstractive related work from the given multiple scientific papers in the same research area.

Learning to Organize a Bag of Words into Sentences with Neural Networks: An Empirical Study

no code implementations NAACL 2021 Chongyang Tao, Shen Gao, Juntao Li, Yansong Feng, Dongyan Zhao, Rui Yan

Sequential information, a. k. a., orders, is assumed to be essential for processing a sequence with recurrent neural network or convolutional neural network based encoders.

Deep Multi-Stage CSI Acquisition for Reconfigurable Intelligent Surface Aided MIMO Systems

no code implementations 23 Apr 2021 Shen Gao, Peihao Dong, Zhiwen Pan, Geoffrey Ye Li

This article aims to reduce the huge pilot overhead incurred when estimating the reconfigurable intelligent surface (RIS)-relayed wireless channel.

The Style-Content Duality of Attractiveness: Learning to Write Eye-Catching Headlines via Disentanglement

no code implementations 14 Dec 2020 Mingzhe Li, Xiuying Chen, Min Yang, Shen Gao, Dongyan Zhao, Rui Yan

In this paper, we propose a Disentanglement-based Attractive Headline Generator (DAHG) that generates a headline capturing the attractive content while following the attractive style.


Meaningful Answer Generation of E-Commerce Question-Answering

no code implementations 14 Nov 2020 Shen Gao, Xiuying Chen, Zhaochun Ren, Dongyan Zhao, Rui Yan

To generate more meaningful answers, in this paper, we propose a novel generative neural model, called the Meaningful Product Answer Generator (MPAG), which alleviates the safe answer problem by taking product reviews, product attributes, and a prototype answer into consideration.

Answer Generation · Question Answering +1

Learning to Respond with Your Favorite Stickers: A Framework of Unifying Multi-Modality and User Preference in Multi-Turn Dialog

no code implementations 5 Nov 2020 Shen Gao, Xiuying Chen, Li Liu, Dongyan Zhao, Rui Yan

Hence, in this paper, we propose to recommend an appropriate sticker to the user based on the multi-turn dialog context and the user's sticker-usage history.

VMSMO: Learning to Generate Multimodal Summary for Video-based News Articles

1 code implementation EMNLP 2020 Mingzhe Li, Xiuying Chen, Shen Gao, Zhangming Chan, Dongyan Zhao, Rui Yan

Hence, in this paper, we propose the task of Video-based Multimodal Summarization with Multimodal Output (VMSMO) to tackle such a problem.

From Standard Summarization to New Tasks and Beyond: Summarization with Manifold Information

no code implementations 10 May 2020 Shen Gao, Xiuying Chen, Zhaochun Ren, Dongyan Zhao, Rui Yan

Text summarization is the research area aiming at creating a short and condensed version of the original document, which conveys the main idea of the document in a few words.

Text Summarization

Learning to Respond with Stickers: A Framework of Unifying Multi-Modality in Multi-Turn Dialog

1 code implementation 10 Mar 2020 Shen Gao, Xiuying Chen, Chang Liu, Li Liu, Dongyan Zhao, Rui Yan

Stickers with vivid and engaging expressions are becoming increasingly popular in online messaging apps, and some works are dedicated to automatically selecting a sticker response by matching the text labels of stickers with previous utterances.

Reinforcement Learning Based Cooperative Coded Caching under Dynamic Popularities in Ultra-Dense Networks

no code implementations 8 Mar 2020 Shen Gao, Peihao Dong, Zhiwen Pan, Geoffrey Ye Li

For ultra-dense networks with wireless backhaul, the caching strategy at small base stations (SBSs), which usually have limited storage, is critical to meeting massive high-data-rate requests.

Q-Learning · Reinforcement Learning (RL)

RPM-Oriented Query Rewriting Framework for E-commerce Keyword-Based Sponsored Search

no code implementations 28 Oct 2019 Xiuying Chen, Daorui Xiao, Shen Gao, Guojun Liu, Wei Lin, Bo Zheng, Dongyan Zhao, Rui Yan

Sponsored search optimizes both revenue and relevance, where revenue is estimated by Revenue Per Mille (RPM).

How to Write Summaries with Patterns? Learning towards Abstractive Summarization through Prototype Editing

1 code implementation IJCNLP 2019 Shen Gao, Xiuying Chen, Piji Li, Zhangming Chan, Dongyan Zhao, Rui Yan

There are two main challenges in this task: (1) the model needs to incorporate learned patterns from the prototype, but (2) should avoid copying contents other than the patternized words, such as irrelevant facts, into the generated summaries.
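The "patternized words" idea in the excerpt above can be made concrete with a toy sketch. The word list, the `<ENT>` placeholder, and `extract_pattern` are illustrative assumptions, not the paper's prototype-editing mechanism:

```python
# Toy illustration: keep a prototype summary's function/template words
# as a reusable pattern and mask its content-specific facts, which the
# generator should not copy into the new summary.
PATTERN_WORDS = {"the", "a", "of", "in", "to", "and", "is", "was",
                 "on", "announced", "reported", "said"}

def extract_pattern(prototype_tokens):
    """Replace content words with a placeholder, keeping patternized words."""
    return [tok if tok.lower() in PATTERN_WORDS else "<ENT>"
            for tok in prototype_tokens]

proto = "Apple announced the iPhone in California".split()
pattern = extract_pattern(proto)
# pattern == ['<ENT>', 'announced', 'the', '<ENT>', 'in', '<ENT>']
```

A generator conditioned on such a pattern can reuse the prototype's phrasing while being forced to fill in facts from the source document rather than the prototype.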

Abstractive Text Summarization

Learning towards Abstractive Timeline Summarization

1 code implementation IJCAI 2019 Xiuying Chen, Zhangming Chan, Shen Gao, Meng-Hsuan Yu, Dongyan Zhao, Rui Yan

Timeline summarization aims to concisely summarize the evolution trajectory of events along the timeline, and existing timeline summarization approaches are all based on extractive methods. In this paper, we propose the task of abstractive timeline summarization, which aims to concisely paraphrase the information in the time-stamped events. Unlike traditional document summarization, timeline summarization needs to model the time-series information of the input events and summarize important events in chronological order.

To tackle this challenge, we propose a memory-based timeline summarization model (MTS). Concretely, we propose a time-event memory to establish a timeline, and use the time position of events on this timeline to guide the generation process. Besides, at each decoding step we incorporate event-level information into word-level attention to avoid confusion between events.

Extensive experiments are conducted on a large-scale real-world dataset, and the results show that MTS achieves state-of-the-art performance in terms of both automatic and human evaluations.
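The decoding-step idea described in the abstract (folding event-level information into word-level attention) can be sketched generically. This is a minimal illustration, not the MTS implementation; all scores, shapes, and the `hierarchical_attention` helper are assumptions for the example:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def hierarchical_attention(word_scores, word2event, event_scores):
    """Fold event-level attention into word-level attention: each word's
    weight is rescaled by the weight of the event it belongs to, so words
    from irrelevant events are suppressed at this decoding step."""
    word_attn = softmax(word_scores)
    event_attn = softmax(event_scores)
    combined = word_attn * event_attn[word2event]
    return combined / combined.sum()

# Two events with three words each; event 1 is the relevant one here.
word_scores = np.array([0.2, 0.1, 0.3, 0.9, 1.1, 0.8])
word2event = np.array([0, 0, 0, 1, 1, 1])
event_scores = np.array([0.1, 2.0])
attn = hierarchical_attention(word_scores, word2event, event_scores)
```

Words belonging to the higher-weighted event receive most of the attention mass, which is the kind of event-level disambiguation the abstract describes.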

Document Summarization · Timeline Summarization +2

Product-Aware Answer Generation in E-Commerce Question-Answering

1 code implementation 23 Jan 2019 Shen Gao, Zhaochun Ren, Yihong Eric Zhao, Dongyan Zhao, Dawei Yin, Rui Yan

In this paper, we propose the task of product-aware answer generation, which aims to generate an accurate and complete answer from large-scale unlabeled e-commerce reviews and product attributes.

Answer Generation · Question Answering

Abstractive Text Summarization by Incorporating Reader Comments

no code implementations 13 Dec 2018 Shen Gao, Xiuying Chen, Piji Li, Zhaochun Ren, Lidong Bing, Dongyan Zhao, Rui Yan

To tackle this problem, we propose the task of reader-aware abstractive summary generation, which utilizes reader comments to help the model produce a better summary of the main aspect.

Reader-Aware Summarization

Iterative Document Representation Learning Towards Summarization with Polishing

1 code implementation EMNLP 2018 Xiuying Chen, Shen Gao, Chongyang Tao, Yan Song, Dongyan Zhao, Rui Yan

In this paper, we introduce Iterative Text Summarization (ITS), an iteration-based model for supervised extractive text summarization, inspired by the observation that it is often necessary for a human to read an article multiple times in order to fully understand and summarize its contents.

Extractive Text Summarization · Representation Learning
