Search Results for author: Xian-Ling Mao

Found 73 papers, 34 papers with code

Conceptualization Topic Modeling

no code implementations • 7 Apr 2017 • Yi-Kun Tang, Xian-Ling Mao, He-Yan Huang, Guihua Wen

Recently, topic modeling has been widely used to discover the abstract topics in text corpora.

Topic Models

Supervised Deep Hashing for Hierarchical Labeled Data

no code implementations • 7 Apr 2017 • Dan Wang, He-Yan Huang, Chi Lu, Bo-Si Feng, Liqiang Nie, Guihua Wen, Xian-Ling Mao

Specifically, we define a novel similarity formula for hierarchical labeled data by weighting each layer, and design a deep convolutional neural network to obtain a hash code for each data point.

Deep Hashing • Image Retrieval
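
The entry above mentions a similarity formula for hierarchical labels obtained by weighting each layer. Below is a minimal, hedged sketch of that general idea; the label encoding, weights, and function name are illustrative assumptions, not the paper's actual formula.

```python
import numpy as np

def hierarchical_similarity(labels_a, labels_b, layer_weights):
    """Toy layer-weighted similarity for hierarchical labels (illustrative only).

    labels_a, labels_b: label ids, one per hierarchy layer, coarsest first,
                        e.g. ["animal", "dog", "husky"].
    layer_weights:      one non-negative weight per layer; deeper layers can be
                        weighted more to reward fine-grained agreement.
    """
    matches = np.array([a == b for a, b in zip(labels_a, labels_b)], dtype=float)
    weights = np.asarray(layer_weights, dtype=float)
    return float((weights * matches).sum() / weights.sum())

# Two items that agree on the first two of three layers.
print(hierarchical_similarity(["animal", "dog", "husky"],
                              ["animal", "dog", "poodle"],
                              layer_weights=[1.0, 2.0, 4.0]))  # ~0.43
```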

Object Detection based Deep Unsupervised Hashing

no code implementations • 24 Nov 2018 • Rong-Cheng Tu, Xian-Ling Mao, Bo-Si Feng, Bing-Bing Bian, Yu-shu Ying

Recently, similarity-preserving hashing methods have been extensively studied for large-scale image retrieval.

Image Retrieval • Novel Object Detection • +4

Deep Cross-Modal Hashing with Hashing Functions and Unified Hash Codes Jointly Learning

no code implementations • 29 Jul 2019 • Rong-Cheng Tu, Xian-Ling Mao, Bing Ma, Yong Hu, Tan Yan, Wei Wei, He-Yan Huang

Specifically, by an iterative optimization algorithm, DCHUC jointly learns unified hash codes for image-text pairs in a database and a pair of hash functions for unseen query image-text pairs.

Retrieval

Deep Hashing for Signed Social Network Embedding

no code implementations • 12 Aug 2019 • Jia-Nan Guo, Xian-Ling Mao, Xiao-Jian Jiang, Ying-Xiang Sun, Wei Wei, He-Yan Huang

Network embedding is a promising way of network representation, facilitating many signed social network processing and analysis tasks such as link prediction and node classification.

Deep Hashing • Link Prediction • +2

Complicated Table Structure Recognition

1 code implementation • 13 Aug 2019 • Zewen Chi, He-Yan Huang, Heng-Da Xu, Houjin Yu, Wanxuan Yin, Xian-Ling Mao

Recognizing table structures in PDF files has also attracted a lot of attention.

Multi-task Learning for Low-resource Second Language Acquisition Modeling

1 code implementation • 25 Aug 2019 • Yong Hu, He-Yan Huang, Tian Lan, Xiaochi Wei, Yuxiang Nie, Jiarui Qi, Liner Yang, Xian-Ling Mao

Second language acquisition (SLA) modeling aims to predict whether second language learners can correctly answer questions based on what they have learned.

Language Acquisition • Multi-Task Learning

Stack-VS: Stacked Visual-Semantic Attention for Image Caption Generation

no code implementations • 5 Sep 2019 • Wei Wei, Ling Cheng, Xian-Ling Mao, Guangyou Zhou, Feida Zhu

Recently, automatic image caption generation has become an important focus of work on the multimodal translation task.

Attribute • Caption Generation

Generative Dialog Policy for Task-oriented Dialog Systems

no code implementations • 17 Sep 2019 • Tian Lan, Xian-Ling Mao, He-Yan Huang

As far as we know, existing task-oriented dialogue systems obtain the dialogue policy through classification, which assigns to a dialogue action either a single dialogue act with its corresponding parameters or multiple dialogue acts without their corresponding parameters.

General Classification • Task-Oriented Dialogue Systems

Cross-Lingual Natural Language Generation via Pre-Training

1 code implementation • 23 Sep 2019 • Zewen Chi, Li Dong, Furu Wei, Wenhui Wang, Xian-Ling Mao, He-Yan Huang

In this work, we focus on transferring supervision signals of natural language generation (NLG) tasks between multiple languages.

Abstractive Text Summarization • Machine Translation • +5

SEPT: Improving Scientific Named Entity Recognition with Span Representation

no code implementations • 8 Nov 2019 • Tan Yan, He-Yan Huang, Xian-Ling Mao

We introduce a new scientific named entity recognizer called SEPT, which stands for Span Extractor with Pre-trained Transformers.

named-entity-recognition • Named Entity Recognition • +1

DialogAct2Vec: Towards End-to-End Dialogue Agent by Multi-Task Representation Learning

no code implementations • 11 Nov 2019 • Zhuoxuan Jiang, Ziming Huang, Dong Sheng Li, Xian-Ling Mao

In this paper, we propose a novel joint end-to-end model based on multi-task representation learning, named DialogAct2Vec, which captures knowledge from heterogeneous information by automatically learning knowledgeable low-dimensional embeddings from data.

Multi-Task Learning • Representation Learning

When to Talk: Chatbot Controls the Timing of Talking during Multi-turn Open-domain Dialogue Generation

no code implementations • 20 Dec 2019 • Tian Lan, Xian-Ling Mao, He-Yan Huang, Wei Wei

Intuitively, a dialogue model that can control the timing of talking autonomously based on the conversation context can chat with humans more naturally.

Dialogue Generation

PONE: A Novel Automatic Evaluation Metric for Open-Domain Generative Dialogue Systems

1 code implementation • 6 Apr 2020 • Tian Lan, Xian-Ling Mao, Wei Wei, Xiaoyan Gao, He-Yan Huang

Through extensive experiments, we demonstrate that learning-based metrics are the most effective evaluation metrics for open-domain generative dialogue systems.

Dialogue Evaluation

Learning Relation Ties with a Force-Directed Graph in Distant Supervised Relation Extraction

no code implementations • 21 Apr 2020 • Yuming Shang, Heyan Huang, Xin Sun, Xian-Ling Mao

Then, we borrow the idea of Coulomb's Law from physics and introduce the concepts of attractive and repulsive forces into this graph to learn the correlation and mutual exclusion between relations.

Relation • Relation Extraction
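
The entry above borrows Coulomb's Law to model attraction between correlated relations and repulsion between mutually exclusive ones. A toy sketch of that physics analogy follows; the force form, constants, and node "charges" are assumptions for illustration, not the paper's learned quantities.

```python
import numpy as np

def coulomb_force(pos_i, pos_j, charge_i, charge_j, k=1.0, eps=1e-8):
    """Toy Coulomb-style interaction between two relation nodes.

    Like charges repel (mutually exclusive relations pushed apart);
    unlike charges attract (correlated relations pulled together).
    Returns the force vector acting on node i.
    """
    diff = pos_i - pos_j
    dist = np.linalg.norm(diff) + eps
    # Inverse-square magnitude; the sign comes from the product of charges.
    magnitude = k * charge_i * charge_j / dist ** 2
    return magnitude * diff / dist

# Two correlated relations (opposite charges) attract each other.
f = coulomb_force(np.array([0.0, 0.0]), np.array([1.0, 0.0]), +1.0, -1.0)
print(f)  # force on node i points toward node j
```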

Weibo-COV: A Large-Scale COVID-19 Social Media Dataset from Weibo

1 code implementation • EMNLP (NLP-COVID19) 2020 • Yong Hu, He-Yan Huang, Anfan Chen, Xian-Ling Mao

Therefore, in this paper, we release Weibo-COV, the first large-scale COVID-19 social media dataset from Weibo, covering more than 30 million tweets from 1 November 2019 to 30 April 2020.

Social and Information Networks

Generating Informative Dialogue Responses with Keywords-Guided Networks

no code implementations • 3 Jul 2020 • Heng-Da Xu, Xian-Ling Mao, Zewen Chi, Jing-Jing Zhu, Fanshu Sun, He-Yan Huang

Specifically, KW-Seq2Seq first uses a keywords decoder to predict some topic keywords, and then generates the final response under the guidance of them.

InfoXLM: An Information-Theoretic Framework for Cross-Lingual Language Model Pre-Training

4 code implementations • NAACL 2021 • Zewen Chi, Li Dong, Furu Wei, Nan Yang, Saksham Singhal, Wenhui Wang, Xia Song, Xian-Ling Mao, He-Yan Huang, Ming Zhou

In this work, we present an information-theoretic framework that formulates cross-lingual language model pre-training as maximizing mutual information between multilingual-multi-granularity texts.

Contrastive Learning • Cross-Lingual Transfer • +2
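
The InfoXLM entry above frames cross-lingual pre-training as maximizing mutual information between multilingual texts via contrastive learning. The sketch below shows a generic InfoNCE-style loss, which is one standard way to maximize a lower bound on mutual information; it is illustrative only and not the exact InfoXLM objective.

```python
import numpy as np

def info_nce(anchors, positives, temperature=0.1):
    """Generic InfoNCE loss over a batch of paired representations.

    anchors, positives: (batch, dim) arrays; row i of `positives` is the
    paired view (e.g. translation) of row i of `anchors`, and all other rows
    in the batch serve as negatives. Minimizing this loss maximizes a lower
    bound on the mutual information between the two views.
    """
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = a @ p.T / temperature                # (batch, batch) similarities
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_probs)))    # diagonal = matched pairs

rng = np.random.default_rng(0)
x = rng.normal(size=(8, 16))
print(info_nce(x, x + 0.05 * rng.normal(size=(8, 16))))  # small loss for aligned pairs
```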

Which Kind Is Better in Open-domain Multi-turn Dialog, Hierarchical or Non-hierarchical Models? An Empirical Study

no code implementations • 7 Aug 2020 • Tian Lan, Xian-Ling Mao, Wei Wei, He-Yan Huang

Thus, in this paper, we systematically measure nearly all representative hierarchical and non-hierarchical models under the same experimental settings to determine which kind is better.

Deep Kernel Supervised Hashing for Node Classification in Structural Networks

no code implementations • 26 Oct 2020 • Jia-Nan Guo, Xian-Ling Mao, Shu-Yang Lin, Wei Wei, Heyan Huang

However, nearly all existing network embedding based methods can hardly capture the actual category features of a node because of the linear inseparability problem in low-dimensional space; meanwhile, they cannot simultaneously incorporate network structure information and node label information into the network embedding.

Classification • General Classification • +2

Deep Cross-modal Hashing via Margin-dynamic-softmax Loss

no code implementations • 6 Nov 2020 • Rong-Cheng Tu, Xian-Ling Mao, Rongxin Tu, Binbin Bian, Wei Wei, Heyan Huang

Finally, by minimizing the novel \textit{margin-dynamic-softmax loss}, the modality-specific hashing networks can be trained to generate hash codes which can simultaneously preserve the cross-modal similarity and abundant semantic information well.

Cross-Modal Retrieval • Retrieval

User-based Network Embedding for Collective Opinion Spammer Detection

no code implementations • 16 Nov 2020 • Ziyang Wang, Wei Wei, Xian-Ling Mao, Guibing Guo, Pan Zhou, Shanshan Feng

Due to the huge commercial interests behind online reviews, a tremendous number of spammers manufacture spam reviews for product reputation manipulation.

Network Embedding • Relation

Exploring Global Information for Session-based Recommendation

no code implementations • 20 Nov 2020 • Ziyang Wang, Wei Wei, Gao Cong, Xiao-Li Li, Xian-Ling Mao, Minghui Qiu, Shanshan Feng

Based on BGNN, we propose a novel approach, called Session-based Recommendation with Global Information (SRGI), which infers the user preferences via fully exploring global item-transitions over all sessions from two different perspectives: (i) Fusion-based Model (SRGI-FM), which recursively incorporates the neighbor embeddings of each node on global graph into the learning process of session level item representation; and (ii) Constrained-based Model (SRGI-CM), which treats the global-level item-transition information as a constraint to ensure the learned item embeddings are consistent with the global item-transition.

Session-Based Recommendations

Exploiting Group-level Behavior Pattern for Session-based Recommendation

no code implementations • 10 Dec 2020 • Ziyang Wang, Wei Wei, Xian-Ling Mao, Xiao-Li Li, Shanshan Feng

In RNMSR, we propose to learn user preference at both the instance level and the group level: (i) at the instance level, we employ GNNs on a similarity-based item-pairwise session graph to capture users' instance-level preference.

Representation Learning • Session-Based Recommendations

Ultra-Fast, Low-Storage, Highly Effective Coarse-grained Selection in Retrieval-based Chatbot by Using Deep Semantic Hashing

1 code implementation • 17 Dec 2020 • Tian Lan, Xian-Ling Mao, Xiaoyan Gao, Wei Wei, Heyan Huang

Specifically, in our proposed DSHC model, a hashing optimizing module that consists of two autoencoder models is stacked on a trained dense representation model, and three loss functions are designed to optimize it.

Chatbot • Open-Ended Question Answering • +1
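
The DSHC entry above speeds up coarse-grained candidate selection by hashing dense representations. The following sketch only illustrates why hashing helps at this stage: binary codes can be compared by cheap Hamming distance instead of dense dot products. The sign-based binarization and names here are assumptions; the actual model learns codes with stacked autoencoders and dedicated losses.

```python
import numpy as np

def to_hash_codes(embeddings):
    """Binarize dense response embeddings into sign-based hash codes.

    Illustration only: real systems such as DSHC learn the codes rather
    than taking raw signs of the dense representations.
    """
    return (embeddings > 0).astype(np.uint8)

def hamming_topk(query_code, corpus_codes, k=3):
    """Return indices of the k corpus codes closest in Hamming distance."""
    dists = (corpus_codes != query_code).sum(axis=1)
    return np.argsort(dists)[:k]

rng = np.random.default_rng(1)
corpus = to_hash_codes(rng.normal(size=(1000, 128)))   # hashed candidate responses
query = to_hash_codes(rng.normal(size=(1, 128)))[0]    # hashed dialogue context
print(hamming_topk(query, corpus, k=3))                # coarse-grained candidates
```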

Self-attention Comparison Module for Boosting Performance on Retrieval-based Open-Domain Dialog Systems

no code implementations • 21 Dec 2020 • Tian Lan, Xian-Ling Mao, Zhipeng Zhao, Wei Wei, Heyan Huang

Since pre-trained language models are widely used, retrieval-based open-domain dialog systems have attracted considerable attention from researchers recently.

Open-Domain Dialog • Retrieval

A Robust and Domain-Adaptive Approach for Low-Resource Named Entity Recognition

1 code implementation • 2 Jan 2021 • Houjin Yu, Xian-Ling Mao, Zewen Chi, Wei Wei, Heyan Huang

Recently, it has attracted much attention to build reliable named entity recognition (NER) systems using limited annotated data.

Ranked #3 on Named Entity Recognition (NER) on SciERC (using extra training data)

Low Resource Named Entity Recognition • named-entity-recognition • +2

Global Context Enhanced Graph Neural Networks for Session-based Recommendation

2 code implementations • 9 Jun 2021 • Ziyang Wang, Wei Wei, Gao Cong, Xiao-Li Li, Xian-Ling Mao, Minghui Qiu

In GCE-GNN, we propose a novel global-level item representation learning layer, which employs a session-aware attention mechanism to recursively incorporate the neighbors' embeddings of each node on the global graph.

Representation Learning • Session-Based Recommendations

Context-aware Entity Typing in Knowledge Graphs

1 code implementation • Findings (EMNLP) 2021 • Weiran Pan, Wei Wei, Xian-Ling Mao

Knowledge graph entity typing aims to infer entities' missing types in knowledge graphs, which is an important but under-explored issue.

Entity Typing • Knowledge Graphs

Cross-Lingual Language Model Meta-Pretraining

no code implementations • 23 Sep 2021 • Zewen Chi, Heyan Huang, Luyang Liu, Yu Bai, Xian-Ling Mao

The success of pretrained cross-lingual language models relies on two essential abilities, i.e., generalization ability for learning downstream tasks in a source language, and cross-lingual transferability for transferring the task knowledge to other languages.

Cross-Lingual Transfer • Language Modelling

Exploring Dense Retrieval for Dialogue Response Selection

1 code implementation • 13 Oct 2021 • Tian Lan, Deng Cai, Yan Wang, Yixuan Su, Heyan Huang, Xian-Ling Mao

In this study, we present a solution to directly select proper responses from a large corpus or even a nonparallel corpus that only consists of unpaired sentences, using a dense retrieval model.

Conversational Response Selection • Retrieval

OneRel: Joint Entity and Relation Extraction with One Module in One Step

no code implementations • 10 Mar 2022 • Yu-Ming Shang, Heyan Huang, Xian-Ling Mao

Joint entity and relation extraction is an essential task in natural language processing and knowledge graph construction.

graph construction • Joint Entity and Relation Extraction • +2

Hammer PDF: An Intelligent PDF Reader for Scientific Papers

no code implementations • 6 Apr 2022 • Sheng-Fu Wang, Shu-Hang Liu, Tian-Yi Che, Yi-Fan Lu, Song-Xiao Yang, Heyan Huang, Xian-Ling Mao

Specifically, because they treat a paper as a basic and separate unit, existing PDF readers cannot access extended information about the paper, such as corresponding videos, blogs and code.

BiSyn-GAT+: Bi-Syntax Aware Graph Attention Network for Aspect-based Sentiment Analysis

1 code implementation • Findings (ACL) 2022 • Shuo Liang, Wei Wei, Xian-Ling Mao, Fei Wang, Zhiyong He

Aspect-based sentiment analysis (ABSA) is a fine-grained sentiment analysis task that aims to align aspects and corresponding sentiments for aspect-specific sentiment polarity inference.

Aspect-Based Sentiment Analysis • Aspect-Based Sentiment Analysis (ABSA) • +2

Cross-Lingual Phrase Retrieval

1 code implementation • ACL 2022 • Heqi Zheng, Xiao Zhang, Zewen Chi, Heyan Huang, Tan Yan, Tian Lan, Wei Wei, Xian-Ling Mao

In this paper, we propose XPR, a cross-lingual phrase retriever that extracts phrase representations from unlabeled example sentences.

Retrieval • Sentence

Multi-level Cross-view Contrastive Learning for Knowledge-aware Recommender System

1 code implementation • 19 Apr 2022 • Ding Zou, Wei Wei, Xian-Ling Mao, Ziyang Wang, Minghui Qiu, Feida Zhu, Xin Cao

Different from traditional contrastive learning methods which generate two graph views by uniform data augmentation schemes such as corruption or dropping, we comprehensively consider three different graph views for KG-aware recommendation, including global-level structural view, local-level collaborative and semantic views.

Contrastive Learning • Data Augmentation • +2

Relational Triple Extraction: One Step is Enough

no code implementations • 11 May 2022 • Yu-Ming Shang, Heyan Huang, Xin Sun, Wei Wei, Xian-Ling Mao

Extracting relational triples from unstructured text is an essential task in natural language processing and knowledge graph construction.

graph construction • Sentence

Person-job fit estimation from candidate profile and related recruitment history with co-attention neural networks

1 code implementation • 18 Jun 2022 • Ziyang Wang, Wei Wei, Chenwei Xu, Jun Xu, Xian-Ling Mao

Existing studies on person-job fit, however, mainly focus on calculating the similarity between the candidate resumes and the job postings on the basis of their contents, without taking the recruiters' experience (i.e., historical successful recruitment records) into consideration.

Unsupervised Question Answering via Answer Diversifying

1 code implementation • COLING 2022 • Yuxiang Nie, Heyan Huang, Zewen Chi, Xian-Ling Mao

Previous works usually make use of heuristic rules as well as pre-trained models to construct data and train QA models.

Data Augmentation • Denoising • +4

Multi-level Contrastive Learning Framework for Sequential Recommendation

no code implementations • 27 Aug 2022 • Ziyang Wang, Huoyu Liu, Wei Wei, Yue Hu, Xian-Ling Mao, Shaojian He, Rui Fang, Dangyang Chen

Different from the previous contrastive learning-based methods for SR, MCLSR learns the representations of users and items through a cross-view contrastive learning paradigm from four specific views at two different levels (i.e., interest- and feature-level).

Contrastive Learning • Relation • +1

ET5: A Novel End-to-end Framework for Conversational Machine Reading Comprehension

1 code implementation • COLING 2022 • Xiao Zhang, Heyan Huang, Zewen Chi, Xian-Ling Mao

Conversational machine reading comprehension (CMRC) aims to assist computers to understand a natural language text and thereafter engage in a multi-turn conversation to answer questions related to the text.

Decision Making • Machine Reading Comprehension

Unsupervised Hashing with Semantic Concept Mining

1 code implementation • 23 Sep 2022 • Rong-Cheng Tu, Xian-Ling Mao, Kevin Qinghong Lin, Chengfei Cai, Weize Qin, Hongfa Wang, Wei Wei, Heyan Huang

Recently, to improve unsupervised image retrieval performance, plenty of unsupervised hashing methods have been proposed that design a semantic similarity matrix based on the similarities between image features extracted by a pre-trained CNN model.

Image Retrieval • Prompt Engineering • +4

Sequential Topic Selection Model with Latent Variable for Topic-Grounded Dialogue

no code implementations • 17 Oct 2022 • Xiaofei Wen, Wei Wei, Xian-Ling Mao

To address the problem, in this paper we propose a novel approach, named Sequential Global Topic Attention (SGTA), to exploit topic transition over all conversations in a subtle way, for better modeling post-to-response topic-transition and guiding the response generation for the current conversation.

Response Generation

HCL-TAT: A Hybrid Contrastive Learning Method for Few-shot Event Detection with Task-Adaptive Threshold

no code implementations • 17 Oct 2022 • Ruihan Zhang, Wei Wei, Xian-Ling Mao, Rui Fang, Dangyang Chen

Conventional event detection models under supervised learning settings suffer from the inability to transfer to newly-emerged event types owing to a lack of sufficient annotations.

Contrastive Learning • Event Detection • +2

STAGE: Span Tagging and Greedy Inference Scheme for Aspect Sentiment Triplet Extraction

1 code implementation • 28 Nov 2022 • Shuo Liang, Wei Wei, Xian-Ling Mao, Yuanyuan Fu, Rui Fang, Dangyang Chen

Hence, we propose a novel approach, Span TAgging and Greedy infErence (STAGE), to extract sentiment triplets in span-level, where each span may consist of multiple words and play different roles simultaneously.

Aspect Sentiment Triplet Extraction • Sentence • +1

Momentum Decoding: Open-ended Text Generation As Graph Exploration

1 code implementation • 5 Dec 2022 • Tian Lan, Yixuan Su, Shuhang Liu, Heyan Huang, Xian-Ling Mao

In this study, we formulate open-ended text generation from a new perspective, i.e., we view it as an exploration process within a directed graph.

Text Generation

Gated Mechanism Enhanced Multi-Task Learning for Dialog Routing

no code implementations • COLING 2022 • Ziming Huang, Zhuoxuan Jiang, Ke Wang, Juntao Li, Shanshan Feng, Xian-Ling Mao

Although most existing methods can fulfil this requirement, they can only model single-source dialog data and cannot effectively capture the underlying knowledge of relations among data and subtasks.

Multi-Task Learning

AttenWalker: Unsupervised Long-Document Question Answering via Attention-based Graph Walking

1 code implementation • 3 May 2023 • Yuxiang Nie, Heyan Huang, Wei Wei, Xian-Ling Mao

To alleviate the problem, it might be possible to generate long-document QA pairs via unsupervised question answering (UQA) methods.

Few-Shot Learning • Question Answering

Towards Hierarchical Policy Learning for Conversational Recommendation with Hypergraph-based Reinforcement Learning

1 code implementation • 4 May 2023 • Sen Zhao, Wei Wei, Yifan Liu, Ziyang Wang, Wendi Li, Xian-Ling Mao, Shuai Zhu, Minghui Yang, Zujie Wen

Conversational recommendation systems (CRS) aim to timely and proactively acquire user dynamic preferred attributes through conversations for item recommendation.

Attribute • Decision Making • +2

Measuring Cross-Lingual Transferability of Multilingual Transformers on Sentence Classification

no code implementations • 15 May 2023 • Zewen Chi, Heyan Huang, Xian-Ling Mao

Recent studies have exhibited remarkable capabilities of pre-trained multilingual Transformers, especially cross-lingual transferability.

Cross-Lingual Transfer • Sentence • +1

An Empirical Study on the Language Modal in Visual Question Answering

no code implementations • 17 May 2023 • Daowan Peng, Wei Wei, Xian-Ling Mao, Yuanyuan Fu, Dangyang Chen

Generalization beyond in-domain experience to out-of-distribution data is of paramount significance in the AI domain.

Question Answering • Visual Question Answering

SciMRC: Multi-perspective Scientific Machine Reading Comprehension

no code implementations • 25 Jun 2023 • Xiao Zhang, Heqi Zheng, Yuxiang Nie, Heyan Huang, Xian-Ling Mao

However, the dataset has ignored the fact that different readers may have different levels of understanding of the text, and only includes single-perspective question-answer pairs, leading to a lack of consideration of different perspectives.

Machine Reading Comprehension

Copy Is All You Need

1 code implementation • 13 Jul 2023 • Tian Lan, Deng Cai, Yan Wang, Heyan Huang, Xian-Ling Mao

The dominant text generation models compose the output by sequentially selecting words from a fixed vocabulary.

Domain Adaptation • Language Modelling • +1

TREA: Tree-Structure Reasoning Schema for Conversational Recommendation

1 code implementation • 20 Jul 2023 • Wendi Li, Wei Wei, Xiaoye Qu, Xian-Ling Mao, Ye Yuan, Wenfeng Xie, Dangyang Chen

TREA constructs a multi-hierarchical scalable tree as the reasoning structure to clarify the causal relationships between mentioned entities, and fully utilizes historical conversations to generate more reasonable and suitable responses for recommended results.

Knowledge Graphs • Recommendation Systems

Multi-view Hypergraph Contrastive Policy Learning for Conversational Recommendation

1 code implementation • 26 Jul 2023 • Sen Zhao, Wei Wei, Xian-Ling Mao, Shuai Zhu, Minghui Yang, Zujie Wen, Dangyang Chen, Feida Zhu

Specifically, MHCPL timely chooses useful social information according to the interactive history and builds a dynamic hypergraph with three types of multiplex relations from different views.

Recommendation Systems

CriticBench: Evaluating Large Language Models as Critic

1 code implementation • 21 Feb 2024 • Tian Lan, Wenwei Zhang, Chen Xu, Heyan Huang, Dahua Lin, Kai Chen, Xian-Ling Mao

Critique ability is crucial for the scalable oversight and self-improvement of Large Language Models (LLMs).

Mix-Initiative Response Generation with Dynamic Prefix Tuning

no code implementations • 26 Mar 2024 • Yuxiang Nie, Heyan Huang, Xian-Ling Mao, Lizi Liao

Specifically, IDPT decouples initiative factors into different prefix parameters and uses the attention mechanism to adjust the selection of initiatives in guiding generation dynamically.

Response Generation
