Search Results for author: Bing Liu

Found 156 papers, 59 papers with code

Summarizing Behavioral Change Goals from SMS Exchanges to Support Health Coaches

no code implementations SIGDIAL (ACL) 2021 Itika Gupta, Barbara Di Eugenio, Brian D. Ziebart, Bing Liu, Ben S. Gerber, Lisa K. Sharp

In this paper, we present our work towards assisting health coaches by extracting the physical activity goal the user and coach negotiate via text messages.

Towards Enhancing Health Coaching Dialogue in Low-Resource Settings

no code implementations COLING 2022 Yue Zhou, Barbara Di Eugenio, Brian Ziebart, Lisa Sharp, Bing Liu, Ben Gerber, Nikolaos Agadakos, Shweta Yadav

In this paper, we propose to build a dialogue system that converses with the patients, helps them create and accomplish specific goals, and can address their emotions with empathy.

Empathetic Response Generation Response Generation

Human-Human Health Coaching via Text Messages: Corpus, Annotation, and Analysis

no code implementations SIGDIAL (ACL) 2020 Itika Gupta, Barbara Di Eugenio, Brian Ziebart, Aiswarya Baiju, Bing Liu, Ben Gerber, Lisa Sharp, Nadia Nabulsi, Mary Smart

In this paper, we discuss these schemas and briefly talk about their application for automatically extracting activity goals and annotating the second round of data, collected with different health coaches and patients.

Class Incremental Learning via Likelihood Ratio Based Task Prediction

1 code implementation 26 Sep 2023 Haowei Lin, Yijia Shao, Weinan Qian, Ningxin Pan, Yiduo Guo, Bing Liu

An emerging, theoretically justified and effective approach is to train a task-specific model for each task within a network shared by all tasks, using a task-incremental learning (TIL) method to deal with forgetting.

class-incremental learning Class Incremental Learning +1

Zero-shot information extraction from radiological reports using ChatGPT

no code implementations 4 Sep 2023 Danqing Hu, Bing Liu, Xiaofeng Zhu, Xudong Lu, Nan Wu

Information extraction is the strategy to transform the sequence of characters into structured data, which can be employed for secondary analysis.

Language Modelling Large Language Model +3

CT-Net: Arbitrary-Shaped Text Detection via Contour Transformer

no code implementations 25 Jul 2023 Zhiwen Shao, Yuchen Su, Yong Zhou, Fanrong Meng, Hancheng Zhu, Bing Liu, Rui Yao

Contour based scene text detection methods have rapidly developed recently, but still suffer from inaccurate frontend contour initialization, multi-stage error accumulation, or deficient local information aggregation.

Scene Text Detection Text Detection

Parameter-Level Soft-Masking for Continual Learning

1 code implementation 26 Jun 2023 Tatsuya Konishi, Mori Kurokawa, Chihiro Ono, Zixuan Ke, Gyuhak Kim, Bing Liu

Although several techniques have achieved learning with no CF, they attain it by letting each task monopolize a sub-network in a shared network, which seriously limits knowledge transfer (KT) and causes over-consumption of the network capacity, i.e., as more tasks are learned, the performance deteriorates.

Continual Learning Incremental Learning +1
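The soft-masking idea in this excerpt can be sketched in a few lines (an illustrative numpy reconstruction under stated assumptions, not the authors' implementation): each parameter's gradient is attenuated in proportion to its accumulated importance to previous tasks, so important parameters are protected from forgetting while the remaining capacity keeps learning.

```python
import numpy as np

def soft_masked_step(params, grads, importance, lr=0.1):
    """One SGD step with parameter-level soft-masking: the update of
    each parameter is scaled by (1 - importance), where importance in
    [0, 1] records how much previous tasks rely on that parameter
    (1 = fully protected, 0 = free to change)."""
    return params - lr * (1.0 - importance) * grads

# Toy example: three parameters with different importance values.
params = np.array([1.0, 1.0, 1.0])
grads = np.array([1.0, 1.0, 1.0])
importance = np.array([1.0, 0.5, 0.0])  # first frozen, last unconstrained
new_params = soft_masked_step(params, grads, importance)
# → [1.0, 0.95, 0.9]: the protected parameter does not move at all.
```

Unlike hard (binary) masks, the soft mask lets partially important parameters still receive a reduced update, which is what preserves some knowledge transfer.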

Class-Incremental Learning based on Label Generation

1 code implementation 22 Jun 2023 Yijia Shao, Yiduo Guo, Dongyan Zhao, Bing Liu

Despite the great success of pre-trained language models, it is still a challenge to use these models for continual learning, especially for the class-incremental learning (CIL) setting due to catastrophic forgetting (CF).

class-incremental learning Class Incremental Learning +1

Dealing with Cross-Task Class Discrimination in Online Continual Learning

1 code implementation CVPR 2023 Yiduo Guo, Bing Liu, Dongyan Zhao

A novel optimization objective with a gradient-based adaptive method is proposed to dynamically deal with the problem in the online CL process.

class-incremental learning Class Incremental Learning +1

Sentiment Analysis in the Era of Large Language Models: A Reality Check

1 code implementation 24 May 2023 Wenxuan Zhang, Yue Deng, Bing Liu, Sinno Jialin Pan, Lidong Bing

This paper aims to provide a comprehensive investigation into the capabilities of LLMs in performing various sentiment analysis tasks, from conventional sentiment classification to aspect-based sentiment analysis and multifaceted analysis of subjective texts.

Few-Shot Learning Sentiment Analysis +1

Do We Need an Encoder-Decoder to Model Dynamical Systems on Networks?

no code implementations 20 May 2023 Bing Liu, Wei Luo, Gang Li, Jing Huang, Bo Yang

As deep learning gains popularity in modelling dynamical systems, we expose an underappreciated misunderstanding relevant to modelling dynamics on networks.

Time Series

Analyzing and Reducing the Performance Gap in Cross-Lingual Transfer with Fine-tuning Slow and Fast

no code implementations 19 May 2023 Yiduo Guo, Yaobo Liang, Dongyan Zhao, Bing Liu, Duan Nan

Existing research has shown that a multilingual pre-trained language model fine-tuned with one (source) language also performs well on downstream tasks for non-source languages, even though no fine-tuning is done on these languages.

Cross-Lingual Transfer Language Modelling

A Unified Evaluation Framework for Novelty Detection and Accommodation in NLP with an Instantiation in Authorship Attribution

no code implementations 8 May 2023 Neeraj Varshney, Himanshu Gupta, Eric Robertson, Bing Liu, Chitta Baral

To initiate systematic research in this important area of 'dealing with novelties', we introduce 'NoveltyTask', a multi-stage task to evaluate a system's performance on pipelined novelty 'detection' and 'accommodation' tasks.

Novelty Detection

Open-World Continual Learning: Unifying Novelty Detection and Continual Learning

no code implementations 20 Apr 2023 Gyuhak Kim, Changnan Xiao, Tatsuya Konishi, Zixuan Ke, Bing Liu

The key theoretical result is that regardless of whether WP and OOD detection (or TP) are defined explicitly or implicitly by a CIL algorithm, good WP and good OOD detection are necessary and sufficient conditions for good CIL, which unifies novelty or OOD detection and continual learning (CIL, in particular).

class-incremental learning Class Incremental Learning +3
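The decomposition of class-incremental learning (CIL) into within-task prediction (WP) and task prediction (TP) described in this excerpt can be illustrated with a small sketch (hypothetical models and numbers, not the paper's code): the final score of a class is its within-task probability weighted by the predicted task probability.

```python
import numpy as np

def cil_predict(x, task_models, task_scorer):
    """Combine within-task prediction (WP) with task prediction (TP).

    task_models: list of functions mapping x -> probability vector over
                 that task's own classes, i.e. p(y | x, task=t).
    task_scorer: function mapping x -> probability vector over tasks,
                 i.e. p(task=t | x) (e.g. derived from OOD scores).
    Returns the (task_id, class_id) with the highest joint score.
    """
    task_probs = task_scorer(x)
    best, best_score = None, -1.0
    for t, model in enumerate(task_models):
        joint = task_probs[t] * model(x)   # p(y, task=t | x)
        c = int(np.argmax(joint))
        if joint[c] > best_score:
            best, best_score = (t, c), float(joint[c])
    return best

# Toy example: two tasks with two classes each (made-up probabilities).
task_models = [lambda x: np.array([0.9, 0.1]),
               lambda x: np.array([0.2, 0.8])]
task_scorer = lambda x: np.array([0.3, 0.7])  # TP favours task 1
pred = cil_predict(None, task_models, task_scorer)
# → (1, 1): joint score 0.7 * 0.8 = 0.56 beats 0.3 * 0.9 = 0.27
```

The excerpt's claim is about exactly these two factors: if both the per-task classifiers (WP) and the task/OOD scorer (TP) are good, the combined CIL prediction is good, and vice versa.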

Grab What You Need: Rethinking Complex Table Structure Recognition with Flexible Components Deliberation

no code implementations 16 Mar 2023 Hao Liu, Xin Li, Mingming Gong, Bing Liu, Yunfei Wu, Deqiang Jiang, Yinsong Liu, Xing Sun

Recently, Table Structure Recognition (TSR) task, aiming at identifying table structure into machine readable formats, has received increasing interest in the community.

Adapting a Language Model While Preserving its General Knowledge

2 code implementations 21 Jan 2023 Zixuan Ke, Yijia Shao, Haowei Lin, Hu Xu, Lei Shu, Bing Liu

This paper shows that the existing methods are suboptimal and proposes a novel method to perform a more informed adaptation of the knowledge in the LM by (1) soft-masking the attention heads based on their importance to best preserve the general knowledge in the LM and (2) contrasting the representations of the general and the full (both general and domain knowledge) to learn an integrated representation with both general and domain-specific knowledge.

Continual Learning General Knowledge +1

Joint Spatio-Temporal Modeling for the Semantic Change Detection in Remote Sensing Images

1 code implementation 10 Dec 2022 Lei Ding, Jing Zhang, Kai Zhang, Haitao Guo, Bing Liu, Lorenzo Bruzzone

Semantic Change Detection (SCD) refers to the task of simultaneously extracting the changed areas and the semantic categories (before and after the changes) in Remote Sensing Images (RSIs).

Change Detection

Dependency-aware Self-training for Entity Alignment

1 code implementation 29 Nov 2022 Bing Liu, Tiancheng Lan, Wen Hua, Guido Zuccon

Entity Alignment (EA), which aims to detect entity mappings (i.e., equivalent entity pairs) in different Knowledge Graphs (KGs), is critical for KG fusion.

Entity Alignment Knowledge Graphs

Guiding Neural Entity Alignment with Compatibility

1 code implementation 29 Nov 2022 Bing Liu, Harrisen Scells, Wen Hua, Guido Zuccon, Genghong Zhao, Xia Zhang

Making compatible predictions thus should be one of the goals of training an EA model along with fitting the labelled data: this aspect however is neglected in current methods.

Entity Alignment Knowledge Graphs

Continual Learning of Natural Language Processing Tasks: A Survey

1 code implementation 23 Nov 2022 Zixuan Ke, Bing Liu

Continual learning (CL) is a learning paradigm that emulates the human capability of learning and accumulating knowledge continually without forgetting the previously learned knowledge and also transferring the learned knowledge to help learn new tasks better.

Continual Learning Transfer Learning

Lifelong and Continual Learning Dialogue Systems

no code implementations 12 Nov 2022 Sahisnu Mazumder, Bing Liu

This book introduces the new paradigm of lifelong learning dialogue systems to endow chatbots with the ability to learn continually by themselves through their own self-initiated interactions with their users and working environments to improve themselves.

Continual Learning

Semantic Novelty Detection and Characterization in Factual Text Involving Named Entities

1 code implementation 31 Oct 2022 Nianzu Ma, Sahisnu Mazumder, Alexander Politowicz, Bing Liu, Eric Robertson, Scott Grigsby

Much of the existing work on text novelty detection has been studied at the topic level, i.e., identifying whether the topic of a document or a sentence is novel or not.

Novelty Detection

Knowledge-Guided Exploration in Deep Reinforcement Learning

no code implementations 26 Oct 2022 Sahisnu Mazumder, Bing Liu, Shuai Wang, Yingxuan Zhu, Xiaotian Yin, Lifeng Liu, Jian Li

This paper proposes a new method to drastically speed up deep reinforcement learning (deep RL) training for problems that have the property of state-action permissibility (SAP).

reinforcement-learning Reinforcement Learning (RL)

Continual Training of Language Models for Few-Shot Learning

3 code implementations 11 Oct 2022 Zixuan Ke, Haowei Lin, Yijia Shao, Hu Xu, Lei Shu, Bing Liu

Recent work on applying large language models (LMs) achieves impressive performance in many NLP applications.

Continual Learning Continual Pretraining +2

High-quality Task Division for Large-scale Entity Alignment

1 code implementation 22 Aug 2022 Bing Liu, Wen Hua, Guido Zuccon, Genghong Zhao, Xia Zhang

To include in the EA subtasks a high proportion of the potential mappings originally present in the large EA task, we devise a counterpart discovery method that exploits the locality principle of the EA task and the power of trained EA models.

Entity Alignment Informativeness +1

A Multi-Head Model for Continual Learning via Out-of-Distribution Replay

3 code implementations 20 Aug 2022 Gyuhak Kim, Zixuan Ke, Bing Liu

Instead of using the saved samples in memory to update the network for previous tasks/classes in the existing approach, MORE leverages the saved samples to build a task specific classifier (adding a new classification head) without updating the network learned for previous tasks/classes.

class-incremental learning Class Incremental Learning +2

TextDCT: Arbitrary-Shaped Text Detection via Discrete Cosine Transform Mask

no code implementations 27 Jun 2022 Yuchen Su, Zhiwen Shao, Yong Zhou, Fanrong Meng, Hancheng Zhu, Bing Liu, Rui Yao

Arbitrary-shaped scene text detection is a challenging task due to the variety of text changes in font, size, color, and orientation.

Scene Text Detection Text Detection

Beyond Opinion Mining: Summarizing Opinions of Customer Reviews

1 code implementation 3 Jun 2022 Reinald Kim Amplayo, Arthur Bražinskas, Yoshi Suhara, Xiaolan Wang, Bing Liu

In this tutorial, we present various aspects of opinion summarization that are useful for researchers and practitioners.

Opinion Mining Text Generation +1

Unsupervised Meta Learning With Multiview Constraints for Hyperspectral Image Small Sample Set Classification

1 code implementation IEEE Transactions on Image Processing 2022 Kuiliang Gao, Bing Liu, Xuchu Yu, and Anzhu Yu

However, the existing methods based on meta learning still need to construct a labeled source data set with several pre-collected HSIs, and must utilize a large number of labeled samples for meta-training, which is actually time-consuming and labor-intensive.

Classification Meta-Learning +1

Open-set Recognition via Augmentation-based Similarity Learning

no code implementations 24 Mar 2022 Sepideh Esmaeilpour, Lei Shu, Bing Liu

In many practical scenarios, this is not the case because there are unknowns or unseen class samples in the test data, which is called the open set scenario, and the unknowns need to be detected.

Open Set Learning

Continual Learning Based on OOD Detection and Task Masking

1 code implementation 17 Mar 2022 Gyuhak Kim, Sepideh Esmaeilpour, Changnan Xiao, Bing Liu

Existing continual learning techniques focus on either task incremental learning (TIL) or class incremental learning (CIL) problem, but not both.

class-incremental learning Class Incremental Learning +2

AI Autonomy: Self-Initiated Open-World Continual Learning and Adaptation

no code implementations 17 Mar 2022 Bing Liu, Sahisnu Mazumder, Eric Robertson, Scott Grigsby

As more and more AI agents are used in practice, it is time to think about how to make these agents fully autonomous so that they can (1) learn by themselves continually in a self-motivated and self-initiated manner rather than being retrained offline periodically on the initiation of human engineers and (2) accommodate or adapt to unexpected or novel circumstances.

Continual Learning

Ensemble Semi-supervised Entity Alignment via Cycle-teaching

1 code implementation 12 Mar 2022 Kexuan Xin, Zequn Sun, Wen Hua, Bing Liu, Wei Hu, Jianfeng Qu, Xiaofang Zhou

We also design a conflict resolution mechanism to resolve the alignment conflict when combining the new alignment of an aligner and that from its teacher.

Entity Alignment Knowledge Graphs

Zero-Shot Aspect-Based Sentiment Analysis

no code implementations 4 Feb 2022 Lei Shu, Hu Xu, Bing Liu, Jiahua Chen

Aspect-based sentiment analysis (ABSA) typically requires in-domain annotated data for supervised training/fine-tuning.

Aspect Extraction Natural Language Inference +1

Show, Deconfound and Tell: Image Captioning With Causal Inference

1 code implementation CVPR 2022 Bing Liu, Dong Wang, Xu Yang, Yong Zhou, Rui Yao, Zhiwen Shao, Jiaqi Zhao

In the encoding stage, the IOD is able to disentangle the region-based visual features by deconfounding the visual confounder.

Causal Inference Image Captioning

Continual Learning of a Mixed Sequence of Similar and Dissimilar Tasks

2 code implementations NeurIPS 2020 Zixuan Ke, Bing Liu, Xingchang Huang

To the best of our knowledge, no technique has been proposed to learn a sequence of mixed similar and dissimilar tasks that can deal with forgetting and also transfer knowledge forward and backward.

Continual Learning

Continual Learning with Knowledge Transfer for Sentiment Classification

2 code implementations 18 Dec 2021 Zixuan Ke, Bing Liu, Hao Wang, Lei Shu

In this setting, the CL system learns a sequence of SC tasks incrementally in a neural network, where each task builds a classifier to classify the sentiment of reviews of a particular product category or domain.

Classification Continual Learning +4

CLASSIC: Continual and Contrastive Learning of Aspect Sentiment Classification Tasks

1 code implementation EMNLP 2021 Zixuan Ke, Bing Liu, Hu Xu, Lei Shu

The key novelty is a contrastive continual learning method that enables both knowledge transfer across tasks and knowledge distillation from old tasks to the new task, which eliminates the need for task ids in testing.

Classification Continual Learning +6

Achieving Forgetting Prevention and Knowledge Transfer in Continual Learning

1 code implementation NeurIPS 2021 Zixuan Ke, Bing Liu, Nianzu Ma, Hu Xu, Lei Shu

Although several papers have tried to deal with both CF and KT, our experiments show that they suffer from serious CF when the tasks do not have much shared knowledge.

Continual Learning Language Modelling +2

Neural Collaborative Graph Machines for Table Structure Recognition

no code implementations CVPR 2022 Hao Liu, Xin Li, Bing Liu, Deqiang Jiang, Yinsong Liu, Bo Ren

We also show that the proposed NCGM can modulate collaborative pattern of different modalities conditioned on the context of intra-modality cues, which is vital for diversified table cases.

Table Recognition

Self-Initiated Open World Learning for Autonomous AI Agents

no code implementations 21 Oct 2021 Bing Liu, Eric Robertson, Scott Grigsby, Sahisnu Mazumder

As more and more AI agents are used in practice, it is time to think about how to make these agents fully autonomous so that they can learn by themselves in a self-motivated and self-supervised manner rather than being retrained periodically on the initiation of human engineers using expanded training data.

ActiveEA: Active Learning for Neural Entity Alignment

1 code implementation EMNLP 2021 Bing Liu, Harrisen Scells, Guido Zuccon, Wen Hua, Genghong Zhao

Entity Alignment (EA) aims to match equivalent entities across different Knowledge Graphs (KGs) and is an essential step of KG fusion.

Active Learning Entity Alignment +1

Efficient Out-of-Distribution Detection via CVAE data Generation

no code implementations 29 Sep 2021 Mengyu Wang, Yijia Shao, Haowei Lin, Wenpeng Hu, Bing Liu

Recently, contrastive loss with data augmentation and pseudo class creation has been shown to produce markedly better results for out-of-distribution (OOD) detection than previous methods.

Data Augmentation Out-of-Distribution Detection +1

Continual Learning Using Pseudo-Replay via Latent Space Sampling

no code implementations 29 Sep 2021 Gyuhak Kim, Sepideh Esmaeilpour, Zixuan Ke, Tatsuya Konishi, Bing Liu

PLS is not only simple and efficient but also does not invade data privacy due to the fact that it works in the latent feature space.

class-incremental learning Class Incremental Learning +1
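The latent-space pseudo-replay described in this excerpt can be sketched as follows (an illustrative reconstruction that assumes a Gaussian per class in the latent feature space, not the paper's method): instead of storing raw samples, the class's latent distribution is stored and sampled from, which is why no original data needs to be retained.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_latent_gaussian(features):
    """Estimate a per-class Gaussian over latent features
    (mean vector and covariance matrix)."""
    return features.mean(axis=0), np.cov(features, rowvar=False)

def sample_pseudo_features(mean, cov, n, rng):
    """Draw pseudo-replay samples from the stored latent distribution
    instead of replaying stored raw training data."""
    return rng.multivariate_normal(mean, cov, size=n)

# Toy latent features for one previously learned class (made-up data).
old_feats = rng.normal(loc=2.0, scale=0.5, size=(500, 4))
mean, cov = fit_latent_gaussian(old_feats)
pseudo = sample_pseudo_features(mean, cov, n=100, rng=rng)
```

During training on a new task, `pseudo` would be mixed into the batch to stand in for the old class, which is what keeps the classifier from drifting without storing (or exposing) any original samples.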

Partially Relaxed Masks for Lightweight Knowledge Transfer without Forgetting in Continual Learning

no code implementations 29 Sep 2021 Tatsuya Konishi, Mori Kurokawa, Roberto Legaspi, Chihiro Ono, Zixuan Ke, Gyuhak Kim, Bing Liu

The goal of this work is to endow such systems with the additional ability to transfer knowledge among tasks when the tasks are similar and have shared knowledge to achieve higher accuracy.

Continual Learning Incremental Learning +1

Zero-Shot Dialogue State Tracking via Cross-Task Transfer

1 code implementation EMNLP 2021 Zhaojiang Lin, Bing Liu, Andrea Madotto, Seungwhan Moon, Paul Crook, Zhenpeng Zhou, Zhiguang Wang, Zhou Yu, Eunjoon Cho, Rajen Subba, Pascale Fung

Zero-shot transfer learning for dialogue state tracking (DST) enables us to handle a variety of task-oriented dialogue domains without the expense of collecting in-domain data.

Dialogue State Tracking Question Answering +1

Zero-Shot Out-of-Distribution Detection Based on the Pre-trained Model CLIP

1 code implementation 6 Sep 2021 Sepideh Esmaeilpour, Bing Liu, Eric Robertson, Lei Shu

In an out-of-distribution (OOD) detection problem, samples of known classes (also called in-distribution classes) are used to train a special classifier.

Out-of-Distribution Detection Out of Distribution (OOD) Detection +2
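The zero-shot scoring logic can be sketched as follows, with one-hot placeholder vectors standing in for real CLIP embeddings (no model is loaded here, and the temperature value is a hypothetical choice): cosine similarities between an image embedding and the known-class label embeddings are softmaxed, and one minus the maximum probability serves as the OOD score.

```python
import numpy as np

def ood_score(image_emb, label_embs, temperature=0.1):
    """CLIP-style zero-shot OOD score: cosine-similarity logits against
    known-class label embeddings, softmax, then 1 - max probability.
    A high score means no known class matches well."""
    img = image_emb / np.linalg.norm(image_emb)
    lbl = label_embs / np.linalg.norm(label_embs, axis=1, keepdims=True)
    logits = lbl @ img / temperature
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return 1.0 - probs.max()

# Placeholder "embeddings": three orthogonal known-class directions.
label_embs = np.eye(3, 8)
known = label_embs[0]          # aligned with a known class
unknown = np.ones(8)           # equally (dis)similar to every class
score_known = ood_score(known, label_embs)
score_unknown = ood_score(unknown, label_embs)
# score_known is near 0; score_unknown is much larger.
```

With real CLIP, `label_embs` would come from encoding prompts like "a photo of a {class}" with the text encoder, and `image_emb` from the image encoder.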

Concept-Based Label Embedding via Dynamic Routing for Hierarchical Text Classification

1 code implementation ACL 2021 Xuepeng Wang, Li Zhao, Bing Liu, Tao Chen, Feng Zhang, Di Wang

In this paper, we propose a novel concept-based label embedding method that can explicitly represent the concept and model the sharing mechanism among classes for the hierarchical text classification.

text-classification Text Classification

Leveraging Slot Descriptions for Zero-Shot Cross-Domain Dialogue State Tracking

2 code implementations 10 May 2021 Zhaojiang Lin, Bing Liu, Seungwhan Moon, Paul Crook, Zhenpeng Zhou, Zhiguang Wang, Zhou Yu, Andrea Madotto, Eunjoon Cho, Rajen Subba

Zero-shot cross-domain dialogue state tracking (DST) enables us to handle task-oriented dialogue in unseen domains without the expense of collecting in-domain data.

Dialogue State Tracking Transfer Learning

Learning to Dynamically Select Between Reward Shaping Signals

no code implementations 1 Jan 2021 Alexander Politowicz, Bing Liu

Automatic reward shaping is one approach to solving this problem, using automatic identification and modulation of shaping reward signals that are more informative about how agents should behave in any given scenario to learn and adapt faster.

Reinforcement Learning (RL)

Continual Learning in Task-Oriented Dialogue Systems

1 code implementation EMNLP 2021 Andrea Madotto, Zhaojiang Lin, Zhenpeng Zhou, Seungwhan Moon, Paul Crook, Bing Liu, Zhou Yu, Eunjoon Cho, Zhiguang Wang

Continual learning in task-oriented dialogue systems can allow us to add new domains and functionalities through time without incurring the high cost of a whole system retraining.

Continual Learning Intent Recognition +3

A Deep Reinforcement Learning Approach for Ramp Metering Based on Traffic Video Data

no code implementations 9 Dec 2020 Bing Liu, Yu Tang, Yuxiong Ji, Yu Shen, Yuchuan Du

Ramp metering, which uses traffic signals to regulate vehicle flows from the on-ramps, has been widely implemented to improve vehicle mobility on the freeway.

Reinforcement Learning (RL)

HRN: A Holistic Approach to One Class Learning

1 code implementation NeurIPS 2020 Wenpeng Hu, Mengyu Wang, Qi Qin, Jinwen Ma, Bing Liu

Existing neural network based one-class learning methods mainly use various forms of auto-encoders or GAN style adversarial training to learn a latent representation of the given one class of data.

Anomaly Detection Image Classification

Attention Aware Cost Volume Pyramid Based Multi-view Stereo Network for 3D Reconstruction

1 code implementation 25 Nov 2020 Anzhu Yu, Wenyue Guo, Bing Liu, Xin Chen, Xin Wang, Xuefeng Cao, Bingchuan Jiang

This strategy estimates the depth map at coarsest level, while the depth maps at finer levels are considered as the upsampled depth map from previous level with pixel-wise depth residual.

3D Reconstruction
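The coarse-to-fine step described in this excerpt can be sketched as follows (nearest-neighbour upsampling and the constant residual are illustrative assumptions; in the paper the residual would be predicted by the network at each pyramid level):

```python
import numpy as np

def refine_depth(coarse_depth, residual):
    """One pyramid step: upsample the coarser depth map by a factor of
    two (nearest neighbour) and add the pixel-wise depth residual
    estimated at the finer level."""
    up = coarse_depth.repeat(2, axis=0).repeat(2, axis=1)
    return up + residual

# Toy 2x2 coarse depth map refined to 4x4.
coarse = np.array([[1.0, 2.0],
                   [3.0, 4.0]])
residual = np.full((4, 4), 0.1)  # stands in for the network's prediction
fine = refine_depth(coarse, residual)
# fine[0, 0] == 1.1 and fine[3, 3] == 4.1
```

Iterating this step from the coarsest level up yields the final full-resolution depth map, with each level only having to correct small residual errors.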

Lifelong Knowledge Learning in Rule-based Dialogue Systems

no code implementations 19 Nov 2020 Bing Liu, Chuhe Mei

One of the main weaknesses of current chatbots or dialogue systems is that they do not learn online during conversations after they are deployed.


MP-ResNet: Multi-path Residual Network for the Semantic segmentation of High-Resolution PolSAR Images

1 code implementation 10 Nov 2020 Lei Ding, Kai Zheng, Dong Lin, Yuxing Chen, Bing Liu, Jiansheng Li, Lorenzo Bruzzone

This CNN architecture can be used as a baseline method for future studies on the semantic segmentation of PolSAR images.

Semantic Segmentation

Using the Past Knowledge to Improve Sentiment Classification

no code implementations Findings of the Association for Computational Linguistics 2020 Qi Qin, Wenpeng Hu, Bing Liu

It proposes a new lifelong learning model (called L2PG) that can retain and selectively transfer the knowledge learned in the past to help learn the new task.

Classification Knowledge Distillation +2

Understanding Pre-trained BERT for Aspect-based Sentiment Analysis

2 code implementations COLING 2020 Hu Xu, Lei Shu, Philip S. Yu, Bing Liu

Most features in the representation of an aspect are dedicated to the fine-grained semantics of the domain (or product category) and the aspect itself, instead of carrying summarized opinions from its context.

Aspect-Based Sentiment Analysis (ABSA) Language Modelling +1

Adding Chit-Chat to Enhance Task-Oriented Dialogues

1 code implementation NAACL 2021 Kai Sun, Seungwhan Moon, Paul Crook, Stephen Roller, Becka Silvert, Bing Liu, Zhiguang Wang, Honglei Liu, Eunjoon Cho, Claire Cardie

Existing dialogue corpora and models are typically designed under two disjoint motives: while task-oriented systems focus on achieving functional goals (e.g., booking hotels), open-domain chatbots aim at making socially engaging conversations.

Dialogue Generation Dialogue Understanding +1

NUANCED: Natural Utterance Annotation for Nuanced Conversation with Estimated Distributions

1 code implementation Findings (EMNLP) 2021 Zhiyu Chen, Honglei Liu, Hu Xu, Seungwhan Moon, Hao Zhou, Bing Liu

As there is no clean mapping for a user's free form utterance to an ontology, we first model the user preferences as estimated distributions over the system ontology and map the users' utterances to such distributions.

Dialogue State Tracking

A Knowledge-Driven Approach to Classifying Object and Attribute Coreferences in Opinion Mining

no code implementations Findings of the Association for Computational Linguistics 2020 Jiahua Chen, Shuai Wang, Sahisnu Mazumder, Bing Liu

Classifying and resolving coreferences of objects (e.g., product names) and attributes (e.g., product aspects) in opinionated reviews is crucial for improving the opinion mining performance.

Opinion Mining

Text Classification with Novelty Detection

no code implementations 23 Sep 2020 Qi Qin, Wenpeng Hu, Bing Liu

In this paper, we propose a significantly more effective approach that converts the original problem to a pair-wise matching problem and then outputs how probable two instances belong to the same class.

General Classification Novelty Detection +2
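The pair-wise matching formulation described in this excerpt can be sketched as follows (the distance-kernel matcher and the threshold are hypothetical stand-ins for the trained pair classifier): an instance is novel when no stored exemplar is judged to belong to the same class with sufficient probability.

```python
import numpy as np

def pairwise_novelty(x, exemplars, match_prob, threshold=0.5):
    """Decide whether x is novel by pair-wise matching: if no stored
    exemplar is judged 'same class' with probability >= threshold,
    x is flagged as belonging to a novel class."""
    probs = [match_prob(x, e) for e in exemplars]
    return max(probs) < threshold

# Stand-in matcher: a squared-distance kernel; a trained pair
# classifier would replace this in the actual approach.
match_prob = lambda a, b: float(np.exp(-np.sum((a - b) ** 2)))

exemplars = [np.array([0.0, 0.0]), np.array([1.0, 1.0])]
is_novel_near = pairwise_novelty(np.array([0.1, 0.0]), exemplars, match_prob)
is_novel_far = pairwise_novelty(np.array([5.0, 5.0]), exemplars, match_prob)
# → is_novel_near is False, is_novel_far is True
```

The appeal of the pair-wise reformulation is that the matcher never needs to know the closed set of classes, so unseen classes simply fail to match anything.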

Lifelong Learning Dialogue Systems: Chatbots that Self-Learn On the Job

no code implementations 22 Sep 2020 Bing Liu, Sahisnu Mazumder

Due to the huge amount of manual effort involved, they are difficult to scale and also tend to produce many errors owing to their limited ability to understand natural language and the limited knowledge in their KBs.

Feature Projection for Improved Text Classification

no code implementations ACL 2020 Qi Qin, Wenpeng Hu, Bing Liu

In this paper, we propose a novel angle to further improve this representation learning, i.e., feature projection.

General Classification Representation Learning +4

User Memory Reasoning for Conversational Recommendation

no code implementations COLING 2020 Hu Xu, Seungwhan Moon, Honglei Liu, Pararth Shah, Bing Liu, Philip S. Yu

We study a conversational recommendation model which dynamically manages users' past (offline) preferences and current (online) requests through a structured and cumulative user memory knowledge graph, to allow for natural interactions and accurate recommendations.

DomBERT: Domain-oriented Language Model for Aspect-based Sentiment Analysis

1 code implementation Findings of the Association for Computational Linguistics 2020 Hu Xu, Bing Liu, Lei Shu, Philip S. Yu

This paper focuses on learning domain-oriented language models driven by end tasks, which aims to combine the worlds of both general-purpose language models (such as ELMo and BERT) and domain-specific language understanding.

Aspect-Based Sentiment Analysis (ABSA) Language Modelling

Computational Performance of a Germline Variant Calling Pipeline for Next Generation Sequencing

no code implementations 1 Apr 2020 Jie Liu, Xiaotian Wu, Kai Zhang, Bing Liu, Renyi Bao, Xiao Chen, Yiran Cai, Yiming Shen, Xinjun He, Jun Yan, Weixing Ji

With the booming of next generation sequencing technology and its implementation in clinical practice and life science research, the need for faster and more efficient data analysis methods becomes pressing in the field of sequencing.

A Failure of Aspect Sentiment Classifiers and an Adaptive Re-weighting Solution

1 code implementation 4 Nov 2019 Hu Xu, Bing Liu, Lei Shu, Philip S. Yu

Aspect-based sentiment classification (ASC) is an important task in fine-grained sentiment analysis. Deep supervised ASC approaches typically model this task as a pair-wise classification task that takes an aspect and a sentence containing the aspect and outputs the polarity of the aspect in that sentence.

General Classification Sentiment Analysis +1

Building an Application Independent Natural Language Interface

no code implementations 30 Oct 2019 Sahisnu Mazumder, Bing Liu, Shuai Wang, Sepideh Esmaeilpour

Traditional approaches to building natural language (NL) interfaces typically use a semantic parser to parse the user command and convert it to a logical form, which is then translated to an executable action in an application.

Analyzing the Forgetting Problem in the Pretrain-Finetuning of Dialogue Response Models

no code implementations 16 Oct 2019 Tianxing He, Jun Liu, Kyunghyun Cho, Myle Ott, Bing Liu, James Glass, Fuchun Peng

We find that mix-review effectively regularizes the finetuning process, and the forgetting problem is alleviated to some extent.

Response Generation Text Generation +1

Continual Learning via Principal Components Projection

no code implementations 25 Sep 2019 Gyuhak Kim, Bing Liu

The idea is that in learning a new task, if we can ensure that the gradient updates will only occur in the orthogonal directions to the input vectors of the previous tasks, then the weight updates for learning the new task will not affect the previous tasks.

Continual Learning
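The orthogonal-update idea in this excerpt can be sketched in a few lines (illustrative; `prev_inputs` is a hypothetical matrix of stored input vectors from earlier tasks): projecting the gradient onto the subspace orthogonal to those inputs guarantees the update cannot change the network's outputs on them.

```python
import numpy as np

def project_orthogonal(grad, prev_inputs):
    """Remove from a gradient its component lying in the span of the
    previous tasks' input vectors, so the weight update is orthogonal
    to all of them and leaves their outputs untouched."""
    # Orthonormal basis of the span of the previous inputs (columns).
    q, _ = np.linalg.qr(prev_inputs.T)
    return grad - q @ (q.T @ grad)

# Toy example: one stored input direction, one new-task gradient.
prev_inputs = np.array([[1.0, 0.0, 0.0]])  # earlier task saw this direction
grad = np.array([0.5, 0.2, -0.3])
g = project_orthogonal(grad, prev_inputs)
# → [0.0, 0.2, -0.3]: the component along the old input is removed.
```

Since a linear layer's output on an old input `x` changes by `(delta_W) x`, and the projected update is orthogonal to `x`, that change is exactly zero, which is the mechanism the excerpt describes.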

Learning from Positive and Unlabeled Data with Adversarial Training

no code implementations 25 Sep 2019 Wenpeng Hu, Ran Le, Bing Liu, Feng Ji, Haiqing Chen, Dongyan Zhao, Jinwen Ma, Rui Yan

Positive-unlabeled (PU) learning learns a binary classifier using only positive and unlabeled examples without labeled negative examples.

Learning with Noisy Labels for Sentence-level Sentiment Classification

no code implementations IJCNLP 2019 Hao Wang, Bing Liu, Chaozhuo Li, Yan Yang, Tianrui Li

We propose a novel DNN model called NetAb (as shorthand for convolutional neural Networks with Ab-networks) to handle noisy labels during training.

Classification General Classification +3

Modeling Multi-Action Policy for Task-Oriented Dialogues

1 code implementation IJCNLP 2019 Lei Shu, Hu Xu, Bing Liu, Piero Molino

Dialogue management (DM) plays a key role in the quality of the interaction with the user in a task-oriented dialogue system.

Dialogue Management Management

Flexibly-Structured Model for Task-Oriented Dialogues

1 code implementation WS 2019 Lei Shu, Piero Molino, Mahdi Namazifar, Hu Xu, Bing Liu, Huaixiu Zheng, Gokhan Tur

It is based on a simple and practical yet very effective sequence-to-sequence approach, where language understanding and state tracking tasks are modeled jointly with a structured copy-augmented sequential decoder and a multi-label decoder for each slot.

Task-Oriented Dialogue Systems Text Generation

Lifelong and Interactive Learning of Factual Knowledge in Dialogues

no code implementations WS 2019 Sahisnu Mazumder, Bing Liu, Shuai Wang, Nianzu Ma

Dialogue systems are increasingly using knowledge bases (KBs) storing real-world facts to help generate quality responses.

Forward and Backward Knowledge Transfer for Sentiment Classification

no code implementations 8 Jun 2019 Hao Wang, Bing Liu, Shuai Wang, Nianzu Ma, Yan Yang

That is, it is possible to improve the NB classifier for a task by improving its model parameters directly by using the retained knowledge from other tasks.

Classification General Classification +3

DOER: Dual Cross-Shared RNN for Aspect Term-Polarity Co-Extraction

1 code implementation ACL 2019 Huaishao Luo, Tianrui Li, Bing Liu, Junbo Zhang

This paper focuses on two related subtasks of aspect-based sentiment analysis, namely aspect term extraction and aspect sentiment classification, which we call aspect term-polarity co-extraction.

Aspect-Based Sentiment Analysis (ABSA) Sentiment Classification +1

GSN: A Graph-Structured Network for Multi-Party Dialogues

1 code implementation 31 May 2019 Wenpeng Hu, Zhangming Chan, Bing Liu, Dongyan Zhao, Jinwen Ma, Rui Yan

Existing neural models for dialogue response generation assume that utterances are sequentially organized.

Response Generation

Spectral Perturbation Meets Incomplete Multi-view Data

no code implementations31 May 2019 Hao Wang, Linlin Zong, Bing Liu, Yan Yang, Wei Zhou

In this work, we show a strong link between perturbation risk bounds and incomplete multi-view clustering.

Clustering Incomplete multi-view clustering +1

BERT Post-Training for Review Reading Comprehension and Aspect-based Sentiment Analysis

1 code implementation NAACL 2019 Hu Xu, Bing Liu, Lei Shu, Philip S. Yu

Since ReviewRC has limited training examples for RRC (and also for aspect-based sentiment analysis), we then explore a novel post-training approach on the popular language model BERT to enhance the performance of fine-tuning of BERT for RRC.

Aspect Extraction Sentiment Classification

Review Conversational Reading Comprehension

1 code implementation3 Feb 2019 Hu Xu, Bing Liu, Lei Shu, Philip S. Yu

Inspired by conversational reading comprehension (CRC), this paper studies a novel task of leveraging reviews as a source to build an agent that can answer multi-turn questions from potential consumers of online businesses.

Language Modelling Machine Reading Comprehension

Guided Exploration in Deep Reinforcement Learning

no code implementations27 Sep 2018 Sahisnu Mazumder, Bing Liu, Shuai Wang, Yingxuan Zhu, Xiaotian Yin, Lifeng Liu, Jian Li, Yongbing Huang

This paper proposes a new method to drastically speed up deep reinforcement learning (deep RL) training for problems that have the property of state-action permissibility (SAP).

reinforcement-learning Reinforcement Learning (RL)
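The state-action permissibility idea above can be illustrated with a toy sketch (not the paper's algorithm; the mask and function name are hypothetical): actions judged non-permissible in the current state are masked out before the greedy pick, shrinking the effective exploration space:

```python
import numpy as np

def permissible_argmax(q_values, permissible_mask):
    """Action selection under state-action permissibility (SAP):
    non-permissible actions are masked out before the greedy pick."""
    masked = np.where(permissible_mask, q_values, -np.inf)
    return int(np.argmax(masked))

q = np.array([0.9, 0.2, 0.7, 0.4])
mask = np.array([False, True, True, True])   # action 0 judged non-permissible
action = permissible_argmax(q, mask)         # picks the best permissible action
```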

Open-world Learning and Application to Product Classification

1 code implementation17 Sep 2018 Hu Xu, Bing Liu, Lei Shu, P. Yu

Classic supervised learning makes the closed-world assumption, meaning that classes seen in testing must have been seen in training.

Classification General Classification +1

An Attribute Enhanced Domain Adaptive Model for Cold-Start Spam Review Detection

no code implementations COLING 2018 Zhenni You, Tieyun Qian, Bing Liu

With the abundant attributes in existing entities and knowledge in other domains, we successfully solve the problem of data scarcity in the cold-start settings.

Spam detection

Bootstrapping a Neural Conversational Agent with Dialogue Self-Play, Crowdsourcing and On-Line Reinforcement Learning

no code implementations NAACL 2018 Pararth Shah, Dilek Hakkani-Tür, Bing Liu, Gokhan Tür

End-to-end neural models show great promise towards building conversational agents that are trained from data and on-line experience using supervised and reinforcement learning.

Reinforcement Learning (RL)

End-to-End Learning of Task-Oriented Dialogs

no code implementations NAACL 2018 Bing Liu, Ian Lane

In this thesis proposal, we address the limitations of conventional pipeline design of task-oriented dialog systems and propose end-to-end learning solutions.

Multi-Task Learning Spoken Language Understanding

Adversarial Learning of Task-Oriented Neural Dialog Models

no code implementations WS 2018 Bing Liu, Ian Lane

We further discuss the covariate shift problem in online adversarial dialog learning and show how we can address that with partial access to user feedback.

Dialog Learning Reinforcement Learning (RL)

Lifelong Domain Word Embedding via Meta-Learning

1 code implementation25 May 2018 Hu Xu, Bing Liu, Lei Shu, Philip S. Yu

Learning high-quality domain word embeddings is important for achieving good performance in many NLP tasks.

Meta-Learning Word Embeddings

Improving Aspect Term Extraction with Bidirectional Dependency Tree Representation

1 code implementation21 May 2018 Huaishao Luo, Tianrui Li, Bing Liu, Bin Wang, Herwig Unger

The key idea is to explicitly incorporate both representations gained separately from the bottom-up and top-down propagation on the given dependency syntactic tree.

Aspect-Based Sentiment Analysis (ABSA) Term Extraction

Double Embeddings and CNN-based Sequence Labeling for Aspect Extraction

2 code implementations ACL 2018 Hu Xu, Bing Liu, Lei Shu, Philip S. Yu

Unlike other highly sophisticated supervised deep learning models, this paper proposes a novel and yet simple CNN model employing two types of pre-trained embeddings for aspect extraction: general-purpose embeddings and domain-specific embeddings.

Aspect Extraction
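The double-embedding idea above can be sketched as follows (a simplified numpy illustration with made-up dimensions, not the paper's trained model): each token's general-purpose and domain-specific embeddings are concatenated, then a width-3 convolution produces per-token features for aspect tagging:

```python
import numpy as np

def double_embedding_conv(tokens, general_emb, domain_emb, conv_W, conv_b):
    """Concatenate a general-purpose and a domain-specific embedding per
    token, then apply a width-3 1D convolution (same padding) to produce
    per-token features for BIO-style aspect tagging."""
    X = np.stack([np.concatenate([general_emb[t], domain_emb[t]])
                  for t in tokens])                     # (T, d_g + d_d)
    Xp = np.pad(X, ((1, 1), (0, 0)))                    # same-length output
    windows = np.stack([Xp[i:i + 3].ravel() for i in range(len(tokens))])
    return np.maximum(windows @ conv_W + conv_b, 0.0)   # ReLU features

rng = np.random.default_rng(1)
vocab = ["the", "screen", "is", "great"]
g = {w: rng.standard_normal(4) for w in vocab}          # general embeddings
d = {w: rng.standard_normal(3) for w in vocab}          # domain embeddings
W = rng.standard_normal((3 * 7, 5))                     # window of 3 tokens
b = np.zeros(5)
feats = double_embedding_conv(vocab, g, d, W, b)        # one row per token
```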

Dialogue Learning with Human Teaching and Feedback in End-to-End Trainable Task-Oriented Dialogue Systems

1 code implementation NAACL 2018 Bing Liu, Gokhan Tur, Dilek Hakkani-Tur, Pararth Shah, Larry Heck

To address this challenge, we propose a hybrid imitation and reinforcement learning method, with which a dialogue agent can effectively learn from its interaction with users by learning from human teaching and feedback.

Dialogue State Tracking Imitation Learning +3

Towards a Continuous Knowledge Learning Engine for Chatbots

no code implementations16 Feb 2018 Sahisnu Mazumder, Nianzu Ma, Bing Liu

We model the task as an open-world knowledge base completion problem and propose a novel technique called lifelong interactive learning and inference (LiLi) to solve it.

General Knowledge Knowledge Base Completion

Deep Learning for Sentiment Analysis : A Survey

1 code implementation24 Jan 2018 Lei Zhang, Shuai Wang, Bing Liu

Deep learning has emerged as a powerful machine learning technique that learns multiple layers of representations or features of the data and produces state-of-the-art prediction results.

BIG-bench Machine Learning Sentiment Analysis

Contextual and Position-Aware Factorization Machines for Sentiment Classification

no code implementations18 Jan 2018 Shuai Wang, Mianwei Zhou, Geli Fei, Yi Chang, Bing Liu

While existing machine learning models have achieved great success for sentiment classification, they typically do not explicitly capture sentiment-oriented word interaction, which can lead to poor results for fine-grained analysis at the snippet level (a phrase or sentence).

Classification General Classification +4

Unseen Class Discovery in Open-world Classification

1 code implementation ICLR 2018 Lei Shu, Hu Xu, Bing Liu

It is reasonable to assume that this knowledge can be transferred to the rejected examples and used to discover the hidden unseen classes in them.

Classification Clustering +1

Lifelong Learning for Sentiment Classification

no code implementations IJCNLP 2015 Zhiyuan Chen, Nianzu Ma, Bing Liu

This paper proposes a novel lifelong learning (LL) approach to sentiment classification.

Bayesian Optimization Classification +3

Lifelong Word Embedding via Meta-Learning

no code implementations ICLR 2018 Hu Xu, Bing Liu, Lei Shu, Philip S. Yu

We observe that domains are not isolated and a small domain corpus can leverage the learned knowledge from many past domains to augment that corpus in order to generate high-quality embeddings.

Meta-Learning Word Embeddings

Topic-Based Question Generation

no code implementations ICLR 2018 Wenpeng Hu, Bing Liu, Rui Yan, Dongyan Zhao, Jinwen Ma

In the paper, we propose a new question generation problem, which also requires the input of a target topic in addition to a piece of descriptive text.

Chatbot Descriptive +3

Context-aware Path Ranking for Knowledge Base Completion

no code implementations20 Dec 2017 Sahisnu Mazumder, Bing Liu

PR algorithms enumerate paths between entity pairs in a KB and use those paths as features to train a model for missing fact prediction.

Knowledge Base Completion

Multi-Domain Adversarial Learning for Slot Filling in Spoken Language Understanding

no code implementations30 Nov 2017 Bing Liu, Ian Lane

A model that produces such shared representations can be combined with models trained on individual domain SLU data to reduce the amount of training samples required for developing a new domain.

slot-filling Slot Filling +1

End-to-End Optimization of Task-Oriented Dialogue Model with Deep Reinforcement Learning

no code implementations29 Nov 2017 Bing Liu, Gokhan Tur, Dilek Hakkani-Tur, Pararth Shah, Larry Heck

We show that deep RL based optimization leads to significant improvement in task success rate and reduction in dialogue length compared to the supervised training model.

reinforcement-learning Reinforcement Learning (RL)

Customized Nonlinear Bandits for Online Response Selection in Neural Conversation Models

no code implementations22 Nov 2017 Bing Liu, Tong Yu, Ian Lane, Ole J. Mengshoel

Moreover, we report encouraging response selection performance of the proposed neural bandit model using the Recall@k metric for a small set of online training samples.

Multi-Armed Bandits Response Generation +2

Reconstruction of Hidden Representation for Robust Feature Extraction

no code implementations8 Oct 2017 Zeng Yu, Tianrui Li, Ning Yu, Yi Pan, Hongmei Chen, Bing Liu

We believe that minimizing the reconstruction error of the hidden representation is more robust than minimizing the Frobenius norm of the Jacobian matrix of the hidden representation.

Denoising Representation Learning
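The contrast drawn above, between penalizing the reconstruction error of the hidden representation and penalizing the Jacobian's Frobenius norm, can be sketched for a one-layer sigmoid autoencoder (illustrative only; the exact loss in the paper may differ, and all weights here are random placeholders):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def hidden_reconstruction_loss(x, W_enc, W_dec):
    """Encode x, decode it, re-encode the reconstruction; the penalty is
    the gap between the original and re-encoded hidden codes."""
    h = sigmoid(W_enc @ x)
    x_rec = sigmoid(W_dec @ h)
    h_rec = sigmoid(W_enc @ x_rec)
    return float(np.sum((h - h_rec) ** 2))

def jacobian_frobenius_penalty(x, W_enc):
    """Contractive-autoencoder baseline: squared Frobenius norm of the
    Jacobian dh/dx for a sigmoid encoder."""
    h = sigmoid(W_enc @ x)
    J = (h * (1 - h))[:, None] * W_enc
    return float(np.sum(J ** 2))

rng = np.random.default_rng(2)
x = rng.standard_normal(6)
W_enc = rng.standard_normal((4, 6)) * 0.1
W_dec = rng.standard_normal((6, 4)) * 0.1
```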

DOC: Deep Open Classification of Text Documents

no code implementations EMNLP 2017 Lei Shu, Hu Xu, Bing Liu

As learning is used increasingly in dynamic open environments where some new/test documents may not belong to any of the training classes, identifying these novel documents during classification presents an important problem.

General Classification text-classification +1
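The open-environment setting described above, where a test document may belong to none of the training classes, is commonly handled with 1-vs-rest sigmoid scores and per-class rejection thresholds. A minimal sketch in that spirit (the threshold values here are placeholders, not the ones the paper fits):

```python
import numpy as np

def doc_predict(logits, thresholds):
    """1-vs-rest open classification: each seen class gets a sigmoid score
    and its own threshold; if no class clears its threshold the document
    is rejected as belonging to an unseen class."""
    probs = 1.0 / (1.0 + np.exp(-np.asarray(logits)))
    accepted = np.flatnonzero(probs >= thresholds)
    if accepted.size == 0:
        return "rejected"                      # novel / unseen class
    return int(accepted[np.argmax(probs[accepted])])

thresholds = np.array([0.6, 0.6, 0.6])         # per-class; placeholders here
seen = doc_predict([2.0, -1.0, 0.1], thresholds)
novel = doc_predict([-2.0, -1.5, -0.5], thresholds)
```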

Iterative Policy Learning in End-to-End Trainable Task-Oriented Neural Dialog Models

no code implementations18 Sep 2017 Bing Liu, Ian Lane

In this paper, we present a deep reinforcement learning (RL) framework for iterative dialog policy optimization in end-to-end task-oriented dialog systems.

Reinforcement Learning (RL)

An End-to-End Trainable Neural Network Model with Belief Tracking for Task-Oriented Dialog

no code implementations20 Aug 2017 Bing Liu, Ian Lane

We present a novel end-to-end trainable neural network model for task-oriented dialog systems.

dialog state tracking

Emotional Chatting Machine: Emotional Conversation Generation with Internal and External Memory

6 code implementations4 Apr 2017 Hao Zhou, Minlie Huang, Tianyang Zhang, Xiaoyan Zhu, Bing Liu

Perception and expression of emotion are key factors to the success of dialogue systems or conversational agents.

Dialog Context Language Modeling with Recurrent Neural Networks

no code implementations15 Jan 2017 Bing Liu, Ian Lane

In this work, we propose contextual language models that incorporate dialog level discourse information into language modeling.

Language Modelling

Supervised Opinion Aspect Extraction by Exploiting Past Extraction Results

no code implementations23 Dec 2016 Lei Shu, Bing Liu, Hu Xu, Annice Kim

When "screen" appears in a review of a new domain (or product), it is likely to be an aspect too.

Aspect Extraction Sentiment Analysis

Joint Online Spoken Language Understanding and Language Modeling with Recurrent Neural Networks

no code implementations WS 2016 Bing Liu, Ian Lane

On SLU tasks, our joint model outperforms the independent task training model by 22.3% on intent detection error rate, with slight degradation on slot filling F1 score.

Benchmarking Intent Detection +4

Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling

6 code implementations6 Sep 2016 Bing Liu, Ian Lane

Attention-based encoder-decoder neural network models have recently shown promising results in machine translation and speech recognition.

intent-classification Intent Classification +3
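The attention mechanism in the encoder-decoder setup above can be sketched as simple dot-product attention over the encoder states (a generic illustration, not the paper's exact scoring function): the resulting context vector can feed both the sequence-level intent classifier and each slot-tagging step:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(encoder_states, query):
    """Dot-product attention over encoder states; returns the attention
    weights and the weighted context vector."""
    weights = softmax(encoder_states @ query)
    return weights, weights @ encoder_states

rng = np.random.default_rng(3)
H = rng.standard_normal((5, 8))        # encoder states, one per token
q = rng.standard_normal(8)             # decoder query at one tagging step
w, context = attend(H, q)              # w sums to 1; context is (8,)
```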
