Search Results for author: Tianyu Liu

Found 43 papers, 19 papers with code

Semi-supervised Implicit Scene Completion from Sparse LiDAR

1 code implementation 29 Nov 2021 Pengfei Li, Yongliang Shi, Tianyu Liu, Hao Zhao, Guyue Zhou, Ya-Qin Zhang

Recent advances show that semi-supervised implicit representation learning can be achieved through physical constraints like Eikonal equations.

Representation Learning
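For readers unfamiliar with Eikonal-style constraints, here is a minimal Python/PyTorch sketch of how such a physical constraint is commonly imposed on an implicit signed-distance network, by penalizing deviations of the gradient norm from 1 at sampled points. It is a generic illustration under assumed names and shapes, not this paper's implementation.

import torch

def eikonal_loss(model, points):
    # Generic Eikonal regularizer sketch: encourage ||grad_x f(x)|| = 1 so
    # that the implicit function f behaves like a signed distance field.
    # points: (N, 3) sample locations; model maps (N, 3) -> (N, 1).
    points = points.clone().requires_grad_(True)
    sdf = model(points)
    grad = torch.autograd.grad(
        outputs=sdf, inputs=points,
        grad_outputs=torch.ones_like(sdf),
        create_graph=True)[0]
    return ((grad.norm(dim=-1) - 1.0) ** 2).mean()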

Cerberus Transformer: Joint Semantic, Affordance and Attribute Parsing

1 code implementation 24 Nov 2021 Xiaoxue Chen, Tianyu Liu, Hao Zhao, Guyue Zhou, Ya-Qin Zhang

Multi-task indoor scene understanding is widely considered as an intriguing formulation, as the affinity of different tasks may lead to improved performance.

Indoor Scene Understanding, Scene Understanding

Hierarchical Curriculum Learning for AMR Parsing

no code implementations 15 Oct 2021 Peiyi Wang, Liang Chen, Tianyu Liu, Baobao Chang, Zhifang Sui

Abstract Meaning Representation (AMR) parsing translates sentences to the semantic representation with a hierarchical structure, which is recently empowered by pretrained encoder-decoder models.

AMR Parsing, Curriculum Learning, +2

An Enhanced Span-based Decomposition Method for Few-Shot Sequence Labeling

no code implementations 27 Sep 2021 Peiyi Wang, Runxin Xu, Tianyu Liu, Qingyu Zhou, Yunbo Cao, Baobao Chang, Zhifang Sui

ESD outperforms previous methods on two popular FSSL benchmarks, FewNERD and SNIPS, and proves more robust in nested and noisy tagging scenarios.

Meta-Learning

Behind the Scenes: An Exploration of Trigger Biases Problem in Few-Shot Event Classification

1 code implementation 29 Aug 2021 Peiyi Wang, Runxin Xu, Tianyu Liu, Damai Dai, Baobao Chang, Zhifang Sui

However, we find that they suffer from trigger biases that reflect the statistical homogeneity between some trigger words and target event types, which we summarize as trigger overlapping and trigger separability.

Explicit Interaction Network for Aspect Sentiment Triplet Extraction

no code implementations 21 Jun 2021 Peiyi Wang, Tianyu Liu, Damai Dai, Runxin Xu, Baobao Chang, Zhifang Sui

The table encoder extracts sentiment at the token-pair level, so that compositional features between targets and opinions can be easily captured.

Aspect Sentiment Triplet Extraction
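As an illustration of the token-pair idea mentioned in this entry, the sketch below builds a simple pair-feature table from contextual token representations; each cell (i, j) represents the pair (token_i, token_j) and could feed a cell-level sentiment classifier. Shapes and names here are assumptions for illustration, not the authors' architecture.

import torch

def token_pair_table(h):
    # h: (batch, n, d) contextual token representations.
    # Returns a (batch, n, n, 2d) table whose cell (i, j) concatenates the
    # representations of token_i and token_j.
    b, n, d = h.shape
    hi = h.unsqueeze(2).expand(b, n, n, d)  # token_i broadcast along axis j
    hj = h.unsqueeze(1).expand(b, n, n, d)  # token_j broadcast along axis i
    return torch.cat([hi, hj], dim=-1)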

Decompose, Fuse and Generate: A Formation-Informed Method for Chinese Definition Generation

no code implementations NAACL 2021 Hua Zheng, Damai Dai, Lei Li, Tianyu Liu, Zhifang Sui, Baobao Chang, Yang Liu

In this paper, we tackle the task of Definition Generation (DG) in Chinese, which aims at automatically generating a definition for a word.

Document-level Event Extraction via Heterogeneous Graph-based Interaction Model with a Tracker

1 code implementation ACL 2021 Runxin Xu, Tianyu Liu, Lei Li, Baobao Chang

Existing methods are not effective due to two challenges of this task: a) the target event arguments are scattered across sentences; b) the correlation among events in a document is non-trivial to model.

Document-level, Document-level Event Extraction, +1

Premise-based Multimodal Reasoning: A Human-like Cognitive Process

no code implementations 15 May 2021 Qingxiu Dong, Ziwei Qin, Heming Xia, Tian Feng, Shoujie Tong, Haoran Meng, Lin Xu, Tianyu Liu, Zhifang Sui, Weidong Zhan, Sujian Li, Zhongyu Wei

Reasoning is one of the major challenges of Human-like AI and has recently attracted intensive attention from natural language processing (NLP) researchers.

An Intelligent Model for Solving Manpower Scheduling Problems

1 code implementation 7 May 2021 Lingyu Zhang, Tianyu Liu, Yunhai Wang

In addition to the numerical solution of the manpower scheduling problem, this paper also studies the algorithm for scheduling task list generation and the method of displaying scheduling results.

Apply Artificial Neural Network to Solving Manpower Scheduling Problem

1 code implementation 7 May 2021 Tianyu Liu, Lingyu Zhang

Building on existing research, this paper proposes a new model combined with deep learning to solve the multi-shift manpower scheduling problem.

Time Series

Automatic Learning to Detect Concept Drift

no code implementations 4 May 2021 Hang Yu, Tianyu Liu, Jie Lu, Guangquan Zhang

Many methods have been proposed to detect concept drift, i.e., a change in the distribution of streaming data, because concept drift causes a decrease in the prediction accuracy of algorithms.

Active Learning, Meta-Learning

A Token-level Reference-free Hallucination Detection Benchmark for Free-form Text Generation

1 code implementation 18 Apr 2021 Tianyu Liu, Yizhe Zhang, Chris Brockett, Yi Mao, Zhifang Sui, Weizhu Chen, Bill Dolan

Large pretrained generative models like GPT-3 often suffer from hallucinating non-existent or incorrect content, which undermines their potential merits in real applications.

Document-level Text Generation

Integrating Fast Regional Optimization into Sampling-based Kinodynamic Planning for Multirotor Flight

1 code implementation 9 Mar 2021 Hongkai Ye, Tianyu Liu, Chao Xu, Fei Gao

For real-time multirotor kinodynamic motion planning, the efficiency of sampling-based methods is usually hindered by difficult-to-sample homotopy classes like narrow passages.

Motion Planning, Robotics

Application of the unified control and detection framework to detecting stealthy integrity cyber-attacks on feedback control systems

no code implementations 27 Feb 2021 Steven X. Ding, Linlin Li, Dong Zhao, Chris Louen, Tianyu Liu

It is demonstrated, in the unified framework of control and detection, that all kernel attacks can be structurally detected when not only the observer-based residual but also control-signal-based residual signals are generated and used for detection.

Towards Faithfulness in Open Domain Table-to-text Generation from an Entity-centric View

1 code implementation 17 Feb 2021 Tianyu Liu, Xin Zheng, Baobao Chang, Zhifang Sui

In open domain table-to-text generation, we notice that the unfaithful generation usually contains hallucinated content which can not be aligned to any input table record.

Few-Shot Learning, Table-to-Text Generation

First Target and Opinion then Polarity: Enhancing Target-opinion Correlation for Aspect Sentiment Triplet Extraction

no code implementations 17 Feb 2021 Lianzhe Huang, Peiyi Wang, Sujian Li, Tianyu Liu, Xiaodong Zhang, Zhicong Cheng, Dawei Yin, Houfeng Wang

Aspect Sentiment Triplet Extraction (ASTE) aims to extract triplets from a sentence, including target entities, associated sentiment polarities, and opinion spans which rationalize the polarities.

Aspect Sentiment Triplet Extraction

PAC-Bayes Bounds for Meta-learning with Data-Dependent Prior

1 code implementation 7 Feb 2021 Tianyu Liu, Jie Lu, Zheng Yan, Guangquan Zhang

By leveraging experience from previous tasks, meta-learning algorithms can achieve effective fast adaptation ability when encountering new tasks.

Meta-Learning

An Anchor-Based Automatic Evaluation Metric for Document Summarization

no code implementations COLING 2020 Kexiang Wang, Tianyu Liu, Baobao Chang, Zhifang Sui

The widespread adoption of reference-based automatic evaluation metrics such as ROUGE has promoted the development of document summarization.

Document Summarization

An Empirical Study on Model-agnostic Debiasing Strategies for Robust Natural Language Inference

1 code implementation CoNLL 2020 Tianyu Liu, Xin Zheng, Xiaoan Ding, Baobao Chang, Zhifang Sui

The prior work on natural language inference (NLI) debiasing mainly targets one or a few known biases while not necessarily making the models more robust.

Data Augmentation, Natural Language Inference

Discriminatively-Tuned Generative Classifiers for Robust Natural Language Inference

1 code implementation EMNLP 2020 Xiaoan Ding, Tianyu Liu, Baobao Chang, Zhifang Sui, Kevin Gimpel

We explore training objectives for discriminative fine-tuning of our generative classifiers, showing improvements over log-loss fine-tuning from prior work.

Fine-tuning, Natural Language Inference

An Exploration of Arbitrary-Order Sequence Labeling via Energy-Based Inference Networks

1 code implementation EMNLP 2020 Lifu Tu, Tianyu Liu, Kevin Gimpel

Many tasks in natural language processing involve predicting structured outputs, e.g., sequence labeling, semantic role labeling, parsing, and machine translation.

Machine Translation, Representation Learning, +2

HypoNLI: Exploring the Artificial Patterns of Hypothesis-only Bias in Natural Language Inference

no code implementations LREC 2020 Tianyu Liu, Xin Zheng, Baobao Chang, Zhifang Sui

Many recent studies have shown that for models trained on datasets for natural language inference (NLI), it is possible to make correct predictions by merely looking at the hypothesis while completely ignoring the premise.

Natural Language Inference

Table-to-Text Natural Language Generation with Unseen Schemas

no code implementations 9 Nov 2019 Tianyu Liu, Wei Wei, William Yang Wang

In this paper, we propose the new task of table-to-text NLG with unseen schemas, which specifically aims to test the generalization of NLG for input tables with attribute types that never appear during training.

Text Generation

MAAM: A Morphology-Aware Alignment Model for Unsupervised Bilingual Lexicon Induction

no code implementations ACL 2019 Pengcheng Yang, Fuli Luo, Peng Chen, Tianyu Liu, Xu Sun

The task of unsupervised bilingual lexicon induction (UBLI) aims to induce word translations from monolingual corpora in two languages.

Bilingual Lexicon Induction, Denoising, +2

Learning to Control the Fine-grained Sentiment for Story Ending Generation

no code implementations ACL 2019 Fuli Luo, Damai Dai, Pengcheng Yang, Tianyu Liu, Baobao Chang, Zhifang Sui, Xu Sun

Therefore, we propose a generic and novel framework which consists of a sentiment analyzer and a sentimental generator, respectively addressing the two challenges.

Text Generation

Towards Comprehensive Description Generation from Factual Attribute-value Tables

no code implementations ACL 2019 Tianyu Liu, Fuli Luo, Pengcheng Yang, Wei Wu, Baobao Chang, Zhifang Sui

To relieve these problems, we first propose a force attention (FA) method to encourage the generator to pay more attention to the uncovered attributes, so as to avoid missing potential key attributes.

Towards Improving Neural Named Entity Recognition with Gazetteers

1 code implementation ACL 2019 Tianyu Liu, Jin-Ge Yao, Chin-Yew Lin

Most of the recently proposed neural models for named entity recognition have been purely data-driven, with a strong emphasis on getting rid of the efforts for collecting external resources or designing hand-crafted features.

Ranked #8 on Named Entity Recognition on OntoNotes v5 (English) (using extra training data)

Named Entity Recognition, NER

Enhancing Topic-to-Essay Generation with External Commonsense Knowledge

no code implementations ACL 2019 Pengcheng Yang, Lei Li, Fuli Luo, Tianyu Liu, Xu Sun

Experiments show that with external commonsense knowledge and adversarial training, the generated essays are more novel, diverse, and topic-consistent than existing methods in terms of both automatic and human evaluation.

Concept-To-Text Generation

A Novel Dual-Lidar Calibration Algorithm Using Planar Surfaces

no code implementations 27 Apr 2019 Jianhao Jiao, Qinghai Liao, Yilong Zhu, Tianyu Liu, Yang Yu, Rui Fan, Lujia Wang, Ming Liu

Multiple lidars are widely used on mobile vehicles to render a broad view and enhance the performance of localization and perception systems.

Translation

Phrase-level Self-Attention Networks for Universal Sentence Encoding

no code implementations EMNLP 2018 Wei Wu, Houfeng Wang, Tianyu Liu, Shuming Ma

As a result, the memory consumption can be reduced because the self-attention is performed at the phrase level instead of the sentence level.

Multi-class Classification, Natural Language Inference, +3
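To make the memory argument in this entry concrete: splitting a length-n sentence into phrases of length m and attending only within each phrase shrinks the attention score matrix from n x n to n/m blocks of size m x m, i.e. roughly a factor of n/m less memory. The sketch below is a simplified illustration under assumed shapes, not the paper's model.

import torch
import torch.nn.functional as F

def phrase_self_attention(x, phrase_len):
    # x: (batch, n, d), with n divisible by phrase_len for simplicity.
    # Scaled dot-product self-attention restricted to within each phrase.
    b, n, d = x.shape
    phrases = x.view(b * (n // phrase_len), phrase_len, d)
    scores = torch.bmm(phrases, phrases.transpose(1, 2)) / d ** 0.5
    out = torch.bmm(F.softmax(scores, dim=-1), phrases)
    return out.view(b, n, d)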

Incorporating Glosses into Neural Word Sense Disambiguation

1 code implementation ACL 2018 Fuli Luo, Tianyu Liu, Qiaolin Xia, Baobao Chang, Zhifang Sui

GAS models the semantic relationship between the context and the gloss in an improved memory network framework, which breaks the barriers of the previous supervised methods and knowledge-based methods.

Word Sense Disambiguation

Table-to-text Generation by Structure-aware Seq2seq Learning

3 code implementations 27 Nov 2017 Tianyu Liu, Kexiang Wang, Lei Sha, Baobao Chang, Zhifang Sui

In the decoding phase, a dual attention mechanism, which contains word-level attention and field-level attention, is proposed to model the semantic relevance between the generated description and the table.

Table-to-Text Generation
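The dual attention described in this entry can be read as combining two attention distributions, one over words and one over their fields, when attending to the table. The sketch below shows one plausible way to combine them (multiplicatively, then renormalize); the function names, shapes, and the exact combination rule are assumptions for illustration rather than the paper's implementation.

import torch
import torch.nn.functional as F

def dual_attention(decoder_state, word_keys, field_keys, values):
    # decoder_state: (batch, d); word_keys, field_keys, values: (batch, seq, d).
    # Word-level and field-level scores are combined and renormalized to
    # weight the encoder values when generating the next word.
    query = decoder_state.unsqueeze(-1)                            # (batch, d, 1)
    word_attn = F.softmax(torch.bmm(word_keys, query).squeeze(-1), dim=-1)
    field_attn = F.softmax(torch.bmm(field_keys, query).squeeze(-1), dim=-1)
    combined = word_attn * field_attn
    weights = combined / combined.sum(dim=-1, keepdim=True)
    context = torch.bmm(weights.unsqueeze(1), values).squeeze(1)   # (batch, d)
    return context, weights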

A Soft-label Method for Noise-tolerant Distantly Supervised Relation Extraction

no code implementations EMNLP 2017 Tianyu Liu, Kexiang Wang, Baobao Chang, Zhifang Sui

Distantly supervised relation extraction inevitably suffers from wrong-labeling problems because it heuristically labels relational facts with knowledge bases.

Relation Extraction

Order-Planning Neural Text Generation From Structured Data

1 code implementation 1 Sep 2017 Lei Sha, Lili Mou, Tianyu Liu, Pascal Poupart, Sujian Li, Baobao Chang, Zhifang Sui

Generating texts from structured data (e.g., a table) is important for various natural language processing tasks such as question answering and dialog systems.

Question Answering, Table-to-Text Generation

Towards Time-Aware Knowledge Graph Completion

no code implementations COLING 2016 Tingsong Jiang, Tianyu Liu, Tao Ge, Lei Sha, Baobao Chang, Sujian Li, Zhifang Sui

In this paper, we present a novel time-aware knowledge graph completion model that is able to predict links in a KG using both the existing facts and the temporal information of the facts.

Knowledge Graph Completion, Question Answering, +1
