Search Results for author: Yiyang Li

Found 12 papers, 8 papers with code

MCNS: Mining Causal Natural Structures Inside Time Series via A Novel Internal Causality Scheme

no code implementations13 Sep 2023 YuanHao Liu, Dehui Du, Zihan Jiang, Anyan Huang, Yiyang Li

To address these challenges, we propose a novel framework called Mining Causal Natural Structure (MCNS), which is automatic and domain-agnostic and helps to find the causal natural structures inside time series via the internal causality scheme.

Causal Inference, Time Series +1

Pre-training Multi-party Dialogue Models with Latent Discourse Inference

1 code implementation24 May 2023 Yiyang Li, Xinting Huang, Wei Bi, Hai Zhao

Multi-party dialogues are more difficult for models to understand than one-to-one two-party dialogues, since they involve multiple interlocutors, resulting in interweaving reply-to relations and information flows.

EM Pre-training for Multi-party Dialogue Response Generation

1 code implementation21 May 2023 Yiyang Li, Hai Zhao

Dialogue response generation requires an agent to generate a response according to the current dialogue history. Two-party dialogues have been well studied in this regard, while multi-party dialogues remain largely unexplored.

Response Generation

Optimization of Velocity Ramps with Survival Analysis for Intersection Merge-Ins

no code implementations13 Mar 2023 Tim Puphal, Malte Probst, Yiyang Li, Yosuke Sakamoto, Julian Eggert

We consider the problem of correct motion planning for T-intersection merge-ins of arbitrary geometry and vehicle density.

Motion Planning, Survival Analysis

Just ClozE! A Novel Framework for Evaluating the Factual Consistency Faster in Abstractive Summarization

1 code implementation6 Oct 2022 Yiyang Li, Lei LI, Marina Litvak, Natalia Vanetik, Dingxin Hu, Yuze Li, Yanquan Zhou

The issue of factual consistency in abstractive summarization has received extensive attention in recent years, and the evaluation of factual consistency between summary and document has become an important and urgent task.

Abstractive Text Summarization, Language Modelling +2

Semantic-Preserving Adversarial Code Comprehension

1 code implementation COLING 2022 Yiyang Li, Hongqiu Wu, Hai Zhao

Based on the tremendous success of pre-trained language models (PrLMs) for source code comprehension tasks, current literature studies either ways to further improve the performance (generalization) of PrLMs, or their robustness against adversarial attacks.

Motion Prediction for Beating Heart Surgery with GRU

1 code implementation SSRN 2022 Yiyang Li, Bo Yang, Wanruo Zhang, Wenfeng Zheng, Chao Liu

This work aims to predict the 3D coordinates of the point of interest (POI) on the surface of a beating heart during dynamic minimally invasive surgery, which can improve the manoeuvrability of cardiac surgical robots and expand their functions.

motion prediction
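The prediction task above is a sequence-to-point regression: feed a window of past 3D POI coordinates through a GRU and project the final hidden state to the next coordinate. A minimal NumPy sketch of a single-layer GRU in this setup follows; the layer sizes, initialisation, and one-step-ahead formulation are illustrative assumptions, not the paper's exact architecture.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, Wz, Uz, Wr, Ur, Wn, Un):
    """One GRU recurrence step (standard gate equations, biases omitted)."""
    z = sigmoid(Wz @ x + Uz @ h)           # update gate
    r = sigmoid(Wr @ x + Ur @ h)           # reset gate
    n = np.tanh(Wn @ x + Un @ (r * h))     # candidate hidden state
    return (1.0 - z) * n + z * h           # interpolate old and new state

def predict_poi(history, hidden=16, seed=0):
    """Run a randomly initialised GRU over a (T, 3) coordinate history and
    project the final hidden state to a predicted 3D coordinate.
    (Random weights stand in for trained ones -- a shape/flow sketch only.)"""
    rng = np.random.default_rng(seed)
    d = history.shape[1]
    Wz, Wr, Wn = [rng.normal(0, 0.1, (hidden, d)) for _ in range(3)]
    Uz, Ur, Un = [rng.normal(0, 0.1, (hidden, hidden)) for _ in range(3)]
    W_out = rng.normal(0, 0.1, (3, hidden))
    h = np.zeros(hidden)
    for x in history:
        h = gru_step(x, h, Wz, Uz, Wr, Ur, Wn, Un)
    return W_out @ h
```

In practice the weights would be trained on recorded heart-surface trajectories; the sketch only shows how the recurrence consumes the history window.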

Back to the Future: Bidirectional Information Decoupling Network for Multi-turn Dialogue Modeling

1 code implementation18 Apr 2022 Yiyang Li, Hai Zhao, Zhuosheng Zhang

Multi-turn dialogue modeling, a challenging branch of natural language understanding (NLU), aims to build representations for machines to understand human dialogues, providing a solid foundation for multiple downstream tasks.

Natural Language Understanding

Self- and Pseudo-self-supervised Prediction of Speaker and Key-utterance for Multi-party Dialogue Reading Comprehension

1 code implementation Findings (EMNLP) 2021 Yiyang Li, Hai Zhao

Multi-party dialogue machine reading comprehension (MRC) brings tremendous challenges, since it involves multiple speakers in one dialogue, resulting in intricate speaker information flows and noisy dialogue contexts.

Machine Reading Comprehension, Question Answering

Recurrent multiple shared layers in Depth for Neural Machine Translation

no code implementations23 Aug 2021 Guoliang Li, Yiyang Li

Compared to the deep Transformer (20-layer encoder, 6-layer decoder), our model achieves similar performance and inference speed, but uses only 54.72% of the former's parameters.

Machine Translation, Translation
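The parameter saving comes from reusing (recurring) the same layer weights at several depth positions instead of giving every depth its own layer. A back-of-the-envelope sketch of that accounting, with illustrative dimensions (d_model=512, d_ff=2048) and an assumed sharing ratio rather than the paper's exact configuration:

```python
def layer_params(d_model=512, d_ff=2048):
    """Approximate parameter count of one Transformer encoder layer:
    four d_model x d_model attention projections plus a two-matrix
    feed-forward block (biases and layer norms ignored for simplicity)."""
    attn = 4 * d_model * d_model
    ffn = 2 * d_model * d_ff
    return attn + ffn

def stack_params(n_layers, n_unique):
    """A stack applying n_layers layer computations, but storing weights
    for only n_unique distinct layers (each reused n_layers/n_unique times)."""
    return n_unique * layer_params()

deep = stack_params(20, 20)    # vanilla deep encoder: 20 distinct layers
shared = stack_params(20, 5)   # assumed scheme: 5 unique layers, each used 4x
print(shared / deep)           # 0.25 -> 25% of the deep encoder's parameters
```

The abstract's 54.72% figure depends on the paper's specific sharing scheme (and on the decoder and embeddings, which this sketch omits); the point is only that depth in computation no longer has to equal depth in stored weights.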

Residual Tree Aggregation of Layers for Neural Machine Translation

no code implementations19 Jul 2021 Guoliang Li, Yiyang Li

The Transformer uses only the top layer of the encoder and decoder in subsequent processing, which makes it impossible to take advantage of the useful information in the other layers.

Machine Translation, Translation
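The alternative the abstract points at is to aggregate all layer outputs rather than keep only the top one. A minimal sketch of a tree-shaped aggregation over per-layer encoder outputs follows; the pairwise combination rule here (mean of the two children plus a residual from the deeper child) is an assumption for illustration, not the paper's exact operator.

```python
import numpy as np

def residual_tree_aggregate(layers):
    """Combine a list of (T, d) layer outputs bottom-up in a binary tree.
    Each parent node mixes two adjacent children and keeps a residual
    path from the deeper child; an odd layer out is carried up unchanged."""
    nodes = list(layers)
    while len(nodes) > 1:
        merged = []
        for i in range(0, len(nodes) - 1, 2):
            merged.append(0.5 * (nodes[i] + nodes[i + 1]) + nodes[i + 1])
        if len(nodes) % 2:
            merged.append(nodes[-1])
        nodes = merged
    return nodes[0]  # single (T, d) representation of all layers
```

Compared with a flat weighted sum over layers, a tree keeps the number of combination steps logarithmic in depth while still letting every layer contribute to the final representation.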
