no code implementations • 28 Jul 2024 • Tianming Wang, Ke Wei
We study the problem of robust matrix completion (RMC), where the partially observed entries of an underlying low-rank matrix are corrupted by sparse noise.
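For concreteness, here is a minimal NumPy sketch of the RMC setup, together with a toy alternating scheme (hard thresholding for the sparse part, SVD truncation for the low-rank part). The size, rank, mask density, and sparsity level are all hypothetical, and this illustrates the problem shape only, not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 100, 5                                  # hypothetical size and rank

# Ground truth: low-rank L, sparse corruption S, random observation mask.
L = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
S = np.where(rng.random((n, n)) < 0.05, 10 * rng.standard_normal((n, n)), 0.0)
mask = rng.random((n, n)) < 0.5                # observe ~50% of entries
Y = mask * (L + S)                             # the observed data

# Illustrative alternating scheme: hard-threshold the residual to estimate
# the sparse corruption, then SVD-truncate to rank r for the low-rank part.
L_hat = np.zeros_like(Y)
for _ in range(50):
    residual = mask * (Y - L_hat)
    tau = np.quantile(np.abs(residual[mask]), 0.95)    # keep largest ~5%
    S_hat = residual * (np.abs(residual) > tau)
    filled = np.where(mask, Y - S_hat, L_hat)          # impute unobserved
    U, sv, Vt = np.linalg.svd(filled, full_matrices=False)
    L_hat = (U[:, :r] * sv[:r]) @ Vt[:r]

print("relative error:", np.linalg.norm(L_hat - L) / np.linalg.norm(L))
```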
no code implementations • 16 Oct 2023 • Jirong Yi, Jingchao Gao, Tianming Wang, Xiaodong Wu, Weiyu Xu
We propose an outlier detection approach for reconstructing ground-truth signals modeled by generative models in the presence of sparse outliers.
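As a rough illustration of the idea (not the paper's method), the following PyTorch sketch jointly fits a latent code and an explicit sparse outlier vector, using an l1 penalty as a sparsity surrogate. The generator `G` here is a random frozen MLP standing in for a pretrained generative model, and all sizes and constants are hypothetical.

```python
import torch

# Hypothetical pretrained generator G: latent (dim 8) -> signal (dim 64).
# A random frozen MLP serves purely as a stand-in here.
torch.manual_seed(0)
G = torch.nn.Sequential(
    torch.nn.Linear(8, 32), torch.nn.ReLU(), torch.nn.Linear(32, 64)
)
for p in G.parameters():
    p.requires_grad_(False)

# Corrupted measurement: a generator output plus a few large outliers.
z_true = torch.randn(8)
y = G(z_true).clone()
y[[3, 17, 42]] += 10.0                 # sparse outliers

# Jointly fit the latent code z and a sparse outlier vector s by
# minimizing ||G(z) + s - y||^2 + lam * ||s||_1.
z = torch.randn(8, requires_grad=True)
s = torch.zeros(64, requires_grad=True)
opt = torch.optim.Adam([z, s], lr=0.05)
lam = 0.5
for step in range(500):
    opt.zero_grad()
    loss = ((G(z) + s - y) ** 2).sum() + lam * s.abs().sum()
    loss.backward()
    opt.step()

# Entries where |s| is large are flagged as outliers.
print("detected outliers:", torch.nonzero(s.abs() > 1.0).flatten().tolist())
```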
no code implementations • 11 Aug 2022 • Lixin Liu, Yanling Wang, Tianming Wang, Dong Guan, Jiawei Wu, Jingxu Chen, Rong Xiao, Wenxiang Zhu, Fei Fang
Therefore, it is crucial to perform cross-domain CTR (click-through rate) prediction, transferring knowledge from large domains to small ones to alleviate the data sparsity issue.
no code implementations • 23 Feb 2022 • Kaige Wang, Tianming Wang, Jianchuang Qu, Huatao Jiang, Qing Li, Lin Chang
First, the gap between low-level vision tasks, represented here by rain removal, and high-level vision tasks, represented by object detection, is significant, so the former can hardly contribute to the latter.
no code implementations • 15 Sep 2021 • Kaige Wang, Long Chen, Tianming Wang, Qixiang Meng, Huatao Jiang, Lin Chang
Perception plays an important role in reliable decision-making for autonomous vehicles.
no code implementations • 29 Jul 2020 • Tianming Wang, Wen-jie Lu, Huan Yu, Dikai Liu
In this paper, we propose a transfer learning framework that adapts a control policy for excessive disturbance rejection of an underwater robot under dynamics model mismatch.
no code implementations • ACL 2020 • Hanqi Jin, Tianming Wang, Xiaojun Wan
In this paper, we propose a multi-granularity interaction network for extractive and abstractive multi-document summarization, which jointly learns semantic representations for words, sentences, and documents.
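A hedged sketch of what multi-granularity representations might look like: word vectors pooled into sentence vectors, sentence vectors into document vectors, with a toy cross-granularity attention. These are stand-ins, not the paper's interaction layers, and all dimensions are hypothetical.

```python
import torch

d = 64
words = torch.randn(3, 10, 6, d)       # (docs, sentences, words, dim)
sentences = words.mean(dim=2)          # pool words  -> (docs, sentences, dim)
documents = sentences.mean(dim=1)      # pool sents  -> (docs, dim)

# Toy cross-granularity interaction: each sentence attends over all
# document representations, producing a document-aware sentence vector.
attn = torch.softmax(sentences @ documents.T / d ** 0.5, dim=-1)  # (3, 10, 3)
sent_ctx = attn @ documents            # (3, 10, d)
print(sent_ctx.shape)
```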
no code implementations • ACL 2020 • Shaowei Yao, Tianming Wang, Xiaojun Wan
Graph-to-sequence (Graph2Seq) learning aims to transduce graph-structured representations into word sequences for text generation.
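To make the setting concrete, here is a minimal, hypothetical Graph2Seq sketch in PyTorch: one message-passing step over a toy graph, then a greedy decoder attending over the node states. It illustrates the task shape only, not the paper's model; the graph and all sizes are invented.

```python
import torch

d, n_nodes, vocab = 32, 4, 100
adj = torch.tensor([[0, 1, 1, 0],
                    [1, 0, 0, 1],
                    [1, 0, 0, 1],
                    [0, 1, 1, 0]], dtype=torch.float)   # toy adjacency
x = torch.randn(n_nodes, d)                        # initial node features
W = torch.nn.Linear(d, d)
h = torch.relu(W(adj @ x) + x)                     # one message-passing step

dec = torch.nn.GRUCell(d, d)
out = torch.nn.Linear(d, vocab)
state = h.mean(dim=0)                              # decoder init: graph summary
tokens = []
inp = torch.zeros(d)
for _ in range(5):                                 # generate 5 greedy tokens
    state = dec(inp.unsqueeze(0), state.unsqueeze(0)).squeeze(0)
    attn = torch.softmax(h @ state / d ** 0.5, dim=0)   # attend over nodes
    ctx = attn @ h
    tokens.append(out(ctx).argmax().item())
    inp = ctx
print(tokens)
```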
no code implementations • TACL 2020 • Tianming Wang, Xiaojun Wan, Hanqi Jin
Abstract meaning representation (AMR)-to-text generation is the challenging task of generating natural language texts from AMR graphs, where nodes represent concepts and edges denote relations.
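For example, the AMR graph for "The boy wants to go", in standard PENMAN notation, where `w`, `b`, and `g` are concept nodes and `:ARG0`/`:ARG1` are relation edges (note the reentrant `b`, which makes this a graph rather than a tree):

```
(w / want-01
   :ARG0 (b / boy)
   :ARG1 (g / go-02
            :ARG0 b))
```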
2 code implementations • 13 Oct 2019 • HanQin Cai, Jian-Feng Cai, Tianming Wang, Guojian Yin
We study the robust recovery problem for spectrally sparse signals in the fully observed setting: recovering $\boldsymbol{x}$ and a sparse corruption vector $\boldsymbol{s}$ from their sum $\boldsymbol{z}=\boldsymbol{x}+\boldsymbol{s}$.
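A minimal NumPy sketch of this setting: a spectrally sparse signal (a superposition of a few complex sinusoids) admits a (near-)low-rank Hankel lift, so it can be separated from sparse corruption by rank truncation. The single projection step below is illustrative only, not the authors' accelerated algorithm; the sizes and threshold are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
n, r = 64, 3                           # signal length, number of sinusoids

# Spectrally sparse x: a superposition of r complex sinusoids.
t = np.arange(n)
x = sum(rng.standard_normal() * np.exp(2j * np.pi * f * t)
        for f in rng.random(r))
s = np.zeros(n, dtype=complex)         # sparse corruption on 4 entries
idx = rng.choice(n, size=4, replace=False)
s[idx] = 5 * (rng.standard_normal(4) + 1j * rng.standard_normal(4))
z = x + s                              # the fully observed sum

# Such an x lifts to a (near-)rank-r Hankel matrix; one illustrative
# rank-truncation step:
p = n // 2
H = np.array([z[i:i + p] for i in range(n - p + 1)]).T   # H[a, b] = z[a + b]
U, sv, Vt = np.linalg.svd(H, full_matrices=False)
H_r = (U[:, :r] * sv[:r]) @ Vt[:r]     # project onto rank-r matrices

out = np.zeros(n, dtype=complex)       # average anti-diagonals back to a signal
cnt = np.zeros(n)
for a in range(H_r.shape[0]):
    for b in range(H_r.shape[1]):
        out[a + b] += H_r[a, b]
        cnt[a + b] += 1
x_hat = out / cnt

resid = np.abs(z - x_hat)              # corruption should dominate the residual
print("largest residuals at:", np.sort(np.argsort(resid)[-4:]))
print("true corruption at:  ", np.sort(idx))
```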
no code implementations • 10 Jul 2019 • Tianming Wang, Wen-jie Lu, Zheng Yan, Dikai Liu
This paper presents an observer-integrated Reinforcement Learning (RL) approach, called Disturbance OBserver Network (DOB-Net), for robots operating in environments where disturbances are unknown and time-varying, and may frequently exceed robot control capabilities.
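A hedged PyTorch sketch of the observer-integrated idea: a recurrent observer encodes a short history of states and actions into a disturbance embedding, which the policy consumes alongside the current state. All dimensions are hypothetical, and this is not the paper's exact architecture.

```python
import torch

state_dim, act_dim, hist_len, emb_dim = 6, 2, 8, 16   # hypothetical sizes

# Observer: GRU over (state, action) history -> disturbance embedding.
observer = torch.nn.GRU(state_dim + act_dim, emb_dim, batch_first=True)
# Policy: current state + disturbance embedding -> action.
policy = torch.nn.Sequential(
    torch.nn.Linear(state_dim + emb_dim, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, act_dim),
)

history = torch.randn(1, hist_len, state_dim + act_dim)  # (batch, time, feat)
_, h = observer(history)                # final hidden state summarizes history
disturbance_emb = h.squeeze(0)          # (batch, emb_dim)
state = torch.randn(1, state_dim)
action = policy(torch.cat([state, disturbance_emb], dim=-1))
print(action.shape)                     # torch.Size([1, 2])
```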
1 code implementation • International Joint Conference on Artificial Intelligence 2019 • Tianming Wang, Xiaojun Wan
Our model uses shared attention layers for the encoder and decoder, which make the most of contextual clues, and a latent variable for learning the distribution of coherent story plots.
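A hedged sketch of the latent-variable component only (the shared attention layers are omitted): a story context is mapped to a Gaussian posterior, a latent code is sampled via the reparameterization trick, and the decoder conditions on it. All sizes are hypothetical.

```python
import torch

d, z_dim = 64, 16
enc = torch.nn.Linear(d, 2 * z_dim)     # outputs [mu, logvar]
dec = torch.nn.Linear(z_dim + d, d)

context = torch.randn(1, d)             # encoded story context (stand-in)
mu, logvar = enc(context).chunk(2, dim=-1)
z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()   # reparameterize
out = dec(torch.cat([z, context], dim=-1))

# KL term pulling the posterior toward a standard normal prior.
kl = 0.5 * (mu ** 2 + logvar.exp() - logvar - 1).sum()
print(out.shape, kl.item())
```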
no code implementations • 21 Feb 2019 • Chandrajit Bajaj, Tianming Wang
Fusing a low-resolution hyperspectral image (HSI) and a high-resolution multispectral image (MSI) of the same scene leads to a super-resolution image (SRI), which is information-rich both spatially and spectrally.
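The standard degradation model behind such fusion can be sketched in a few lines of NumPy: the HSI is a spatially downsampled SRI, and the MSI is a spectrally downsampled SRI. The dimensions, the block-averaging blur, and the spectral response matrix `P` below are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)

H, W, B = 32, 32, 50                   # SRI: full spatial and spectral res
sri = rng.random((H, W, B))

# Spatial degradation: 4x4 block averaging -> low-res hyperspectral image.
hsi = sri.reshape(H // 4, 4, W // 4, 4, B).mean(axis=(1, 3))   # (8, 8, 50)

# Spectral degradation: a response matrix mapping 50 bands to 4 bands.
P = rng.random((4, B))
P /= P.sum(axis=1, keepdims=True)
msi = sri @ P.T                        # (32, 32, 4) multispectral image

print(hsi.shape, msi.shape)            # fusion seeks the SRI from (hsi, msi)
```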
no code implementations • 26 Oct 2018 • Jirong Yi, Anh Duc Le, Tianming Wang, Xiaodong Wu, Weiyu Xu
In this paper, we propose a generative model neural network approach for reconstructing ground-truth signals under sparse outliers.
no code implementations • COLING 2016 • Jianmin Zhang, Tianming Wang, Xiaojun Wan
PKUSUMSUM is a Java platform for multilingual document summarization; it supports multiple languages, integrates 10 automatic summarization methods, and tackles three typical summarization tasks.