no code implementations • ICML 2020 • Xianggen Liu, Jian Peng, Qiang Liu, Sen Song
Deep generative modeling has achieved many successes for continuous data generation, such as producing realistic images and controlling their properties (e.g., styles).
1 code implementation • 31 Dec 2023 • Wanlin Cai, Yuxuan Liang, Xianggen Liu, Jianshuai Feng, Yuankai Wu
To bridge this gap, this paper introduces MSGNet, an advanced deep learning model designed to capture the varying inter-series correlations across multiple time scales using frequency domain analysis and adaptive graph convolution.
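As a rough illustration of the frequency-domain analysis the abstract mentions (this is an assumed, simplified sketch, not MSGNet's actual code), one can recover a series' dominant time scale from the magnitudes of its discrete Fourier transform:

```python
import math

# Hypothetical sketch: find the dominant period of a series by scanning
# the DFT magnitudes and converting the strongest frequency to a period.
def dominant_period(x):
    n = len(x)
    best_k, best_amp = 1, 0.0
    for k in range(1, n // 2 + 1):  # skip the DC component (k = 0)
        re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        amp = math.hypot(re, im)
        if amp > best_amp:
            best_k, best_amp = k, amp
    return n // best_k  # period length in time steps

# A sine wave with period 8, sampled over 32 steps.
series = [math.sin(2 * math.pi * t / 8) for t in range(32)]
```

In a model like the one described, each detected scale would index a separate graph-convolution branch; here the sketch stops at scale detection.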
no code implementations • 25 Nov 2023 • Haotian Luo, Yixin Liu, Peidong Liu, Xianggen Liu
Therefore, we present vector-quantized prompts as the cues to control the generation of pre-trained models.
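The core operation behind vector-quantized prompts is mapping a continuous vector to its nearest entry in a discrete codebook. A minimal sketch of that lookup (all names and values here are illustrative, not the paper's implementation):

```python
import math

# Hypothetical sketch: quantize a continuous prompt vector to the nearest
# codebook entry (L2 distance); the discrete code then conditions generation.
def quantize(vector, codebook):
    """Return (index, entry) of the closest codebook vector."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    idx = min(range(len(codebook)), key=lambda i: dist(vector, codebook[i]))
    return idx, codebook[idx]

codebook = [[0.0, 0.0], [1.0, 1.0], [0.0, 1.0]]
idx, code = quantize([0.9, 1.2], codebook)  # snaps to codebook entry 1
```

Because the codebook is finite, the quantized prompt acts as a controllable, discrete cue rather than an unconstrained continuous embedding.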
no code implementations • 19 Nov 2023 • Site Mo, Haoxin Wang, Bixiong Li, Songhai Fan, Yuankai Wu, Xianggen Liu
A time series is a special type of sequence data: a sequence of real-valued random variables collected at evenly spaced time intervals.
no code implementations • 19 Sep 2023 • Xianggen Liu, Zhengdong Lu, Lili Mou
Deep learning has largely improved the performance of various natural language processing (NLP) tasks.
no code implementations • 9 May 2023 • Caiyang Yu, Xianggen Liu, Wentao Feng, Chenwei Tang, Jiancheng Lv
Neural Architecture Search (NAS) has emerged as one of the effective methods to design the optimal neural network architecture automatically.
no code implementations • 26 Oct 2021 • Pengyong Li, Jun Wang, Ziliang Li, Yixuan Qiao, Xianggen Liu, Fei Ma, Peng Gao, Sen Song, Guotong Xie
Self-supervised learning has gradually emerged as a powerful technique for graph representation learning.
no code implementations • 1 Oct 2021 • Xianggen Liu, Pengyong Li, Fandong Meng, Hao Zhou, Huasong Zhong, Jie Zhou, Lili Mou, Sen Song
The key idea is to integrate powerful neural networks into metaheuristics (e.g., simulated annealing, SA) to restrict the search space in discrete optimization.
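To make the metaheuristic concrete, here is a minimal simulated-annealing loop over a discrete space. In the paper's setting a neural network would score or propose candidate edits; a toy objective stands in for it here, and all names are illustrative:

```python
import math, random

# Minimal simulated-annealing sketch (illustrative; a learned proposal/
# scoring network would replace the toy objective in the actual method).
def anneal(init, objective, propose, steps=200, t0=1.0):
    state, best = init, init
    for step in range(steps):
        t = t0 * (1 - step / steps) + 1e-9  # linear cooling schedule
        cand = propose(state)
        delta = objective(cand) - objective(state)
        # Always accept improvements; accept worse moves with
        # Boltzmann probability exp(delta / t).
        if delta >= 0 or random.random() < math.exp(delta / t):
            state = cand
        if objective(state) > objective(best):
            best = state
    return best

random.seed(0)
target = [1, 0, 1, 1, 0]
objective = lambda s: sum(a == b for a, b in zip(s, target))  # matches to target

def propose(s):
    i = random.randrange(len(s))          # flip one random bit
    return s[:i] + [1 - s[i]] + s[i + 1:]

best = anneal([0, 0, 0, 0, 0], objective, propose)
```

Restricting `propose` to moves a neural network deems promising is what shrinks the search space in the approach the abstract describes.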
no code implementations • 20 Sep 2021 • Xinke Shen, Xianggen Liu, Xin Hu, Dan Zhang, Sen Song
Contrastive learning was employed to minimize inter-subject differences: the similarity of EEG signal representations across subjects was maximized when they received the same emotional stimuli, in contrast to when they received different ones.
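The contrastive objective described above can be sketched as an InfoNCE-style loss, where two subjects' representations of the same stimulus form a positive pair (this is an illustrative stand-in, not the paper's implementation):

```python
import math

# Illustrative InfoNCE-style contrastive loss: pull together two subjects'
# representations of the SAME stimulus, push apart different stimuli.
def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def contrastive_loss(anchor, positive, negatives, temperature=0.5):
    """-log( exp(sim+/T) / (exp(sim+/T) + sum_i exp(sim-_i/T)) )"""
    pos = math.exp(cosine(anchor, positive) / temperature)
    neg = sum(math.exp(cosine(anchor, n) / temperature) for n in negatives)
    return -math.log(pos / (pos + neg))

# Subject A and B viewing the same clip (aligned vectors) should give a
# lower loss than a mismatched pairing.
same = contrastive_loss([1.0, 0.0], [0.9, 0.1], [[0.0, 1.0]])
diff = contrastive_loss([1.0, 0.0], [0.0, 1.0], [[0.9, 0.1]])
```

Minimizing this loss across many subject pairs is what drives the representations toward subject-invariant emotional features.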
1 code implementation • 4 Nov 2020 • Pengyong Li, Yuquan Li, Chang-Yu Hsieh, Shengyu Zhang, Xianggen Liu, Huanxiang Liu, Sen Song, Xiaojun Yao
These advantages have established TrimNet as a powerful and useful computational tool in solving the challenging problem of molecular representation learning.
Ranked #1 on Drug Discovery on MUV
no code implementations • 28 Aug 2020 • Xianggen Liu, Yunan Luo, Sen Song, Jian Peng
Modeling the effects of mutations on the binding affinity plays a crucial role in protein engineering and drug design.
no code implementations • ACL 2020 • Xianggen Liu, Lili Mou, Fandong Meng, Hao Zhou, Jie Zhou, Sen Song
Unsupervised paraphrase generation is a promising and important research topic in natural language processing.
no code implementations • 22 Sep 2018 • Huasong Zhong, Xianggen Liu, Yihui He, Yuchun Ma
These three primitives (channel shift, address shift, shortcut shift) can reduce inference time on GPU while maintaining prediction accuracy.
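As a rough sketch of what a shift primitive does (assumed semantics, not the paper's kernels): rotating the channel dimension lets adjacent channels exchange information with zero multiply-adds, which is why such primitives are cheap at inference time.

```python
# Hypothetical sketch of a "channel shift" primitive: rotate the channel
# axis by `shift` positions; no arithmetic on the values themselves.
def channel_shift(feature_map, shift=1):
    """feature_map: list of channels, each a list of spatial values."""
    s = shift % len(feature_map)
    if s == 0:
        return list(feature_map)
    return feature_map[-s:] + feature_map[:-s]

x = [[1, 1], [2, 2], [3, 3]]  # 3 channels, 2 spatial positions each
shifted = channel_shift(x)     # channels rotated by one
```

In a real network this would be fused with pointwise convolutions so that cross-channel mixing happens without dedicated shift-layer memory traffic.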
no code implementations • 6 Jul 2018 • Xianggen Liu, Lili Mou, Haotian Cui, Zhengdong Lu, Sen Song
Both the classification result and when to make the classification are part of the decision process, which is controlled by a policy network and trained with reinforcement learning.
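The decision process described above can be sketched as a policy that reads the input token by token and stops as soon as it is confident enough; the label and the stopping time are produced jointly. This toy version replaces the learned policy network with a hand-written scorer (all names are illustrative):

```python
# Toy early-classification sketch (illustrative, not the paper's model):
# read tokens one at a time; stop and emit a label once the scorer's
# confidence crosses a threshold in either direction.
def classify_early(tokens, score_fn, threshold=0.8):
    """Return (label, num_tokens_read). score_fn maps a prefix to a
    probability that the class is positive."""
    p = 0.5
    for t in range(1, len(tokens) + 1):
        p = score_fn(tokens[:t])
        if p >= threshold:          # confident positive: stop early
            return 1, t
        if p <= 1 - threshold:      # confident negative: stop early
            return 0, t
    return (1 if p >= 0.5 else 0), len(tokens)  # forced decision at the end

# Stand-in scorer: fraction of positive cue words seen so far.
cues = {"great", "excellent", "love"}
score = lambda prefix: sum(w in cues for w in prefix) / max(len(prefix), 1)
label, steps = classify_early(["great", "movie", "but", "slow"], score)
```

In the reinforcement-learning formulation, the reward would trade off classification accuracy against the number of tokens read, so the policy learns *when* to stop as well as *what* to predict.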
no code implementations • 17 May 2018 • Haotian Cui, Xianggen Liu, Yanhao Huang
The so-called "fail-delay cut-off" refers to the operation of N-1 backup protection on the backbone network of the system, which lengthens the time required to clear the fault.
no code implementations • ACL 2018 • Zhengdong Lu, Xianggen Liu, Haotian Cui, Yukun Yan, Daqi Zheng
We propose Object-oriented Neural Programming (OONP), a framework for semantically parsing documents in specific domains.