Search Results for author: Yuxuan Song

Found 21 papers, 11 papers with code

MolCRAFT: Structure-Based Drug Design in Continuous Parameter Space

1 code implementation18 Apr 2024 Yanru Qu, Keyue Qiu, Yuxuan Song, Jingjing Gong, Jiawei Han, Mingyue Zheng, Hao Zhou, Wei-Ying Ma

Generative models for structure-based drug design (SBDD) have shown promising results in recent years.

Unified Generative Modeling of 3D Molecules via Bayesian Flow Networks

1 code implementation17 Mar 2024 Yuxuan Song, Jingjing Gong, Yanru Qu, Hao Zhou, Mingyue Zheng, Jingjing Liu, Wei-Ying Ma

Advanced generative models (e.g., diffusion models), derived from simplified continuity assumptions about the data distribution, have shown promising progress but are difficult to apply directly to geometry generation due to the multi-modal and noise-sensitive nature of molecular geometry.

3D Molecule Generation

Equivariant Flow Matching with Hybrid Probability Transport

no code implementations12 Dec 2023 Yuxuan Song, Jingjing Gong, Minkai Xu, Ziyao Cao, Yanyan Lan, Stefano Ermon, Hao Zhou, Wei-Ying Ma

The generation of 3D molecules requires simultaneously deciding the categorical features (atom types) and continuous features (atom coordinates).

Camouflaged Object Detection with Feature Grafting and Distractor Aware

1 code implementation8 Jul 2023 Yuxuan Song, Xinyue Li, Lin Qi

In order to better explore the advantages of the two encoders, we design a cross-attention-based Feature Grafting Module to graft features extracted from Transformer branch into CNN branch, after which the features are aggregated in the Feature Fusion Module.

object-detection +1

Particle Based Stochastic Policy Optimization

no code implementations29 Sep 2021 Qiwei Ye, Yuxuan Song, Chang Liu, Fangyun Wei, Tao Qin, Tie-Yan Liu

Stochastic policies have been widely applied for their good properties in exploration and uncertainty quantification.

MuJoCo Games Offline RL +2

Follow Your Path: a Progressive Method for Knowledge Distillation

no code implementations20 Jul 2021 Wenxian Shi, Yuxuan Song, Hao Zhou, Bohan Li, Lei Li

However, it has been observed that a converged heavy teacher model is strongly constrained for learning a compact student network and could make the optimization subject to poor local optima.

Knowledge Distillation

Learning from deep model via exploring local targets

no code implementations1 Jan 2021 Wenxian Shi, Yuxuan Song, Hao Zhou, Bohan Li, Lei Li

However, it has been observed that a converged heavy teacher model is strongly constrained for learning a compact student network and could make the optimization subject to poor local optima.

Knowledge Distillation

Improving Maximum Likelihood Training for Text Generation with Density Ratio Estimation

no code implementations12 Jul 2020 Yuxuan Song, Ning Miao, Hao Zhou, Lantao Yu, Mingxuan Wang, Lei Li

Auto-regressive sequence generative models trained by Maximum Likelihood Estimation suffer from the exposure bias problem in practical finite-sample scenarios.

Density Ratio Estimation Text Generation

Infomax Neural Joint Source-Channel Coding via Adversarial Bit Flip

1 code implementation3 Apr 2020 Yuxuan Song, Minkai Xu, Lantao Yu, Hao Zhou, Shuo Shao, Yong Yu

In this paper, motivated by the inherent connections between neural joint source-channel coding and discrete representation learning, we propose a novel regularization method called Infomax Adversarial-Bit-Flip (IABF) to improve the stability and robustness of the neural joint source-channel coding scheme.

Representation Learning

Improving Unsupervised Domain Adaptation with Variational Information Bottleneck

no code implementations21 Nov 2019 Yuxuan Song, Lantao Yu, Zhangjie Cao, Zhiming Zhou, Jian Shen, Shuo Shao, Wei-Nan Zhang, Yong Yu

Domain adaptation aims to leverage the supervision signal of the source domain to obtain an accurate model for the target domain, where labels are not available.

Unsupervised Domain Adaptation

Towards Efficient and Unbiased Implementation of Lipschitz Continuity in GANs

1 code implementation2 Apr 2019 Zhiming Zhou, Jian Shen, Yuxuan Song, Wei-Nan Zhang, Yong Yu

Lipschitz continuity has recently become popular in generative adversarial networks (GANs).

Lipschitz Generative Adversarial Nets

1 code implementation15 Feb 2019 Zhiming Zhou, Jiadong Liang, Yuxuan Song, Lantao Yu, Hongwei Wang, Wei-Nan Zhang, Yong Yu, Zhihua Zhang

By contrast, Wasserstein GAN (WGAN), where the discriminative function is restricted to 1-Lipschitz, does not suffer from such a gradient uninformativeness problem.

Informativeness

Guiding the One-to-one Mapping in CycleGAN via Optimal Transport

no code implementations15 Nov 2018 Guansong Lu, Zhiming Zhou, Yuxuan Song, Kan Ren, Yong Yu

CycleGAN is capable of learning a one-to-one mapping between two data distributions without paired examples, achieving the task of unsupervised data translation.

Translation

Understanding the Effectiveness of Lipschitz-Continuity in Generative Adversarial Nets

1 code implementation2 Jul 2018 Zhiming Zhou, Yuxuan Song, Lantao Yu, Hongwei Wang, Jiadong Liang, Wei-Nan Zhang, Zhihua Zhang, Yong Yu

In this paper, we investigate the underlying factor that leads to failure and success in the training of GANs.

