no code implementations • 28 Oct 2023 • Boya Zhang, Weijian Luo, Zhihua Zhang
Based on our findings, we propose Purify++, a new diffusion purification algorithm that achieves state-of-the-art purification performance against several adversarial attacks.
1 code implementation • NeurIPS 2023 • Shuchen Xue, Mingyang Yi, Weijian Luo, Shifeng Zhang, Jiacheng Sun, Zhenguo Li, Zhi-Ming Ma
Based on our analysis, we propose SA-Solver, which is an improved efficient stochastic Adams method for solving diffusion SDE to generate data with high quality.
Ranked #11 on Image Generation on ImageNet 512x512
no code implementations • 4 Jul 2023 • Weijian Luo, Hao Jiang, Tianyang Hu, Jiacheng Sun, Zhenguo Li, Zhihua Zhang
In image generation experiments, the proposed DCD is capable of training an energy-based model to generate CelebA $32\times 32$ images, with quality comparable to existing EBMs.
no code implementations • 8 Jun 2023 • Weijian Luo, Boya Zhang, Zhihua Zhang
These benchmarks include sampling from 2D targets, Bayesian inference, and sampling from high-dimensional energy-based models (EBMs).
1 code implementation • NeurIPS 2023 • Weijian Luo, Tianyang Hu, Shifeng Zhang, Jiacheng Sun, Zhenguo Li, Zhihua Zhang
To demonstrate the effectiveness and universality of Diff-Instruct, we consider two scenarios: distilling pre-trained diffusion models and refining existing GAN models.
no code implementations • 9 Apr 2023 • Weijian Luo
Our objective is to provide a comprehensible overview of the modern approaches for distilling DMs, starting with an introduction to DMs and a discussion of the challenges involved in distilling them into neural vector fields.
no code implementations • 3 Feb 2023 • Yasong Feng, Weijian Luo, Yimin Huang, Tianyu Wang
We also apply BLiE to search for noise schedule of diffusion models.
no code implementations • 22 Sep 2020 • Weijian Luo, Yongxian Long
A vital step in solving classification or regression problems is to apply feature engineering and variable selection to data before feeding it into models. One of the most popular feature engineering methods is to discretize a continuous variable at a set of cutting points, a process referred to as binning. Good cutting points are important for improving a model's performance, because well-chosen bins can smooth out noisy variance within a continuous variable's range while preserving useful level information through ordered encodings. However, to the best of our knowledge, most cutting points are still chosen via the researcher's domain knowledge or via naive methods such as equal-width or equal-frequency cutting. In this paper we propose an end-to-end supervised cutting point selection method based on the group and fused lasso, which also performs automatic variable selection. We name our method \textbf{ABM} (automatic binning machine).
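To make the two naive baselines mentioned above concrete, here is a minimal sketch (not the ABM method itself, just the equal-width and equal-frequency cutting it improves upon); the function names and the toy data are our own illustration:

```python
import numpy as np

def equal_width_bins(x, n_bins):
    """Naive equal-width cutting: split the variable's observed range
    into n_bins intervals of identical width; return interior cut points."""
    edges = np.linspace(x.min(), x.max(), n_bins + 1)
    return edges[1:-1]  # endpoints are implied by the data range

def equal_frequency_bins(x, n_bins):
    """Naive equal-frequency cutting: place cut points at quantiles so
    each bin holds roughly the same number of samples."""
    qs = np.linspace(0.0, 1.0, n_bins + 1)[1:-1]
    return np.quantile(x, qs)

# A skewed toy variable: one outlier dominates the range.
x = np.array([0.1, 0.2, 0.3, 0.4, 5.0, 6.0, 7.0, 100.0])
print(equal_width_bins(x, 4))      # equal widths, very unequal counts
print(equal_frequency_bins(x, 4))  # equal counts, very unequal widths

# Either cut set yields an ordered integer encoding of the variable:
codes = np.digitize(x, equal_frequency_bins(x, 4))
print(codes)
```

Neither heuristic uses the response variable, which is precisely the gap a supervised, lasso-based cut selection is meant to close.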