1 code implementation • 30 May 2024 • Sangyun Lee, Zinan Lin, Giulia Fanti
In this work, we propose improved techniques for training rectified flows, allowing them to compete with knowledge distillation methods even in the low-NFE (number of function evaluations) setting.
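The core rectified-flow recipe that these techniques build on can be sketched as follows. This is a minimal illustrative example, not the paper's improved training procedure: the velocity network is regressed onto the straight-line displacement between a prior sample and a data sample (the helper name `rectified_flow_batch` is hypothetical).

```python
import numpy as np

def rectified_flow_batch(x0, x1, rng):
    """Build one training batch for a rectified flow.

    A velocity network v(x_t, t) is regressed (with MSE) onto the
    straight-line displacement x1 - x0, evaluated at a random time t
    along the linear interpolant x_t = (1 - t) * x0 + t * x1.
    """
    t = rng.uniform(size=(x0.shape[0], 1))
    xt = (1.0 - t) * x0 + t * x1
    target = x1 - x0  # constant along the straight path
    return xt, t, target

# Toy example: prior noise paired with shifted "data" samples.
rng = np.random.default_rng(0)
x0 = rng.standard_normal((4, 2))          # samples from the prior
x1 = rng.standard_normal((4, 2)) + 5.0    # samples from the data
xt, t, target = rectified_flow_batch(x0, x1, rng)
# A network v(xt, t) would be trained with MSE against `target`.
```

Because the regression target is the constant displacement of a straight path, a well-trained velocity field yields nearly straight generative trajectories, which is what makes few-NFE sampling feasible.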
no code implementations • 12 Feb 2024 • Kyungha Kim, Sangyun Lee, Kung-Hsiang Huang, Hou Pong Chan, Manling Li, Heng Ji
Fact-checking research has extensively explored claim verification, but far less the generation of natural-language explanations, which are crucial for user trust.
no code implementations • 2 Oct 2023 • Sangyun Lee, Gayoung Lee, Hyunsu Kim, Junho Kim, Youngjung Uh
We present the Groupwise Diffusion Model (GDM), which divides data into multiple groups and diffuses one group per time interval of the forward diffusion process.
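One way to picture such a groupwise forward process is the sketch below. It is an illustrative assumption, not the paper's exact schedule: dimensions are split into groups, and each group is interpolated toward noise only while `t` lies inside that group's assigned sub-interval, so groups are destroyed one after another rather than all at once (the function name `gdm_forward` and the schedule are hypothetical).

```python
import numpy as np

def gdm_forward(x, t, boundaries=(0.0, 0.5, 1.0), rng=None):
    """Groupwise forward diffusion (illustrative sketch).

    x is split into len(boundaries) - 1 groups along its last axis;
    group k is diffused only while t is inside
    [boundaries[k], boundaries[k + 1]].
    """
    rng = rng or np.random.default_rng(0)
    groups = np.array_split(np.arange(x.shape[-1]), len(boundaries) - 1)
    xt = x.astype(float).copy()
    for k, idx in enumerate(groups):
        lo, hi = boundaries[k], boundaries[k + 1]
        # Local progress of group k: 0 before its interval, 1 after it.
        s = np.clip((t - lo) / (hi - lo), 0.0, 1.0)
        noise = rng.standard_normal(idx.shape)
        xt[..., idx] = np.sqrt(1.0 - s) * x[..., idx] + np.sqrt(s) * noise
    return xt

x = np.ones(4)
xt = gdm_forward(x, t=0.25)  # first group partly noised, second still intact
```

At `t = 0.25` only the first group has begun diffusing; the second group is untouched until `t` enters its interval.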
1 code implementation • 27 Jan 2023 • Sangyun Lee, Beomsu Kim, Jong Chul Ye
Based on the relationship between the forward process and the curvature, here we present an efficient method of training the forward process to minimize the curvature of generative trajectories without any ODE/SDE simulation.
1 code implementation • 16 Jul 2022 • Sangyun Lee, Hyungjin Chung, Jaehyeon Kim, Jong Chul Ye
We further propose a blur diffusion as a special case, where each frequency component of an image is diffused at different speeds.
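Frequency-dependent diffusion speeds can be sketched in the Fourier domain. This is a minimal illustrative example, not the paper's exact formulation: each frequency component of a 1-D signal decays at its own rate, with higher frequencies decaying faster, so the forward process blurs the signal rather than noising all scales uniformly (the heat-equation-like rates and the name `blur_forward` are assumptions).

```python
import numpy as np

def blur_forward(x, t, tau=1.0):
    """Blur-diffusion-style forward step (sketch).

    Each spatial frequency of x decays as exp(-lam * t), with lam
    growing with frequency, so high frequencies are destroyed first.
    """
    freqs = np.fft.rfftfreq(x.shape[-1])
    lam = (2 * np.pi * freqs) ** 2 * tau  # heat-equation-like decay rates
    xf = np.fft.rfft(x)
    return np.fft.irfft(xf * np.exp(-lam * t), n=x.shape[-1])
```

The DC component (frequency zero) has rate zero and never decays, so a constant signal passes through unchanged at any `t`, while oscillatory content is progressively smoothed away.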
1 code implementation • 28 Jun 2022 • Sangyun Lee, Gyojung Gu, Sunghyun Park, Seunghwan Choi, Jaegul Choo
Image-based virtual try-on aims to synthesize an image of a person wearing a given clothing item.
Ranked #2 on Virtual Try-on on VITON-HD
1 code implementation • 26 Jan 2022 • Sangyun Lee, Sewoong Ahn, Kwangjin Yoon
Although the degradation generator is trained to approximate the distribution of LR images given an HR image, previous works relied on the unrealistic assumption that this conditional distribution is a delta function, learning a deterministic mapping from the HR image to a single LR image.
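The difference between a delta-function assumption and a genuinely stochastic degradation can be sketched as follows. This toy example (the helper `sample_lr`, the random 3-tap blur, and the noise ranges are all hypothetical) draws the blur strength and noise level at random per call, so the same HR signal maps to different LR samples, i.e. to a distribution rather than one point.

```python
import numpy as np

def sample_lr(hr, scale=2, rng=None):
    """Draw one LR sample from p(LR | HR) instead of a fixed mapping.

    Blur strength and noise level are sampled per call, so repeated
    calls on the same HR signal yield different LR signals.
    """
    rng = rng or np.random.default_rng()
    w = rng.uniform(0.0, 0.5)                 # random 3-tap blur weight
    kernel = np.array([w, 1.0 - 2 * w, w])
    blurred = np.convolve(hr, kernel, mode="same")
    lr = blurred.reshape(-1, scale).mean(axis=1)  # average-pool downsample
    lr = lr + rng.normal(0.0, rng.uniform(0.0, 0.05), size=lr.shape)
    return lr
```

A deterministic baseline (e.g. fixed bicubic downsampling) would correspond to collapsing this distribution to a single point, which is exactly the delta-function assumption criticized above.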
no code implementations • 23 Dec 2020 • Youngkyoung Bae, Sangyun Lee, Juin Kim, Hawoong Jeong
Another unique feature of the Langevin description is that rotation is maximized at a particular anisotropy, while the stability of the rotation is minimized at a particular anisotropy or mass.
Statistical Mechanics
2 code implementations • 9 Mar 2020 • Dong-Kyum Kim, Youngkyoung Bae, Sangyun Lee, Hawoong Jeong
This Letter presents a neural estimator for entropy production, or NEEP, that estimates entropy production (EP) from trajectories of relevant variables without detailed information on the system dynamics.
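A toy version of such an estimator can be sketched on a biased random walk. This is a hedged sketch, assuming a NEEP-style objective J = E[dS] - E[exp(-dS)] over an antisymmetric ansatz dS(s, s') = theta * (s' - s); the grid search stands in for the paper's neural parameterization and gradient training, and all names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Biased random walk: step +1 with prob p, -1 with prob 1 - p.
p = 0.7
steps = rng.choice([1, -1], size=100_000, p=[p, 1 - p])

def neep_objective(theta, steps):
    """Objective J = E[dS] - E[exp(-dS)] with dS = theta * step,
    which is antisymmetric under reversing the step."""
    ds = theta * steps
    return ds.mean() - np.exp(-ds).mean()

# Maximize J over a grid of theta. For this walk the maximizer
# should recover log(p / (1 - p)), the log-ratio of forward and
# reverse step probabilities.
grid = np.linspace(0.0, 2.0, 201)
best = grid[np.argmax([neep_objective(th, steps) for th in grid])]
ep_rate = (best * steps).mean()  # estimated entropy production per step
```

Note that only the observed trajectory (`steps`) enters the objective; the transition probabilities of the dynamics are never used, which is the sense in which the estimator needs no detailed information on the system.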