no code implementations • 9 Aug 2019 • Dongjun Kim, Tae-Sub Yun, Il-Chul Moon
While parameter calibration has traditionally been fixed throughout a simulation execution, this paper extends static parameter calibration along two dimensions: dynamic calibration and heterogeneous calibration.
no code implementations • 4 Mar 2020 • Weonyoung Joo, Dongjun Kim, Seungjae Shin, Il-Chul Moon
Stochastic gradient estimators for discrete random variables have been widely explored; for example, the Gumbel-Softmax reparameterization trick applies to Bernoulli and categorical distributions.
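As background for the entry above, the standard Gumbel-Softmax relaxation can be sketched in a few lines. This is a generic illustration of the trick, not the estimator proposed in the paper; the temperature value and category probabilities are arbitrary choices for the demo.

```python
import numpy as np

def gumbel_softmax(logits, tau=0.5, seed=0):
    """Draw one relaxed (continuous) one-hot sample from a categorical
    distribution. Adds i.i.d. Gumbel(0, 1) noise to the logits, then
    applies a temperature-scaled softmax; as tau -> 0 the sample
    approaches a discrete one-hot vector."""
    rng = np.random.default_rng(seed)
    g = -np.log(-np.log(rng.uniform(size=logits.shape)))  # Gumbel(0, 1) noise
    y = (logits + g) / tau
    y = y - y.max()               # subtract max for numerical stability
    e = np.exp(y)
    return e / e.sum()

sample = gumbel_softmax(np.log(np.array([0.7, 0.2, 0.1])))
print(sample)  # a point on the simplex, concentrated near one vertex
```

Because the sample is a differentiable function of the logits, gradients can flow through it, which is what makes the relaxation usable as a stochastic gradient estimator.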
no code implementations • 13 Apr 2020 • Dongjun Kim, Weonyoung Joo, Seungjae Shin, Kyungwoo Song, Il-Chul Moon
Generative Adversarial Network (GAN) can be viewed as an implicit estimator of a data distribution, and this perspective motivates using the adversarial concept in the true input parameter estimation of black-box generators.
no code implementations • CVPR 2020 • Junsoo Lee, Eungyeup Kim, Yunsung Lee, Dongjun Kim, Jaehyuk Chang, Jaegul Choo
However, it is difficult to prepare a training dataset with a sufficient number of semantically meaningful image pairs, together with ground-truth colored images that reflect a given reference (e.g., coloring a sketch of an originally blue car given a reference green car).
no code implementations • 11 Jun 2020 • Kyungwoo Song, Yohan Jung, Dongjun Kim, Il-Chul Moon
For the attention in the Transformer and GAT, we show that the attention weight is a product of two parts: 1) an RBF kernel that measures the similarity of two instances and 2) an exponential of the $L^{2}$ norm that computes the importance of individual instances.
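The factorization stated above follows from the identity $\langle q, k\rangle = -\tfrac{1}{2}\|q-k\|^2 + \tfrac{1}{2}\|q\|^2 + \tfrac{1}{2}\|k\|^2$, so it can be checked numerically. The sketch below ignores the usual $1/\sqrt{d}$ scaling and softmax normalization and verifies only the unnormalized weight.

```python
import numpy as np

rng = np.random.default_rng(0)
q, k = rng.normal(size=3), rng.normal(size=3)  # a query and a key vector

dot = np.exp(q @ k)                                  # unnormalized attention weight
rbf = np.exp(-np.sum((q - k) ** 2) / 2)              # RBF similarity of q and k
importance = np.exp(q @ q / 2) * np.exp(k @ k / 2)   # exponential of L2 norms

# exp(<q, k>) factors exactly into similarity x importance terms.
assert np.isclose(dot, rbf * importance)
```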
1 code implementation • 15 Oct 2020 • Dongjun Kim, Kyungwoo Song, YoonYeong Kim, Yongjin Shin, Wanmo Kang, Il-Chul Moon, Weonyoung Joo
This paper introduces a new sampling approach, called Neural Proposal (NP), of the simulation input that resolves the biased data collection as it guarantees the i.i.d.
1 code implementation • 15 Feb 2021 • Dongjun Kim, Kyungwoo Song, Seungjae Shin, Wanmo Kang, Il-Chul Moon, Weonyoung Joo
A simulation is useful when the phenomenon of interest is either expensive to regenerate or irreproducible with the same context.
1 code implementation • 10 Jun 2021 • Dongjun Kim, Seungjae Shin, Kyungwoo Song, Wanmo Kang, Il-Chul Moon
This paper provides substantial empirical evidence that this inverse correlation arises because density estimation is dominated by small diffusion times, whereas sample generation mainly depends on large diffusion times.
Ranked #2 on Image Generation on CIFAR-10 (Inception score metric)
no code implementations • 29 Sep 2021 • Dongjun Kim, Seungjae Shin, Kyungwoo Song, Wanmo Kang, Il-Chul Moon
From the theory side, the difficulty lies in estimating the diffusion with high precision because the data score diverges to $\infty$ as the diffusion time $t \rightarrow 0$.
no code implementations • 29 Sep 2021 • Dongjun Kim, Byeonghu Na, Se Jung Kwon, Dongsoo Lee, Wanmo Kang, Il-Chul Moon
Specifically, PDM uses the flow to non-linearly transform a data variable into a latent variable, and then applies the diffusion process, with a linear diffusing mechanism, to the transformed latent distribution.
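The two-stage construction described above, a nonlinear flow followed by a linear (VP-style) diffusion in latent space, can be sketched with toy stand-ins. The affine `flow` below is a hypothetical placeholder for a learned normalizing flow, and the `alpha_bar` schedule value is arbitrary; this is not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical invertible flow: an elementwise affine map standing in
# for a learned normalizing flow that maps data to a latent variable.
def flow(x):
    return 2.0 * x + 1.0

def flow_inverse(z):
    return (z - 1.0) / 2.0

# Linear (VP-style) forward diffusion applied in the latent space:
# z_t = sqrt(alpha_bar) * z_0 + sqrt(1 - alpha_bar) * eps.
def diffuse(z0, alpha_bar, eps):
    return np.sqrt(alpha_bar) * z0 + np.sqrt(1.0 - alpha_bar) * eps

x = rng.normal(size=4)
z0 = flow(x)                                              # nonlinear step
zt = diffuse(z0, alpha_bar=0.9, eps=rng.normal(size=4))   # linear diffusion
assert np.allclose(flow_inverse(z0), x)  # the flow is exactly invertible
```

Composing a fixed linear diffusion with a nonlinear change of variables is what yields an effectively nonlinear diffusion over the data variable.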
no code implementations • 7 Mar 2022 • Dongjun Kim, Tae-Sub Yun, Il-Chul Moon, Jang Won Bae
Agent-based models (ABMs) highlight the importance of simulation validation, such as qualitative face validation and quantitative empirical validation.
1 code implementation • 27 May 2022 • Dongjun Kim, Byeonghu Na, Se Jung Kwon, Dongsoo Lee, Wanmo Kang, Il-Chul Moon
Although diverse variants of diffusion models exist, extending the linear diffusion into a nonlinear diffusion process has been investigated by only a few works.
Ranked #4 on Image Generation on CelebA 64x64
2 code implementations • 28 Nov 2022 • Dongjun Kim, Yeongmin Kim, Se Jung Kwon, Wanmo Kang, Il-Chul Moon
In sample generation, we add an auxiliary term to the pre-trained score to deceive the discriminator.
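A common form of such discriminator guidance adds the gradient of the discriminator's log density ratio, $\nabla_x \log\frac{d(x)}{1-d(x)}$, to the pre-trained score. The sketch below uses a toy logistic discriminator $d(x) = \sigma(w^\top x)$, for which that gradient is exactly $w$, and a hypothetical standard-Gaussian score $s(x) = -x$; both are illustrative stand-ins, not the paper's trained networks.

```python
import numpy as np

# Hypothetical pre-trained score of a standard Gaussian: s(x) = -x.
def pretrained_score(x):
    return -x

# Toy discriminator d(x) = sigmoid(w @ x). For this logistic form,
# log(d / (1 - d)) = w @ x, so grad_x log(d / (1 - d)) = w exactly.
w = np.array([0.5, -1.0])

def guided_score(x):
    correction = w                      # gradient of the log density ratio
    return pretrained_score(x) + correction

x = np.array([1.0, 2.0])
print(guided_score(x))  # pretrained score [-1, -2] plus correction [0.5, -1]
```

In practice the correction would be computed by automatic differentiation through a trained discriminator rather than in closed form.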
Ranked #1 on Conditional Image Generation on CIFAR-10
1 code implementation • 1 Oct 2023 • Dongjun Kim, Chieh-Hsin Lai, Wei-Hsiang Liao, Naoki Murata, Yuhta Takida, Toshimitsu Uesaka, Yutong He, Yuki Mitsufuji, Stefano Ermon
Consistency Models (CM) (Song et al., 2023) accelerate score-based diffusion model sampling at the cost of sample quality but lack a natural way to trade off quality for speed.
Ranked #1 on Image Generation on CIFAR-10
no code implementations • 28 Nov 2023 • Yutong He, Naoki Murata, Chieh-Hsin Lai, Yuhta Takida, Toshimitsu Uesaka, Dongjun Kim, Wei-Hsiang Liao, Yuki Mitsufuji, J. Zico Kolter, Ruslan Salakhutdinov, Stefano Ermon
Despite the recent advancements, conditional image generation still faces challenges of cost, generalizability, and the need for task-specific training.
1 code implementation • 2 Mar 2024 • Yeongmin Kim, Byeonghu Na, Minsang Park, JoonHo Jang, Dongjun Kim, Wanmo Kang, Il-Chul Moon
While directly applying it to score-matching is intractable, we discover that using the time-dependent density ratio both for reweighting and score correction can lead to a tractable form of the objective function to regenerate the unbiased data density.