Search Results for author: Dongjun Kim

Found 14 papers, 4 papers with code

Refining Generative Process with Discriminator Guidance in Score-based Diffusion Models

no code implementations · 28 Nov 2022 · Dongjun Kim, Yeongmin Kim, Wanmo Kang, Il-Chul Moon

While the success of diffusion models has been witnessed in various domains, only a few works have investigated the variation of the generative process.

Maximum Likelihood Training of Implicit Nonlinear Diffusion Models

1 code implementation · 27 May 2022 · Dongjun Kim, Byeonghu Na, Se Jung Kwon, Dongsoo Lee, Wanmo Kang, Il-Chul Moon

While diverse variations of diffusion models exist, few works have investigated extending the linear diffusion into a nonlinear diffusion process.

Image Generation

Automatic Calibration Framework of Agent-Based Models for Dynamic and Heterogeneous Parameters

no code implementations · 7 Mar 2022 · Dongjun Kim, Tae-Sub Yun, Il-Chul Moon, Jang Won Bae

Agent-based models (ABMs) highlight the importance of simulation validation, such as qualitative face validation and quantitative empirical validation.

High Precision Score-based Diffusion Models

no code implementations · 29 Sep 2021 · Dongjun Kim, Seungjae Shin, Kyungwoo Song, Wanmo Kang, Il-Chul Moon

On the theory side, the difficulty of estimating the high-precision diffusion arises because the data score diverges to $\infty$ as the diffusion time $t \rightarrow 0$.

Image Generation
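The divergence of the score as $t \rightarrow 0$ can be seen in a toy case. The sketch below (illustrative, not the paper's code) assumes the data is a point mass at `x0`, so the perturbed marginal at noise scale `sigma(t)` is Gaussian with score `-(x - x0) / sigma**2`, which blows up as `sigma` shrinks:

```python
# Score of N(x0, sigma^2) evaluated at x; for a point-mass data
# distribution this is the exact perturbed-data score at noise scale sigma.
def gaussian_score(x, x0, sigma):
    return -(x - x0) / sigma**2

# As sigma(t) -> 0 (i.e. t -> 0), the score magnitude at a fixed x != x0
# grows without bound: 0.1, 10.0, 1000.0 for the scales below.
scores = [abs(gaussian_score(0.1, 0.0, s)) for s in (1.0, 0.1, 0.01)]
```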

Maximum Likelihood Training of Parametrized Diffusion Model

no code implementations · 29 Sep 2021 · Dongjun Kim, Byeonghu Na, Se Jung Kwon, Dongsoo Lee, Wanmo Kang, Il-Chul Moon

Specifically, PDM uses a flow to non-linearly transform a data variable into a latent variable, and then applies a linear diffusion process to the transformed latent distribution.

Image Generation
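The two-stage pipeline described in the abstract can be sketched in a few lines. This is a minimal illustration under assumed stand-ins: the `tanh` map plays the role of the learned invertible flow, and a single VP-style Euler step stands in for the linear diffusion; neither is the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def flow(x):
    # Stand-in invertible nonlinearity (PDM would learn this flow).
    return np.tanh(x)

def linear_diffuse(z, beta=0.1):
    # One discretized step of a linear (VP-style) Gaussian diffusion:
    # shrink the latent toward zero and add isotropic noise.
    return np.sqrt(1.0 - beta) * z + np.sqrt(beta) * rng.standard_normal(z.shape)

x = rng.standard_normal(5)   # data variable
z = flow(x)                  # nonlinear transform into latent space
z_t = linear_diffuse(z)      # linear diffusion applied in latent space
```

The point of the factorization is that all nonlinearity lives in the flow, so the diffusion itself stays linear and analytically tractable.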

Soft Truncation: A Universal Training Technique of Score-based Diffusion Model for High Precision Score Estimation

1 code implementation · 10 Jun 2021 · Dongjun Kim, Seungjae Shin, Kyungwoo Song, Wanmo Kang, Il-Chul Moon

This paper provides empirical evidence that this inverse correlation arises because density estimation is dominated by small diffusion times, whereas sample generation mainly depends on large diffusion times.

Ranked #1 on Image Generation on CIFAR-10 (Inception score metric)

Density Estimation · Image Generation
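One way to read the truncation idea from the title: rather than fixing a single smallest diffusion time, randomize the truncation point so training softly covers both small and large diffusion times. The sketch below is an assumed reading, and the log-uniform prior and `eps` range are illustrative choices, not the paper's exact recipe.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_diffusion_times(batch_size, T=1.0, eps_lo=1e-5, eps_hi=1e-1):
    # Soft truncation, loosely: draw the truncation point eps at random
    # (log-uniform here, an assumed prior) instead of hard-coding one
    # smallest diffusion time, then sample training times in [eps, T].
    eps = float(np.exp(rng.uniform(np.log(eps_lo), np.log(eps_hi))))
    return rng.uniform(eps, T, size=batch_size), eps

t, eps = sample_diffusion_times(128)
```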

Neural Posterior Regularization for Likelihood-Free Inference

1 code implementation · 15 Feb 2021 · Dongjun Kim, Kyungwoo Song, Seungjae Shin, Wanmo Kang, Il-Chul Moon, Weonyoung Joo

A simulation is useful when the phenomenon of interest is either expensive to regenerate or irreproducible with the same context.

Bayesian Inference

Generalized Gumbel-Softmax Gradient Estimator for Generic Discrete Random Variables

no code implementations · 1 Jan 2021 · Weonyoung Joo, Dongjun Kim, Seungjae Shin, Il-Chul Moon

Estimating the gradients of stochastic nodes, which enables the gradient descent optimization on neural network parameters, is one of the crucial research questions in the deep generative modeling community.

Topic Models
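For context, the vanilla Gumbel-Softmax relaxation that this work generalizes can be sketched as follows (the standard construction, not this paper's generalized estimator): perturb the logits with Gumbel noise and push them through a temperature-controlled softmax, which yields a relaxed one-hot sample that is differentiable in the logits.

```python
import numpy as np

rng = np.random.default_rng(0)

def gumbel_softmax(logits, tau=0.5):
    # Reparameterized categorical sample: add Gumbel(0, 1) noise to the
    # logits and apply a softmax with temperature tau. Lower tau pushes
    # the output closer to a hard one-hot vector.
    g = -np.log(-np.log(rng.uniform(size=logits.shape)))
    y = (logits + g) / tau
    y = np.exp(y - y.max())   # numerically stable softmax
    return y / y.sum()

sample = gumbel_softmax(np.log(np.array([0.2, 0.3, 0.5])))
```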

Sequential Likelihood-Free Inference with Neural Proposal

1 code implementation · 15 Oct 2020 · Dongjun Kim, Kyungwoo Song, YoonYeong Kim, Yongjin Shin, Wanmo Kang, Il-Chul Moon, Weonyoung Joo

This paper introduces a new sampling approach, called Neural Proposal (NP), of the simulation input that resolves the biased data collection as it guarantees the i.i.d.

Bayesian Inference

Implicit Kernel Attention

no code implementations · 11 Jun 2020 · Kyungwoo Song, Yohan Jung, Dongjun Kim, Il-Chul Moon

For the attention in Transformer and GAT, we derive that the attention is a product of two parts: 1) the RBF kernel to measure the similarity of two instances and 2) the exponential of $L^{2}$ norm to compute the importance of individual instances.

Graph Attention · Node Classification +2
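The stated factorization follows from the identity $q \cdot k = \tfrac{1}{2}(\lVert q \rVert^2 + \lVert k \rVert^2 - \lVert q - k \rVert^2)$, so the exponentiated dot-product attention weight splits into an RBF similarity term and per-instance magnitude terms. A quick numerical check of the identity (illustrative, not the paper's code):

```python
import numpy as np

rng = np.random.default_rng(0)
q, k = rng.standard_normal(4), rng.standard_normal(4)

lhs = np.exp(q @ k)  # unnormalized dot-product attention weight
rbf = np.exp(-np.linalg.norm(q - k) ** 2 / 2)  # RBF kernel: similarity of the pair
mag = np.exp(np.linalg.norm(q) ** 2 / 2) * np.exp(np.linalg.norm(k) ** 2 / 2)  # per-instance importance
rhs = rbf * mag
```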

Reference-Based Sketch Image Colorization using Augmented-Self Reference and Dense Semantic Correspondence

no code implementations · CVPR 2020 · Junsoo Lee, Eungyeup Kim, Yunsung Lee, Dongjun Kim, Jaehyuk Chang, Jaegul Choo

However, it is difficult to prepare a training data set with a sufficient number of semantically meaningful image pairs, along with ground truth for a colored image reflecting a given reference (e.g., coloring a sketch of an originally blue car given a reference green car).

Colorization · Image Colorization +1

Adversarial Likelihood-Free Inference on Black-Box Generator

no code implementations · 13 Apr 2020 · Dongjun Kim, Weonyoung Joo, Seungjae Shin, Kyungwoo Song, Il-Chul Moon

Generative Adversarial Network (GAN) can be viewed as an implicit estimator of a data distribution, and this perspective motivates using the adversarial concept in the true input parameter estimation of black-box generators.

Generalized Gumbel-Softmax Gradient Estimator for Various Discrete Random Variables

no code implementations · 4 Mar 2020 · Weonyoung Joo, Dongjun Kim, Seungjae Shin, Il-Chul Moon

Estimating the gradients of stochastic nodes is one of the crucial research questions in the deep generative modeling community, which enables the gradient descent optimization on neural network parameters.

Automatic Calibration of Dynamic and Heterogeneous Parameters in Agent-based Model

no code implementations · 9 Aug 2019 · Dongjun Kim, Tae-Sub Yun, Il-Chul Moon

Whereas parameter calibration has traditionally been fixed throughout a simulation run, this paper extends static parameter calibration in two dimensions: dynamic calibration and heterogeneous calibration.
