1 code implementation • 2 Mar 2024 • Yeongmin Kim, Byeonghu Na, Minsang Park, JoonHo Jang, Dongjun Kim, Wanmo Kang, Il-Chul Moon
While directly applying importance reweighting to score matching is intractable, we discover that using the time-dependent density ratio both for reweighting and for score correction leads to a tractable objective function that regenerates the unbiased data density.
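The reweighted objective described above can be sketched as follows. This is a minimal illustration, not the authors' exact formulation: `ratio` stands in for the time-dependent density ratio, `score_model` for the learned score network, and a single fixed noise level replaces the full diffusion schedule.

```python
import numpy as np

def weighted_dsm_loss(score_model, x0, t, sigma, ratio, rng):
    """Density-ratio-weighted denoising score matching loss (sketch).

    ratio(x, t) is an assumed per-sample density ratio between the unbiased
    and biased data densities; here it only reweights the loss, whereas the
    full method also uses it to correct the learned score.
    """
    noise = rng.standard_normal(x0.shape)
    xt = x0 + sigma * noise
    target = -noise / sigma            # score of the Gaussian perturbation kernel
    w = ratio(x0, t)                   # importance weight per sample
    per_sample = ((score_model(xt, t) - target) ** 2).sum(axis=-1)
    return (w * per_sample).mean()
```

With a constant ratio of one this reduces to ordinary denoising score matching; a non-uniform ratio up- or down-weights samples so that training on the biased dataset targets the unbiased density.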
1 code implementation • 27 Feb 2024 • Byeonghu Na, Yeongmin Kim, HeeSun Bae, Jung Hyun Lee, Se Jung Kwon, Wanmo Kang, Il-Chul Moon
This paper proposes Transition-aware weighted Denoising Score Matching (TDSM) for training conditional diffusion models with noisy labels, the first study of this problem in the diffusion-model literature.
1 code implementation • Proceedings of the 40th International Conference on Machine Learning 2023 • Yoon-Yeong Kim, Youngjae Cho, JoonHo Jang, Byeonghu Na, Yeongmin Kim, Kyungwoo Song, Wanmo Kang, Il-Chul Moon
Specifically, our proposed method, Sharpness-Aware Active Learning (SAAL), constructs its acquisition function by selecting the unlabeled instances whose perturbed loss is maximal.
2 code implementations • 28 Nov 2022 • Dongjun Kim, Yeongmin Kim, Se Jung Kwon, Wanmo Kang, Il-Chul Moon
In sample generation, we add an auxiliary term to the pre-trained score to deceive the discriminator.
Ranked #1 on Conditional Image Generation on CIFAR-10
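The auxiliary term added to the pre-trained score can be sketched as the gradient of the discriminator's log density ratio. The sketch below is illustrative: `d` is an assumed discriminator mapping a sample to a probability in (0, 1), and the gradient is taken by finite differences for clarity rather than by backpropagation.

```python
import numpy as np

def discriminator_guided_score(score_pretrained, d, x, t, eps=1e-4):
    """Correct a pre-trained score with a discriminator-based term (sketch).

    The corrected score is s(x, t) + grad_x log(d(x, t) / (1 - d(x, t))),
    where d(x, t) separates real from generated samples. A well-trained
    discriminator steers sampling back toward the data density.
    """
    def log_ratio(y):
        p = d(y, t)
        return np.log(p) - np.log(1.0 - p)
    grad = np.zeros_like(x)
    for i in range(x.size):                 # central finite differences
        e = np.zeros_like(x)
        e.flat[i] = eps
        grad.flat[i] = (log_ratio(x + e) - log_ratio(x - e)) / (2.0 * eps)
    return score_pretrained(x, t) + grad
```

For a logistic discriminator d(x) = sigmoid(w·x), the log ratio is exactly w·x, so the correction term recovers w.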
no code implementations • 26 Oct 2022 • Yeongmin Kim, Huiwon Jang, DongKeon Lee, Ho-Jin Choi
Motivated by these observations, we propose AltUB, a simple method that introduces alternating training to update the base distribution of a normalizing flow for anomaly detection.
Ranked #2 on Anomaly Detection on BTAD (using extra training data)
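The alternating base-distribution update can be sketched as below. The moment-matching step is an illustrative stand-in for the paper's actual update rule: on alternate steps, the Gaussian base parameters move toward the statistics of the latents produced by the (frozen) flow.

```python
import numpy as np

def altub_base_update(z_batch, base_mu, base_var, lr=0.1):
    """One alternating update of a normalizing flow's base distribution (sketch).

    z_batch holds latents from the frozen flow; the base Gaussian's mean and
    variance take a damped step toward the batch statistics, so the base
    distribution tracks the latent density instead of staying fixed.
    """
    mu_new = base_mu + lr * (z_batch.mean(axis=0) - base_mu)
    var_new = base_var + lr * (z_batch.var(axis=0) - base_var)
    return mu_new, var_new
```

Training would then alternate: update the flow's weights with the base frozen, then apply this step with the flow frozen.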