no code implementations • 17 Jul 2024 • Yong-Hyun Park, Sangdoo Yun, Jin-Hwa Kim, Junho Kim, Geonhui Jang, Yonghyun Jeong, Junghyo Jo, Gayoung Lee
In this paper, we propose Direct Unlearning Optimization (DUO), a novel framework for removing Not Safe For Work (NSFW) content from T2I models while preserving their performance on unrelated topics.
no code implementations • 17 Jul 2024 • Yong-Hyun Park, Junghoon Seo, Bomseok Park, Seongsu Lee, Junghyo Jo
Identifying the relevant input features that have a critical influence on the output results is indispensable for the development of explainable artificial intelligence (XAI).
Explainable Artificial Intelligence (XAI)
no code implementations • 2 Apr 2024 • Juno Hwang, Yong-Hyun Park, Junghyo Jo
We demonstrate that upsample guidance can be applied to various models, such as pixel-space, latent space, and video diffusion models.
no code implementations • 7 Dec 2023 • Juno Hwang, Yong-Hyun Park, Junghyo Jo
In this paper, we introduce "resolution chromatography", which indicates the signal generation rate at each resolution — a concept that is very helpful for mathematically explaining the coarse-to-fine behavior of the generation process, understanding the role of the noise schedule, and designing time-dependent modulation.
1 code implementation • 1 Oct 2023 • Hoyun Choi, Sungyeop Lee, B. Kahng, Junghyo Jo
Neural networks have proven to be efficient surrogate models for tackling partial differential equations (PDEs).
1 code implementation • NeurIPS 2023 • Yong-Hyun Park, Mingi Kwon, Jaewoong Choi, Junghyo Jo, Youngjung Uh
Remarkably, our discovered local latent basis enables image editing by moving $\mathbf{x}_t$, the latent variable of DMs, along the basis vectors at specific timesteps.
no code implementations • 10 Mar 2023 • Gilhan Kim, Hojun Lee, Junghyo Jo, Yongjoo Baek
In this study, we propose that unsupervised learning generally exhibits a two-component tradeoff of the GE, namely the model error and the data error -- using a more complex model reduces the model error at the cost of the data error, with the data error playing a more significant role for a smaller training dataset.
no code implementations • 24 Feb 2023 • Yong-Hyun Park, Mingi Kwon, Junghyo Jo, Youngjung Uh
Despite the success of diffusion models (DMs), we still lack a thorough understanding of their latent space.
no code implementations • 29 Nov 2022 • Hyungjoon Soh, Dongyeob Kim, Juno Hwang, Junghyo Jo
Mirror descent is an elegant optimization technique that leverages a dual space of parametric models to perform gradient descent.
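As a minimal illustration of the dual-space idea, the sketch below runs mirror descent with the negative-entropy mirror map (the exponentiated-gradient update), which keeps iterates on the probability simplex; the linear cost vector is a hypothetical toy objective, not the paper's setup.

```python
import math

def mirror_descent_simplex(grad, x0, lr=0.5, steps=300):
    """Exponentiated-gradient update: x_i <- x_i * exp(-lr * g_i), renormalized.

    This is gradient descent performed in the dual (log) space induced by
    the negative-entropy mirror map.
    """
    x = list(x0)
    for _ in range(steps):
        g = grad(x)
        w = [xi * math.exp(-lr * gi) for xi, gi in zip(x, g)]
        z = sum(w)
        x = [wi / z for wi in w]
    return x

# Toy example: minimize f(x) = sum(c_i * x_i) over the simplex;
# the minimizer puts all mass on the smallest-cost coordinate.
c = [3.0, 1.0, 2.0]
x = mirror_descent_simplex(lambda x: c, [1/3, 1/3, 1/3])
```

Because the update is multiplicative, the iterates stay strictly positive and sum to one at every step, with no projection needed.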
1 code implementation • 7 Sep 2021 • Sungyeop Lee, Junghyo Jo
We observe that the frequency of internal codes or labels follows power laws in both supervised and unsupervised learning models.
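A power-law rank-frequency relation of the kind described above can be checked by a log-log fit: for frequencies $f_r \propto r^{-\alpha}$, the slope of $\log f$ versus $\log r$ recovers $-\alpha$. The sketch below uses exact synthetic frequencies with $\alpha = 1$, not the paper's learned codes.

```python
import math

# Synthetic Zipf-like frequencies: f_r = 1/r for ranks 1..100.
ranks = list(range(1, 101))
freqs = [1.0 / r for r in ranks]

# Least-squares slope of log(f) vs log(r); should be -alpha = -1.
xs = [math.log(r) for r in ranks]
ys = [math.log(f) for f in freqs]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
```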
1 code implementation • 15 Feb 2021 • Sungyeop Lee, Junghyo Jo
Thus, we conclude that the compression phase is not necessary for generalization in representation learning.
no code implementations • 28 Jan 2021 • Sangwon Lee, Vipul Periwal, Junghyo Jo
At the initial iteration of the EM algorithm, the model inference shows better model-data consistency with observed data points than with missing data points.
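A stripped-down sketch of the EM loop for missing data: the E-step imputes missing entries with the current parameter estimate, and the M-step re-estimates the parameter from the completed data. The toy dataset and the mean-only model are hypothetical simplifications of the paper's inference setting.

```python
# Gaussian-mean estimation with missing entries (None).
data = [1.0, 2.0, None, 3.0, None, 2.5]

mu = 0.0  # initial guess
for _ in range(50):
    filled = [x if x is not None else mu for x in data]  # E-step: impute
    mu = sum(filled) / len(filled)                       # M-step: re-estimate
```

The loop is a contraction, so `mu` converges to the mean of the observed entries (8.5 / 4 = 2.125): imputed points end up exactly at the model's estimate while observed points retain their residuals, mirroring the observed-versus-missing consistency gap noted above.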
no code implementations • 27 Nov 2020 • Juno Hwang, Wonseok Hwang, Junghyo Jo
The restricted Boltzmann machine (RBM) is a representative generative model based on the concept of statistical mechanics.
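The defining property of an RBM is its bipartite structure: given the visible layer, the hidden units are conditionally independent with $p(h_j = 1 \mid v) = \sigma(b_j + \sum_i v_i W_{ij})$. The tiny weight matrix below is hypothetical, purely to show the conditional computation.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical 2-visible x 2-hidden RBM; W[i][j] couples visible i to hidden j.
W = [[0.5, -0.2],
     [0.1,  0.4]]
b_hidden = [0.0, 0.0]

def hidden_probs(v):
    """p(h_j = 1 | v) for each hidden unit j."""
    return [sigmoid(b_hidden[j] + sum(v[i] * W[i][j] for i in range(len(v))))
            for j in range(len(b_hidden))]

p = hidden_probs([1, 1])
```

In training schemes such as contrastive divergence, this conditional is sampled back and forth with its visible-side counterpart to approximate the model's equilibrium statistics.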
no code implementations • 18 May 2020 • Woo Seok Lee, Junghyo Jo, Taegeun Song
Here we apply machine learning (ML) to the diagnosis of early-stage diabetes, which is known to be a challenging task in medicine.
no code implementations • 10 Sep 2019 • Junghyo Jo, Danh-Tai Hoang, Vipul Periwal
Maximum Likelihood Estimation (MLE) is the bread and butter of system inference for stochastic systems.
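As a one-parameter reminder of what MLE does, the sketch below maximizes the Bernoulli log-likelihood $k \log p + (n-k)\log(1-p)$ over a grid; the counts are hypothetical. The maximum lands at the sample mean $k/n$, the closed-form MLE.

```python
import math

def log_likelihood(p, k, n):
    """Bernoulli log-likelihood for k successes out of n trials."""
    return k * math.log(p) + (n - k) * math.log(1 - p)

k, n = 7, 10
grid = [i / 1000 for i in range(1, 1000)]  # p in (0, 1)
p_hat = max(grid, key=lambda p: log_likelihood(p, k, n))
```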
2 code implementations • 14 Jan 2019 • Danh-Tai Hoang, Junghyo Jo, Vipul Periwal
Finally, an important hidden variable problem is to find the number of clusters in a dataset.
Data Analysis, Statistics and Probability • Physics and Society
2 code implementations • 10 Aug 2018 • Jin Xu, Junghyo Jo
To address this problem, we examine whether the affinity-based discrimination of peptide sequences is learnable and generalizable by artificial neural networks (ANNs) that process the digital experimental amino acid sequence information of receptors and peptides.
1 code implementation • 13 Dec 2017 • Jin Xu, Junghyo Jo
We examine sequences of 10,000 human T-cell receptors and 10,000 antigenic peptides, and obtain a full spectrum of cross-reactivity of the receptor-peptide binding.
no code implementations • 31 Oct 2017 • Juyong Song, Matteo Marsili, Junghyo Jo
The fraction of inputs that are associated to the same state is a natural measure of similarity and is simply related to the cost in bits required to represent these inputs.
2 code implementations • 18 May 2017 • Danh-Tai Hoang, Juyong Song, Vipul Periwal, Junghyo Jo
We introduce a data-driven statistical physics approach to model inference based on minimizing a free energy of data and show superior model recovery for small sample sizes.
Data Analysis, Statistics and Probability • Quantitative Methods
no code implementations • 19 Jul 2016 • Dong-Ho Park, Taegeun Song, Danh-Tai Hoang, Jin Xu, Junghyo Jo
By testing all possible motifs governing the interactions of these three cell types, we found that a unique set of positive/negative intra-islet interactions between different islet cell types functions not only to reduce the superficially wasteful zero-sum action of glucagon and insulin but also to enhance/suppress the synchronization of hormone secretions between islets under high/normal glucose conditions.
no code implementations • 12 Dec 2015 • Marissa Pastor, Juyong Song, Danh-Tai Hoang, Junghyo Jo
We developed a new complexity measure of binary patterns, and estimated the minimal network size for memorizing them as a function of their complexity.