2 code implementations • 12 May 2021 • Sangwoong Yoon, Yung-Kyun Noh, Frank Chongwoo Park
The specific role of the normalization constraint is to ensure that the out-of-distribution (OOD) regime has a small likelihood when the model is trained on in-distribution samples by maximum likelihood.
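As a minimal illustration of this point (a toy 1D example of my own, not the paper's normalized autoencoder), the sketch below fits a density whose normalization constant is computed explicitly, and maximum-likelihood training on in-distribution samples drives the likelihood of a far-away OOD point down:

```python
import torch

torch.manual_seed(0)
x_in = torch.randn(512) * 0.5          # in-distribution samples near 0
x_ood = torch.full((512,), 4.0)        # a clearly out-of-distribution point
grid = torch.linspace(-10, 10, 2001)   # grid for numerical normalization

mu = torch.zeros(1, requires_grad=True)
log_sigma = torch.zeros(1, requires_grad=True)
opt = torch.optim.Adam([mu, log_sigma], lr=0.05)

def log_prob(x):
    energy = ((x - mu) / log_sigma.exp()) ** 2                 # unnormalized energy
    log_z = torch.logsumexp(-((grid - mu) / log_sigma.exp()) ** 2, dim=0) \
            + torch.log(grid[1] - grid[0])                     # Riemann-sum log-partition
    return -energy - log_z                                     # normalized log-density

for _ in range(200):
    opt.zero_grad()
    loss = -log_prob(x_in).mean()      # maximum likelihood on in-distribution data
    loss.backward()
    opt.step()

print("in-dist log-likelihood:", log_prob(x_in).mean().item())
print("OOD     log-likelihood:", log_prob(x_ood).mean().item())   # much lower
```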
1 code implementation • ICML 2018 • Jihun Hamm, Yung-Kyun Noh
Minimax optimization plays a key role in adversarial training of machine learning algorithms, such as learning generative models, domain adaptation, privacy preservation, and robust learning.
1 code implementation • 20 Aug 2022 • Sangwoong Yoon, Jinwon Choi, Yonghyeon LEE, Yung-Kyun Noh, Frank Chongwoo Park
A reliable evaluation method is essential for building a robust out-of-distribution (OOD) detector.
1 code implementation • 24 Oct 2022 • Haanvid Lee, Jongmin Lee, Yunseon Choi, Wonseok Jeon, Byung-Jun Lee, Yung-Kyun Noh, Kee-Eung Kim
We consider local kernel metric learning for off-policy evaluation (OPE) of deterministic policies in contextual bandits with continuous action spaces.
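For context, the sketch below is the standard kernel-relaxation OPE estimator for a deterministic target policy over continuous actions, which the paper refines by learning a local kernel metric; the learned metric itself is not reproduced here, and all function names are illustrative.

```python
import numpy as np

def kernel_ope(contexts, actions, rewards, behavior_density, target_policy, bandwidth=0.3):
    """Self-normalized kernel OPE estimate of a deterministic target policy's value.

    contexts:         (n, d) logged contexts
    actions:          (n,)   logged continuous actions
    rewards:          (n,)   logged rewards
    behavior_density: callable (context, action) -> density under the logging policy
    target_policy:    callable context -> deterministic action
    """
    target_actions = np.array([target_policy(x) for x in contexts])
    # A Gaussian kernel relaxes the Dirac delta of the deterministic policy.
    k = np.exp(-0.5 * ((actions - target_actions) / bandwidth) ** 2) \
        / (bandwidth * np.sqrt(2 * np.pi))
    b = np.array([behavior_density(x, a) for x, a in zip(contexts, actions)])
    weights = k / b                                        # kernelized importance weights
    return np.sum(weights * rewards) / np.sum(weights)     # self-normalized estimate
```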
no code implementations • 11 Dec 2017 • Jaeyoon Yoo, Yongjun Hong, Yung-Kyun Noh, Sungroh Yoon
The objective of this study is to train an autonomous navigation model that uses a simulator (instead of real labeled data) and an inexpensive monocular camera.
1 code implementation • 22 May 2018 • J. Jon Ryu, Shouvik Ganguly, Young-Han Kim, Yung-Kyun Noh, Daniel D. Lee
A new approach to $L_2$-consistent estimation of a general density functional using $k$-nearest neighbor distances is proposed, where the functional under consideration is in the form of the expectation of some function $f$ of the densities at each point.
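A plug-in sketch of the underlying idea (omitting the paper's bias correction): estimate the density at each sample from its $k$-th nearest-neighbor distance, then average $f$ over those estimates.

```python
import numpy as np
from scipy.special import gamma
from sklearn.neighbors import NearestNeighbors

def knn_functional_estimate(samples, f, k=5):
    """Naive plug-in estimate of E[f(p(X))] from samples of an unknown density p."""
    n, d = samples.shape
    nn = NearestNeighbors(n_neighbors=k + 1).fit(samples)
    dist, _ = nn.kneighbors(samples)            # column 0 is the point itself
    r_k = dist[:, k]                            # distance to the k-th neighbor
    unit_ball = np.pi ** (d / 2) / gamma(d / 2 + 1)
    p_hat = k / (n * unit_ball * r_k ** d)      # classical k-NN density estimate
    return np.mean(f(p_hat))

# Example: Shannon entropy corresponds to f(p) = -log p.
x = np.random.randn(2000, 2)
print(knn_functional_estimate(x, lambda p: -np.log(p)))
```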
no code implementations • 5 Mar 2015 • Sanghyuk Chun, Yung-Kyun Noh, Jinwoo Shin
Subspace clustering (SC) is a popular method for dimensionality reduction of high-dimensional data that generalizes Principal Component Analysis (PCA).
no code implementations • 30 Jun 2014 • Hiroaki Sasaki, Yung-Kyun Noh, Masashi Sugiyama
Estimation of density derivatives is a versatile tool in statistical data analysis.
no code implementations • NeurIPS 2017 • Yung-Kyun Noh, Masashi Sugiyama, Kee-Eung Kim, Frank Park, Daniel D. Lee
This paper shows how metric learning can be used with Nadaraya-Watson (NW) kernel regression.
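As a minimal sketch of the setting (the paper's procedure for learning the metric is not shown), Nadaraya-Watson regression with a Mahalanobis-type metric inside the kernel looks as follows; with the identity matrix it reduces to ordinary NW regression.

```python
import numpy as np

def nw_predict(x_query, X, y, M, bandwidth=1.0):
    """Nadaraya-Watson prediction at x_query using a PSD metric matrix M (d x d)."""
    diff = X - x_query                                   # (n, d)
    dist2 = np.einsum('ni,ij,nj->n', diff, M, diff)      # squared Mahalanobis distances
    w = np.exp(-dist2 / (2 * bandwidth ** 2))            # Gaussian kernel weights
    return np.sum(w * y) / np.sum(w)                     # locally weighted average
```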
no code implementations • NeurIPS 2012 • Yung-Kyun Noh, Frank Park, Daniel D. Lee
This paper sheds light on some fundamental connections of the diffusion decision making model of neuroscience and cognitive psychology with k-nearest neighbor classification.
no code implementations • NeurIPS 2010 • Yung-Kyun Noh, Byoung-Tak Zhang, Daniel D. Lee
We consider the problem of learning a local metric to enhance the performance of nearest neighbor classification.
no code implementations • 1 Jan 2021 • Sangwoong Yoon, Yung-Kyun Noh, Frank C. Park
This phenomenon, which we refer to as outlier reconstruction, has a detrimental effect on the use of autoencoders for outlier detection, as an autoencoder will misclassify a clear outlier as being in-distribution.
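To make the failure mode concrete, here is a minimal sketch (my own example) of the common reconstruction-error outlier score the entry refers to; when an autoencoder reconstructs an outlier well, this score incorrectly marks it as in-distribution.

```python
import torch
import torch.nn as nn

class TinyAE(nn.Module):
    def __init__(self, dim=784, latent=32):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(dim, 128), nn.ReLU(), nn.Linear(128, latent))
        self.dec = nn.Sequential(nn.Linear(latent, 128), nn.ReLU(), nn.Linear(128, dim))
    def forward(self, x):
        return self.dec(self.enc(x))

def outlier_score(model, x):
    # Higher reconstruction error => more "outlier-like" under this heuristic.
    with torch.no_grad():
        return ((model(x) - x) ** 2).mean(dim=1)
```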
no code implementations • 16 Mar 2021 • Cheongjae Jang, Sang-Kyun Ko, Yung-Kyun Noh, Jieun Choi, Jongwon Lim, Tae Jeong Kim
The $\text{t}\bar{\text{t}}\text{H}(\text{b}\bar{\text{b}})$ process is an essential channel to reveal the Higgs properties but has an irreducible background from the $\text{t}\bar{\text{t}}\text{b}\bar{\text{b}}$ process, which produces a top quark pair in association with a b quark pair.
no code implementations • 29 Sep 2021 • Sangwoong Yoon, Jinwon Choi, Yonghyeon LEE, Yung-Kyun Noh, Frank C. Park
As an outlier may deviate from the training distribution in unexpected ways, an ideal OOD detector should be able to detect all types of outliers.
no code implementations • 20 Oct 2018 • Seunghyeon Kim, Yung-Kyun Noh, Frank C. Park
In this paper, we investigate training deep neural networks for automated optical inspection in industrial manufacturing.
no code implementations • 6 Dec 2023 • Sangwoong Yoon, Dohyun Kwon, Himchan Hwang, Yung-Kyun Noh, Frank C. Park
We present Generalized Contrastive Divergence (GCD), a novel objective function for training an energy-based model (EBM) and a sampler simultaneously.
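For background only, the sketch below shows classical contrastive divergence with short-run Langevin negatives, the objective that GCD generalizes; it is not the paper's GCD, which trains a sampler jointly with the EBM instead of running MCMC.

```python
import torch

def cd_loss(energy_net, x_data, n_steps=10, step_size=0.01, noise=0.01):
    """Classical CD-k loss: negatives are short-run Langevin chains started at the data."""
    x_neg = x_data.clone().detach().requires_grad_(True)
    for _ in range(n_steps):
        e = energy_net(x_neg).sum()
        grad, = torch.autograd.grad(e, x_neg)
        x_neg = (x_neg - step_size * grad
                 + noise * torch.randn_like(x_neg)).detach().requires_grad_(True)
    # Push energy down on data and up on the model's own samples.
    return energy_net(x_data).mean() - energy_net(x_neg.detach()).mean()
```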