no code implementations • 8 May 2019 • Jaehwan Lee, Donggeon Yoo, Jung Yin Huh, Hyo-Eun Kim
Label distillation, a type of pseudo-labeling technique, is intended to mitigate grading variation.
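The snippet above only names the idea; as a rough illustration of pseudo-labeling in this spirit, one can blend a trained model's soft predictions with the noisy hard grades to produce smoother training targets. This is a minimal sketch under that assumption, not the paper's actual method; `alpha` is an illustrative mixing knob.

```python
import numpy as np

def distill_labels(model_probs, hard_labels, alpha=0.5):
    """Hypothetical label-distillation step: blend a trained model's
    soft predictions with noisy one-hot grades so training targets
    vary more smoothly across graders.
    (`alpha` is an illustrative parameter, not from the paper.)"""
    n_classes = model_probs.shape[1]
    one_hot = np.eye(n_classes)[hard_labels]  # hard grades as one-hot rows
    return alpha * model_probs + (1 - alpha) * one_hot

probs = np.array([[0.7, 0.3], [0.2, 0.8]])  # model's soft predictions
grades = np.array([0, 1])                   # noisy hard grades
soft = distill_labels(probs, grades)        # e.g. first row becomes [0.85, 0.15]
```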
1 code implementation • 26 Mar 2019 • HyunJae Lee, Hyo-Eun Kim, Hyeonseob Nam
Following advances in style transfer with Convolutional Neural Networks (CNNs), the role of styles in CNNs has drawn growing attention from a broader perspective.
Ranked #129 on Image Classification on CIFAR-10
no code implementations • 28 May 2018 • Hyo-Eun Kim, SeungWook Kim, Jaehwan Lee
Data is one of the most important factors in machine learning.
3 code implementations • NeurIPS 2018 • Hyeonseob Nam, Hyo-Eun Kim
Real-world image recognition is often challenged by the variability of visual styles including object textures, lighting conditions, filter effects, etc.
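Style variability of this kind is commonly addressed by normalizing per-image feature statistics. Below is a minimal numpy sketch of instance normalization, one standard building block for style invariance; it is offered as background, not necessarily the paper's exact formulation.

```python
import numpy as np

def instance_norm(x, eps=1e-5):
    """Normalize each (sample, channel) slice of an NCHW batch by its
    own spatial mean and variance, removing per-image style statistics
    such as brightness and contrast."""
    mean = x.mean(axis=(2, 3), keepdims=True)
    var = x.var(axis=(2, 3), keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

x = np.random.rand(2, 3, 4, 4)  # batch of 2 images, 3 channels, 4x4
y = instance_norm(x)
# Each (sample, channel) slice now has ~zero mean and ~unit variance.
```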
no code implementations • 4 Nov 2016 • Hyo-Eun Kim, Sangheum Hwang, Kyunghyun Cho
From the base model, we introduce a semantic noise modeling method that enables class-conditional perturbation in the latent space to enhance the representational power of the learned latent features.
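As a rough sketch of what "class-conditional perturbation on latent space" could look like, the toy function below adds Gaussian noise to latent vectors with a per-class scale. The per-class scale table is an illustrative assumption, not the paper's semantic noise model.

```python
import numpy as np

rng = np.random.default_rng(0)

def perturb_latent(z, labels, class_scales):
    """Hypothetical class-conditional perturbation: add Gaussian noise
    to each latent vector, scaled by a factor chosen per class.
    (`class_scales` is an illustrative parameter, not from the paper.)"""
    noise = rng.standard_normal(z.shape)
    return z + class_scales[labels][:, None] * noise

z = rng.standard_normal((4, 8))    # 4 latent vectors of dimension 8
labels = np.array([0, 1, 0, 1])    # class of each sample
scales = np.array([0.1, 0.5])      # noise magnitude per class
z_aug = perturb_latent(z, labels, scales)
```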
no code implementations • 16 Feb 2016 • Hyo-Eun Kim, Sangheum Hwang
The unpooling-deconvolution combination helps eliminate less discriminative features in the feature extraction stage, since the output features of the deconvolution layer are reconstructed from the most discriminative unpooled features rather than the raw ones.
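The unpooling half of that combination can be sketched concretely: max-pooling records where each maximum came from, and unpooling places only those maxima back, leaving every less discriminative activation at zero. A minimal 2-D numpy sketch (single channel, 2x2 windows; details simplified relative to the paper):

```python
import numpy as np

def max_pool_with_indices(x, k=2):
    """2x2 max pooling over a 2-D map; also returns the argmax
    position inside each window, which unpooling needs."""
    h, w = x.shape
    windows = (x.reshape(h // k, k, w // k, k)
                .transpose(0, 2, 1, 3)
                .reshape(-1, k * k))
    idx = windows.argmax(axis=1)
    return windows.max(axis=1).reshape(h // k, w // k), idx

def max_unpool(pooled, idx, k=2):
    """Place each pooled maximum back at its recorded position;
    all other (less discriminative) activations stay zero."""
    ph, pw = pooled.shape
    out = np.zeros((ph * pw, k * k))
    out[np.arange(ph * pw), idx] = pooled.ravel()
    return (out.reshape(ph, pw, k, k)
               .transpose(0, 2, 1, 3)
               .reshape(ph * k, pw * k))

x = np.arange(16, dtype=float).reshape(4, 4)
pooled, idx = max_pool_with_indices(x)   # pooled == [[5, 7], [13, 15]]
restored = max_unpool(pooled, idx)       # maxima restored, rest zero
```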
no code implementations • 4 Feb 2016 • Sangheum Hwang, Hyo-Eun Kim
With the help of transfer learning, which adopts the weight parameters of a pre-trained network, the weakly supervised learning framework for object localization performs well because the pre-trained network already provides well-trained class-specific features.
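The weight-adoption step itself is mechanically simple: copy the backbone parameters from a pre-trained model into the new model and leave the task-specific head at its fresh initialization. A toy dict-based sketch (parameter names are illustrative, not from the paper):

```python
import numpy as np

def transfer_weights(pretrained, target, prefix="features."):
    """Copy backbone parameters (those under `prefix`) from a
    pre-trained parameter dict into a new model's dict; parameters
    outside `prefix` (the task head) keep their fresh initialization.
    (All names here are illustrative.)"""
    for name, w in pretrained.items():
        if name.startswith(prefix) and name in target:
            target[name] = w.copy()
    return target

pretrained = {"features.conv1": np.ones((3, 3)),
              "classifier.fc": np.full((2,), 7.0)}
target = {"features.conv1": np.zeros((3, 3)),
          "classifier.fc": np.zeros((2,))}
target = transfer_weights(pretrained, target)
# Backbone weights are copied; the classifier head is untouched.
```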