1 code implementation • 6 Mar 2023 • Hankook Lee, Jongheon Jeong, Sejun Park, Jinwoo Shin
To enable the joint training of the energy-based model (EBM) and contrastive representation learning (CRL), we also design a new class of latent-variable EBMs for learning the joint density of data and the contrastive latent variable.
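As a rough illustration of such a construction, a joint energy over data x and a contrastive latent z can be written as a bilinear form; this is a minimal sketch with a hypothetical `encoder` and illustrative dimensions, not the paper's exact design:

```python
import torch
import torch.nn as nn

class JointEBM(nn.Module):
    """Sketch of a latent-variable EBM E(x, z) over data x and a
    contrastive latent z; the encoder, dimensions, and bilinear
    energy form are illustrative assumptions."""
    def __init__(self, encoder: nn.Module, feat_dim: int, latent_dim: int):
        super().__init__()
        self.encoder = encoder                 # maps x -> feat_dim vector
        self.proj = nn.Linear(feat_dim, latent_dim)

    def energy(self, x: torch.Tensor, z: torch.Tensor) -> torch.Tensor:
        h = self.proj(self.encoder(x))         # embed data into latent space
        return -(h * z).sum(dim=1)             # low energy when x and z align
```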
1 code implementation • 2 Mar 2023 • Jaehyun Nam, Jihoon Tack, Kyungmin Lee, Hankook Lee, Jinwoo Shin
Learning with few labeled tabular samples is often an essential requirement for industrial machine learning applications, since many kinds of tabular data suffer from high annotation costs or from difficulties in collecting new samples for novel tasks.
1 code implementation • 6th Workshop on Meta-Learning at NeurIPS 2022 • Huiwon Jang, Hankook Lee, Jinwoo Shin
Unsupervised meta-learning aims to learn generalizable knowledge across a distribution of tasks constructed from unlabeled data.
1 code implementation • 11 Oct 2022 • Jihoon Tack, Jongjin Park, Hankook Lee, Jaeho Lee, Jinwoo Shin
The idea of using a separately trained target model (or teacher) to improve the performance of a student model has become increasingly popular across machine learning domains, and meta-learning is no exception: a recent discovery shows that utilizing task-wise target models can significantly boost generalization performance.
1 code implementation • CVPR 2022 • Sukmin Yun, Hankook Lee, Jaehyung Kim, Jinwoo Shin
Despite its simplicity, we demonstrate that it can significantly improve the performance of existing SSL methods for various visual tasks, including object detection and semantic segmentation.
1 code implementation • NeurIPS 2021 • Hankook Lee, Kibok Lee, Kimin Lee, Honglak Lee, Jinwoo Shin
Recent unsupervised representation learning methods have been shown to be effective in a range of vision tasks by learning representations invariant to data augmentations such as random cropping and color jittering.
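For context, the invariance these methods target comes from a two-view augmentation pipeline like the following; this is a generic sketch with common default parameters, not this paper's exact setup:

```python
from torchvision import transforms

# The augmentations mentioned above, in a standard two-view pipeline:
# invariance-based methods treat both views of one image as a positive
# pair. Parameter values are common defaults, not the paper's settings.
augment = transforms.Compose([
    transforms.RandomResizedCrop(224),
    transforms.ColorJitter(0.4, 0.4, 0.4, 0.1),
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
])

def two_views(img):
    """Two independent augmentations of one image form a positive pair."""
    return augment(img), augment(img)
```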
no code implementations • 29 Sep 2021 • Sukmin Yun, Hankook Lee, Jaehyung Kim, Jinwoo Shin
This paper aims to further improve their performance by utilizing the architectural advantages of the underlying neural network, since the current state-of-the-art visual pretext tasks for self-supervised learning do not enjoy this benefit, i.e., they are architecture-agnostic.
1 code implementation • 9 Jun 2021 • Junsu Kim, Sungsoo Ahn, Hankook Lee, Jinwoo Shin
Our main idea is based on a self-improving procedure that trains the model to imitate successful trajectories found by itself (sketched below).
Ranked #3 on Multi-step Retrosynthesis on USPTO-190
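A minimal sketch of such a self-improving loop, assuming a hypothetical `search_fn` that returns a trajectory and a success flag, and an assumed `model.log_prob` interface (both illustrative, not the paper's API):

```python
import torch

def self_imitation_step(model, optimizer, search_fn, problems):
    """Collect trajectories the current model finds, keep the successful
    ones, and fit the model to them by behavior cloning. `search_fn`,
    `model.log_prob`, and the trajectory format are illustrative."""
    successes = []
    for p in problems:
        traj, solved = search_fn(model, p)      # e.g., a tree/beam search
        if solved:
            successes.append(traj)
    if not successes:
        return 0
    # behavior-cloning loss on the model's own successful trajectories
    loss = sum(-model.log_prob(s, a) for traj in successes for s, a in traj)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return len(successes)
```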
no code implementations • 3 May 2021 • Hankook Lee, Sungsoo Ahn, Seung-Woo Seo, You Young Song, Eunho Yang, Sung-Ju Hwang, Jinwoo Shin
Retrosynthesis, whose goal is to find a set of reactants for synthesizing a target product, is an emerging research area in deep learning.
2 code implementations • NeurIPS 2020 • Sungsoo Ahn, Junsu Kim, Hankook Lee, Jinwoo Shin
De novo molecular design attempts to search over the chemical space for molecules with the desired property.
1 code implementation • ICML 2020 • Hankook Lee, Sung Ju Hwang, Jinwoo Shin
Our main idea is to learn a single unified task with respect to the joint distribution of the original and self-supervised labels, i.e., we augment the original labels via self-supervision of input transformations.
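A minimal sketch of that joint-label construction, using rotation as the self-supervised transformation; the choice of rotations and the 4x-wide classifier head are assumptions for illustration:

```python
import torch
import torch.nn.functional as F

def joint_label_loss(model, x, y):
    """Sketch of the unified task: rotate each image by 0/90/180/270
    degrees and classify the joint label (class, rotation) = y * 4 + r.
    Assumes NCHW image batches and a model head with num_classes * 4
    logits; names and the rotation choice are illustrative."""
    xs, ys = [], []
    for r in range(4):
        xs.append(torch.rot90(x, r, dims=(2, 3)))  # rotated copies
        ys.append(y * 4 + r)                       # joint label
    return F.cross_entropy(model(torch.cat(xs)), torch.cat(ys))
```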
4 code implementations • 15 May 2019 • Yunhun Jang, Hankook Lee, Sung Ju Hwang, Jinwoo Shin
To address the issue, we propose a novel transfer learning approach based on meta-learning that can automatically learn what knowledge to transfer from the source network to where in the target network.
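One way to read "what to transfer to where" is a learnable weight per (source layer, target layer) pair scaling a feature-matching loss; the pairing, pooling, and loss form below are illustrative assumptions, not the paper's exact scheme:

```python
import torch
import torch.nn as nn

class WeightedTransfer(nn.Module):
    """Sketch: a learnable weight per (source, target) layer pair decides
    how strongly that pair's features are matched during transfer."""
    def __init__(self, pairs):
        super().__init__()
        self.pairs = pairs
        self.logits = nn.Parameter(torch.zeros(len(pairs)))  # meta-learned

    def forward(self, src_feats, tgt_feats):
        w = torch.softmax(self.logits, dim=0)
        loss = 0.0
        for k, (i, j) in enumerate(self.pairs):
            s = src_feats[i].mean(dim=(2, 3))   # pooled source feature map
            t = tgt_feats[j].mean(dim=(2, 3))   # pooled target feature map
            # assumes matching channel widths (or a prior projection)
            loss = loss + w[k] * (s - t).pow(2).mean()
        return loss
```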
1 code implementation • 7 Jul 2018 • Hankook Lee, Jinwoo Shin
This is remarkable given their simplicity and effectiveness, but training many thin sub-networks jointly poses a new challenge in training complexity.
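For intuition, one common way to carve a thin sub-network out of a wide layer is to share a leading slice of its parameters; this is a generic sketch, not necessarily the paper's scheme:

```python
import torch.nn as nn

def slice_linear(layer: nn.Linear, width_ratio: float) -> nn.Linear:
    """Sketch of extracting a thin sub-network from a wide layer by
    keeping only the first fraction of its output units; parameter
    sharing via a leading slice is an illustrative assumption."""
    out_f = max(1, int(layer.out_features * width_ratio))
    thin = nn.Linear(layer.in_features, out_f)
    # share (copy) the leading slice of the full layer's parameters
    thin.weight.data.copy_(layer.weight.data[:out_f])
    thin.bias.data.copy_(layer.bias.data[:out_f])
    return thin
```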