Search Results for author: Sae-Young Chung

Found 10 papers, 5 papers with code

Unsupervised Visual Representation Learning via Mutual Information Regularized Assignment

1 code implementation • 4 Nov 2022 • Dong Hoon Lee, Sungik Choi, Hyunwoo Kim, Sae-Young Chung

This paper proposes Mutual Information Regularized Assignment (MIRA), a pseudo-labeling algorithm for unsupervised representation learning inspired by information maximization.

Pseudo Label Representation Learning +1
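
MIRA's guiding idea is information maximization: pseudo-label assignments should be confident per sample yet balanced across clusters. As a hedged illustration of that quantity only (a generic formulation, not the authors' released code; the function name and the soft-assignment input are hypothetical):

```python
import numpy as np

def mutual_information(assignments):
    """I(cluster; sample) for a batch of soft assignments (rows sum to 1).

    Equals H(marginal over clusters) minus the mean per-sample entropy:
    high when each sample is confidently assigned yet clusters stay balanced.
    """
    eps = 1e-12
    marginal = assignments.mean(axis=0)
    h_marginal = -np.sum(marginal * np.log(marginal + eps))
    h_conditional = -np.mean(np.sum(assignments * np.log(assignments + eps), axis=1))
    return h_marginal - h_conditional
```

Uniform assignments score zero; confident one-hot assignments spread evenly over K clusters score log K, the maximum.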

Test-Time Adaptation via Self-Training with Nearest Neighbor Information

2 code implementations • 8 Jul 2022 • Minguk Jang, Sae-Young Chung, Hye Won Chung

To overcome this limitation, we propose a novel test-time adaptation method, called Test-time Adaptation via Self-Training with nearest neighbor information (TAST), which is composed of the following procedures: (1) add trainable adaptation modules on top of the trained feature extractor; (2) define a pseudo-label distribution for the test data using nearest neighbor information; (3) train these modules only a few times during test time to match the nearest-neighbor-based pseudo-label distribution and a prototype-based class distribution for the test data; and (4) predict the label of each test sample using the average predicted class distribution from these modules.

Domain Generalization Pseudo Label +1
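
Steps (2) and (3) above hinge on two class distributions per test point. A minimal NumPy sketch of those two ingredients, assuming features are already extracted (helper names are hypothetical; this is not the released TAST code):

```python
import numpy as np

def nn_pseudo_labels(test_feats, support_feats, support_labels, k=3, n_classes=2):
    """Soft pseudo-labels for test features from their k nearest support neighbors."""
    dists = np.linalg.norm(test_feats[:, None] - support_feats[None], axis=-1)
    nn_idx = np.argsort(dists, axis=1)[:, :k]            # k nearest neighbors
    labels = np.zeros((len(test_feats), n_classes))
    for i, idx in enumerate(nn_idx):
        for j in idx:
            labels[i, support_labels[j]] += 1.0 / k      # each neighbor casts a vote
    return labels

def prototype_distribution(test_feats, support_feats, support_labels, n_classes=2):
    """Class distribution from distances to class prototypes (per-class mean features)."""
    protos = np.stack([support_feats[support_labels == c].mean(axis=0)
                       for c in range(n_classes)])
    logits = -np.linalg.norm(test_feats[:, None] - protos[None], axis=-1)
    exp = np.exp(logits - logits.max(axis=1, keepdims=True))  # stable softmax
    return exp / exp.sum(axis=1, keepdims=True)
```

Step (3) would then train the adaptation modules to make the two distributions agree (e.g. via a cross-entropy between them), and step (4) averages the modules' predictions.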

Few-Example Clustering via Contrastive Learning

no code implementations • 8 Jul 2022 • Minguk Jang, Sae-Young Chung

We propose Few-Example Clustering (FEC), a novel algorithm that performs contrastive learning to cluster few examples.

Clustering Contrastive Learning
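
The contrastive-learning core that FEC builds on is standard. A minimal InfoNCE-style loss over normalized embeddings, as a sketch (a generic formulation, not the paper's algorithm; names are illustrative):

```python
import numpy as np

def info_nce(z, pairs, temperature=0.5):
    """Contrastive (InfoNCE) loss over embeddings z.
    pairs[i] is the index of sample i's positive (e.g. its augmented view)."""
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # L2-normalize rows
    sim = z @ z.T / temperature                        # pairwise similarities
    np.fill_diagonal(sim, -np.inf)                     # never contrast a sample with itself
    log_probs = sim - np.log(np.sum(np.exp(sim), axis=1, keepdims=True))
    return -np.mean(log_probs[np.arange(len(z)), pairs])
```

The loss is small when each embedding is most similar to its positive pair and dissimilar to everything else.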

Improving Generalization in Meta-RL with Imaginary Tasks from Latent Dynamics Mixture

1 code implementation • NeurIPS 2021 • Suyoung Lee, Sae-Young Chung

By training a policy on the mixture tasks along with the original training tasks, LDM allows the agent to prepare for unseen test tasks during training and prevents it from overfitting to the training tasks.

Meta Reinforcement Learning reinforcement-learning +1

Novelty Detection Via Blurring

no code implementations • ICLR 2020 • Sungik Choi, Sae-Young Chung

Conventional out-of-distribution (OOD) detection schemes based on variational autoencoders or Random Network Distillation (RND) have been observed to assign lower uncertainty to OOD inputs than to the target distribution.

Novelty Detection Out of Distribution (OOD) Detection

Robust Training with Ensemble Consensus

no code implementations • ICLR 2020 • Jisoo Lee, Sae-Young Chung

Since deep neural networks are over-parameterized, they can memorize noisy examples.

Memorization

Fourier Phase Retrieval with Extended Support Estimation via Deep Neural Network

no code implementations • 3 Apr 2019 • Kyung-Su Kim, Sae-Young Chung

We consider the problem of sparse phase retrieval from Fourier transform magnitudes to recover the $k$-sparse signal vector and its support $\mathcal{T}$.

Retrieval
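
The measurement model in this problem can be stated in a few lines. A sketch of the forward model only, for a 1-D discrete signal (an illustration, not the paper's method): the observed Fourier magnitudes determine the signal's circular autocorrelation, which is what constrains the support $\mathcal{T}$.

```python
import numpy as np

# Forward model of sparse Fourier phase retrieval: only the magnitudes
# of the Fourier transform of a k-sparse signal are observed.
n, k = 64, 3
rng = np.random.default_rng(1)
x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)  # the unknown support T
x[support] = rng.normal(size=k)

b = np.abs(np.fft.fft(x))        # measurements: Fourier magnitudes (phase is lost)

# The magnitudes fix the circular autocorrelation of x, the classical
# source of information about the support:
autocorr = np.fft.ifft(b ** 2).real
```

At lag zero the autocorrelation equals the signal's energy; the locations of its other nonzero lags constrain the pairwise differences of support indices.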

Tree Search Network for Sparse Regression

no code implementations • 1 Apr 2019 • Kyung-Su Kim, Sae-Young Chung

We consider the classical sparse regression problem of recovering a sparse signal $x_0$ given a measurement vector $y = \Phi x_0+w$.

regression
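
For context, a textbook baseline for this recovery problem is orthogonal matching pursuit (OMP); the sketch below is that generic greedy baseline, not the tree search network proposed in the paper:

```python
import numpy as np

def omp(Phi, y, k):
    """Orthogonal Matching Pursuit: greedily pick k columns of Phi to explain y."""
    residual, support = y.copy(), []
    for _ in range(k):
        # column most correlated with the current residual
        j = int(np.argmax(np.abs(Phi.T @ residual)))
        support.append(j)
        # least-squares refit on the selected support, then update the residual
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x = np.zeros(Phi.shape[1])
    x[support] = coef
    return x
```

On noiseless measurements with the correct sparsity level $k$ and a well-conditioned random $\Phi$, OMP typically recovers $x_0$ exactly.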
