Search Results for author: Hye Won Chung

Found 17 papers, 5 papers with code

Understanding Self-Distillation and Partial Label Learning in Multi-Class Classification with Label Noise

no code implementations · 16 Feb 2024 · Hyeonsu Jeong, Hye Won Chung

By deriving a closed-form solution for the student model's outputs, we discover that SD essentially functions as label averaging among instances with high feature correlations.
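
The label-averaging view can be illustrated with a small numpy sketch (all shapes, weights, and the clipped-Gram averaging rule are illustrative assumptions, not the paper's closed-form solution):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sketch of "label averaging among instances with high feature
# correlations". All names and numbers here are illustrative.
X = rng.normal(size=(6, 4))                 # features for 6 instances
X[1] = X[0] + 0.05 * rng.normal(size=4)     # near-duplicates of instance 0
X[2] = X[0] + 0.05 * rng.normal(size=4)
X /= np.linalg.norm(X, axis=1, keepdims=True)

Y = np.eye(3)[[0, 0, 1, 1, 2, 2]]           # one-hot labels over 3 classes
                                            # (instance 2's label disagrees
                                            # with its correlated neighbors)

K = X @ X.T                                 # feature-correlation (Gram) matrix
W = np.clip(K, 0.0, None)                   # keep positive correlations
W /= W.sum(axis=1, keepdims=True)           # normalize into averaging weights
soft_labels = W @ Y                         # correlation-weighted label average

# Instance 2's soft label gains mass on class 0, pulled there by its two
# highly correlated neighbors.
print(soft_labels[2])
```

The point of the sketch is only that a noisy label on one instance gets smoothed toward the labels of its highly correlated neighbors, which is the mechanism the abstract describes.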

Multi-class Classification · Partial Label Learning

Efficient Algorithms for Exact Graph Matching on Correlated Stochastic Block Models with Constant Correlation

1 code implementation · 31 May 2023 · Joonhyuk Yang, Dongpil Shin, Hye Won Chung

We consider the problem of graph matching, or learning vertex correspondence, between two correlated stochastic block models (SBMs).

Graph Matching

Detection problems in the spiked matrix models

no code implementations · 12 Jan 2023 · Ji Hyung Jung, Hye Won Chung, Ji Oon Lee

We first show that the principal component analysis can be improved by entrywise pre-transforming the data matrix if the noise is non-Gaussian, generalizing the known results for the spiked random matrix models with rank-1 signals.

Data Valuation Without Training of a Model

1 code implementation · 3 Jan 2023 · Nohyun Ki, Hoyong Choi, Hye Won Chung

Many recent works on understanding deep learning try to quantify how much individual data instances influence the optimization and generalization of a model.

Data Valuation

Recovering Top-Two Answers and Confusion Probability in Multi-Choice Crowdsourcing

no code implementations · 29 Dec 2022 · Hyeonsu Jeong, Hye Won Chung

Under this model, we propose a two-stage inference algorithm to infer both the top two answers and the confusion probability.

Test-Time Adaptation via Self-Training with Nearest Neighbor Information

2 code implementations · 8 Jul 2022 · Minguk Jang, Sae-Young Chung, Hye Won Chung

To overcome this limitation, we propose a novel test-time adaptation method, Test-time Adaptation via Self-Training with nearest neighbor information (TAST), which (1) adds trainable adaptation modules on top of the trained feature extractor; (2) defines a pseudo-label distribution for the test data using nearest neighbor information; (3) trains these modules a few times during test time to match the nearest-neighbor-based pseudo-label distribution to a prototype-based class distribution for the test data; and (4) predicts the label of each test point using the average class distribution predicted by these modules.
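
Step (2), the nearest-neighbor pseudo-label distribution, can be sketched in a few lines (a minimal numpy stand-in; the function name, shapes, and choice of k are assumptions, not the authors' released code):

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative stand-in for step (2): build a nearest-neighbor pseudo-label
# distribution for each test point from a support set's class distributions.
def nn_pseudo_labels(test_feats, support_feats, support_probs, k=3):
    # cosine similarity between test features and the support set
    t = test_feats / np.linalg.norm(test_feats, axis=1, keepdims=True)
    s = support_feats / np.linalg.norm(support_feats, axis=1, keepdims=True)
    sim = t @ s.T
    # average the class distributions of the k nearest support points
    nn_idx = np.argsort(-sim, axis=1)[:, :k]
    return support_probs[nn_idx].mean(axis=1)

support = rng.normal(size=(20, 8))                  # support features
probs = rng.dirichlet(np.ones(4), size=20)          # per-support class dists
test = support[:5] + 0.01 * rng.normal(size=(5, 8)) # near-copies of 5 supports

pseudo = nn_pseudo_labels(test, support, probs)
print(pseudo.shape)  # one class distribution per test point
```

In the method described above, a distribution of this kind would then serve as the training target for the adaptation modules in step (3).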

Domain Generalization · Pseudo Label · +1

Asymptotic Normality of Log Likelihood Ratio and Fundamental Limit of the Weak Detection for Spiked Wigner Matrices

no code implementations · 2 Mar 2022 · Hye Won Chung, Jiho Lee, Ji Oon Lee

For general non-Gaussian noise, assuming that the signal is drawn from the Rademacher prior, we prove that the log likelihood ratio (LR) of the spiked model against the null model converges to a Gaussian when the signal-to-noise ratio is below a certain threshold.

A Worker-Task Specialization Model for Crowdsourcing: Efficient Inference and Fundamental Limits

1 code implementation · 19 Nov 2021 · Doyeon Kim, Jeonghwan Lee, Hye Won Chung

Inferring correct labels from multiple noisy answers on data, however, has been a challenging problem, since the quality of the answers varies widely across tasks and workers.

Detection of Signal in the Spiked Rectangular Models

no code implementations · 28 Apr 2021 · Ji Hyung Jung, Hye Won Chung, Ji Oon Lee

We show that the principal component analysis can be improved by pre-transforming the matrix entries if the noise is non-Gaussian.

Self-Diagnosing GAN: Diagnosing Underrepresented Samples in Generative Adversarial Networks

1 code implementation · NeurIPS 2021 · Jinhee Lee, HaeRi Kim, Youngkyu Hong, Hye Won Chung

To promote diversity in sample generation without degrading the overall quality, we propose a simple yet effective method to diagnose and emphasize underrepresented samples during training of a GAN.

Robust Hypergraph Clustering via Convex Relaxation of Truncated MLE

no code implementations · 23 Mar 2020 · Jeonghwan Lee, Daesung Kim, Hye Won Chung

We study hypergraph clustering in the weighted $d$-uniform hypergraph stochastic block model ($d$\textsf{-WHSBM}), where each edge consisting of $d$ nodes from the same community has higher expected weight than the edges consisting of nodes from different communities.

Clustering · Stochastic Block Model

Crowdsourced Labeling for Worker-Task Specialization Model

no code implementations · 21 Mar 2020 · Do-Yeon Kim, Hye Won Chung

We consider crowdsourced labeling under a $d$-type worker-task specialization model, where each worker and task is associated with one particular type among a finite set of types and a worker provides a more reliable answer to tasks of the matched type than to tasks of unmatched types.
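
A toy simulation of the specialization model makes the setup concrete (all parameter values, and the majority-vote baseline at the end, are illustrative; the paper's inference algorithm is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(2)

# d-type worker-task specialization model (toy parameters): a worker answers
# a binary task correctly w.p. p_match when types agree, else w.p. p_mismatch.
d, n_workers, n_tasks = 3, 30, 12
p_match, p_mismatch = 0.9, 0.6

worker_type = rng.integers(d, size=n_workers)
task_type = rng.integers(d, size=n_tasks)
truth = rng.integers(2, size=n_tasks)          # ground-truth binary labels

p_correct = np.where(worker_type[:, None] == task_type[None, :],
                     p_match, p_mismatch)      # per (worker, task) reliability
correct = rng.random((n_workers, n_tasks)) < p_correct
answers = np.where(correct, truth, 1 - truth)  # observed noisy answer matrix

# Simple majority-vote baseline over all workers for each task
estimate = (answers.mean(axis=0) > 0.5).astype(int)
print((estimate == truth).mean())
```

Majority vote ignores the type structure; the point of the model is that exploiting which workers match a task's type can do better than this baseline.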

Clustering

Binary Classification with XOR Queries: Fundamental Limits and An Efficient Algorithm

no code implementations · 31 Jan 2020 · Daesung Kim, Hye Won Chung

In particular, we consider the problem of classifying $m$ binary labels with XOR queries that ask whether the number of objects having a given attribute in the chosen subset of size $d$ is even or odd.
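
A parity query of this kind is simple to state in code (a minimal noiseless sketch; the paper studies such queries under noisy responses):

```python
import numpy as np

# An XOR (parity) query on a chosen subset S reports whether the number of
# objects in S having the attribute (label 1) is even (answer 0) or odd
# (answer 1).
def xor_query(labels, subset):
    return int(np.sum(labels[list(subset)]) % 2)

labels = np.array([1, 0, 1, 1, 0, 0, 1, 0])   # hidden binary labels, m = 8

print(xor_query(labels, [0, 2, 3]))  # three 1s in the subset -> odd -> 1
print(xor_query(labels, [1, 4, 5]))  # no 1s in the subset -> even -> 0
```

Each query reveals one linear (mod-2) constraint on the hidden labels, which is what makes the recovery problem a noisy linear-algebra problem over GF(2).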

Active Learning · Attribute · +3

Weak Detection in the Spiked Wigner Model with General Rank

no code implementations · 16 Jan 2020 · Ji Hyung Jung, Hye Won Chung, Ji Oon Lee

We study the statistical decision process of detecting the signal from a 'signal+noise' type matrix model with an additive Wigner noise.

Shallow Neural Network can Perfectly Classify an Object following Separable Probability Distribution

no code implementations · 19 Apr 2019 · Youngjae Min, Hye Won Chung

This paper constructs shallow sigmoid-type neural networks that achieve 100% accuracy in classification for datasets following a linear separability condition.
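
The intuition for the linearly separable case can be sketched with a single sigmoid unit: scaling the weights of any separating hyperplane pushes the unit's outputs toward 0 and 1, so thresholding at 0.5 classifies every point correctly (an illustrative construction, not the paper's network):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy linearly separable dataset: classes split by the hyperplane x0 - 2 = 0.
X = np.array([[0.0, 1.0], [1.0, 2.0], [3.0, 0.0], [4.0, 1.0]])
y = np.array([0, 0, 1, 1])

w, b = np.array([1.0, 0.0]), -2.0  # a separating hyperplane: x0 - 2
scale = 50.0                       # scaling sharpens the sigmoid toward a step
pred = (sigmoid(scale * (X @ w + b)) > 0.5).astype(int)
print((pred == y).mean())  # 1.0: every point classified correctly
```

With a positive margin, increasing `scale` drives the sigmoid outputs arbitrarily close to the hard 0/1 labels, which is the 100%-accuracy phenomenon the abstract refers to in the simplest setting.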

Classification · General Classification

Weak detection in the spiked Wigner model

no code implementations · 28 Sep 2018 · Hye Won Chung, Ji Oon Lee

We propose a hypothesis test on the presence of the signal by utilizing the linear spectral statistics of the data matrix.

Parity Queries for Binary Classification

no code implementations · 4 Sep 2018 · Hye Won Chung, Ji Oon Lee, Do-Yeon Kim, Alfred O. Hero

We define the query difficulty $\bar{d}$ as the average size of the query subsets and the sample complexity $n$ as the minimum number of measurements required to attain a given recovery accuracy.
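
The query-difficulty definition is just the mean subset size over the queries asked; a minimal illustration (the subset choices are arbitrary):

```python
# Query difficulty as defined above: the average size of the query subsets.
queries = [[0, 2, 3], [1, 4], [0, 1, 2, 5], [3, 4, 5]]  # illustrative queries

d_bar = sum(len(q) for q in queries) / len(queries)
print(d_bar)  # (3 + 2 + 4 + 3) / 4 = 3.0
```

Sample complexity n is then the smallest number of such queries needed to hit a target recovery accuracy, so the paper's trade-off is between this n and d-bar.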

Binary Classification · Classification · +1
