Search Results for author: Jae-Gil Lee

Found 10 papers, 2 papers with code

Accurate and Fast Federated Learning via IID and Communication-Aware Grouping

no code implementations • 9 Dec 2020 • Jin-woo Lee, Jaehoon Oh, Yooju Shin, Jae-Gil Lee, Se-Young Yun

Federated learning has emerged as a new paradigm of collaborative machine learning; however, it also faces several challenges, such as non-independent and identically distributed (non-IID) data and high communication cost.

Federated Learning
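The grouping idea in the title above lends itself to a small illustration. Below is a hypothetical sketch, not the paper's algorithm: clients are greedily assigned to groups so that each group's pooled label histogram stays close to the global (IID) label distribution. The function name `iid_aware_grouping` and the greedy L1 criterion are illustrative assumptions; the communication-aware dimension of the title is omitted.

```python
# Hypothetical sketch of IID-aware client grouping (names illustrative,
# not from the paper): greedily assign clients so each group's pooled
# label histogram stays close to the global label distribution.
import numpy as np

def iid_aware_grouping(client_histograms, num_groups):
    """client_histograms: (num_clients, num_classes) label counts per client."""
    client_histograms = np.asarray(client_histograms, dtype=float)
    global_dist = client_histograms.sum(axis=0)
    global_dist /= global_dist.sum()

    groups = [[] for _ in range(num_groups)]
    group_counts = np.zeros((num_groups, client_histograms.shape[1]))

    # Assign large clients first so small ones can balance the remainder.
    order = np.argsort(-client_histograms.sum(axis=1))
    for c in order:
        best, best_div = None, np.inf
        for g in range(num_groups):
            pooled = group_counts[g] + client_histograms[c]
            dist = pooled / pooled.sum()
            div = np.abs(dist - global_dist).sum()  # L1 distance to IID target
            if div < best_div:
                best, best_div = g, div
        groups[best].append(int(c))
        group_counts[best] += client_histograms[c]
    return groups

# Example: 6 skewed clients, 3 classes, 2 groups.
hists = [[90, 5, 5], [5, 90, 5], [5, 5, 90],
         [80, 10, 10], [10, 80, 10], [10, 10, 80]]
print(iid_aware_grouping(hists, num_groups=2))
```

A communication-aware variant would additionally weight each assignment by link cost between clients; that trade-off is not modeled here.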

Robust Learning by Self-Transition for Handling Noisy Labels

no code implementations • 8 Dec 2020 • Hwanjun Song, Minseok Kim, Dongmin Park, Yooju Shin, Jae-Gil Lee

In the seeding phase, the network is updated using all the samples to collect a seed of clean samples.
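The seeding phase described above maps naturally to code. Below is a minimal sketch, assuming a PyTorch setup and a loader that yields (inputs, labels, indices); keeping the small-loss fraction as the clean seed is a common heuristic assumed here, not necessarily the paper's exact selection rule.

```python
# Hedged sketch of a "seeding" step: train briefly on all samples, then
# keep the small-loss fraction as a clean seed. The small-loss criterion
# is an assumption, not necessarily the paper's rule.
import torch
import torch.nn as nn

def collect_clean_seed(model, loader, optimizer, seed_ratio=0.5, epochs=1):
    criterion = nn.CrossEntropyLoss(reduction="none")
    # 1) Update the network using all the samples.
    model.train()
    for _ in range(epochs):
        for x, y, _ in loader:  # loader yields (inputs, labels, indices)
            optimizer.zero_grad()
            criterion(model(x), y).mean().backward()
            optimizer.step()
    # 2) Score every sample; low loss is taken as a proxy for a clean label.
    model.eval()
    losses, indices = [], []
    with torch.no_grad():
        for x, y, idx in loader:
            losses.append(criterion(model(x), y))
            indices.append(idx)
    losses, indices = torch.cat(losses), torch.cat(indices)
    k = int(seed_ratio * len(losses))
    return indices[torch.argsort(losses)[:k]]  # indices of the clean seed
```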

TornadoAggregate: Accurate and Scalable Federated Learning via the Ring-Based Architecture

no code implementations • 6 Dec 2020 • Jin-woo Lee, Jaehoon Oh, Sungsu Lim, Se-Young Yun, Jae-Gil Lee

Federated learning has emerged as a new paradigm of collaborative machine learning; however, many prior studies rely on global aggregation along a star topology, with little consideration of communication scalability or of the diurnal property arising from clients' differing local times.

Federated Learning
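To make the star-versus-ring contrast concrete, here is a conceptual sketch of aggregating model parameters along a ring rather than at a central server. It assumes one trip around the ring with dataset-size weights; it is not TornadoAggregate itself.

```python
# Illustrative sketch (not TornadoAggregate): aggregating parameters along
# a ring. Each node adds its weighted contribution to the partial sum it
# received and forwards the result to its successor.
import numpy as np

def ring_aggregate(local_params, weights):
    """local_params: list of parameter vectors, one per node on the ring."""
    weights = np.asarray(weights, dtype=float)
    weights /= weights.sum()
    running = np.zeros_like(local_params[0])
    # One trip around the ring: node i receives the partial aggregate from
    # node i-1, adds its own weighted parameters, and passes it on.
    for i, params in enumerate(local_params):
        running = running + weights[i] * params
    # After the final hop, `running` equals the weighted global average,
    # which can then circulate back so every node obtains it.
    return running

nodes = [np.random.randn(4) for _ in range(5)]
sizes = [100, 50, 80, 120, 60]  # e.g., local dataset sizes
print(ring_aggregate(nodes, sizes))
```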

Learning from Noisy Labels with Deep Neural Networks: A Survey

1 code implementation • 16 Jul 2020 • Hwanjun Song, Minseok Kim, Dongmin Park, Yooju Shin, Jae-Gil Lee

Deep learning has achieved remarkable success in numerous domains with the help of large amounts of data.

Carpe Diem, Seize the Samples Uncertain "At the Moment" for Adaptive Batch Selection

no code implementations • 19 Nov 2019 • Hwanjun Song, Minseok Kim, Sundong Kim, Jae-Gil Lee

Compared with existing batch selection methods, Recency Bias reduced the test error by up to 20.97% within a fixed wall-clock training time.
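As a rough illustration of selecting samples that are uncertain "at the moment", the sketch below keeps a sliding window of each sample's recent predicted labels and draws batches in proportion to the entropy of that history. The class name, window size, and smoothing constant are assumptions, not the paper's specification.

```python
# Hypothetical sketch of adaptive batch selection in the spirit of Recency
# Bias: samples whose recent predictions disagree are treated as uncertain
# "at the moment" and are drawn more often.
import numpy as np
from collections import deque

class RecencyBiasSelector:
    def __init__(self, num_samples, window=10):
        # Keep only the most recent `window` predicted labels per sample.
        self.history = [deque(maxlen=window) for _ in range(num_samples)]

    def record(self, indices, predicted_labels):
        for i, p in zip(indices, predicted_labels):
            self.history[i].append(int(p))

    def uncertainty(self, i):
        h = self.history[i]
        if not h:
            return 1.0  # unseen samples are treated as maximally uncertain
        # Empirical entropy of the recent prediction history.
        _, counts = np.unique(list(h), return_counts=True)
        probs = counts / counts.sum()
        return float(-(probs * np.log(probs + 1e-12)).sum())

    def next_batch(self, batch_size):
        scores = np.array([self.uncertainty(i) for i in range(len(self.history))])
        probs = (scores + 1e-3) / (scores + 1e-3).sum()  # smoothed sampling
        return np.random.choice(len(self.history), size=batch_size,
                                replace=False, p=probs)
```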

How does Early Stopping Help Generalization against Label Noise?

no code implementations • 19 Nov 2019 • Hwanjun Song, Minseok Kim, Dongmin Park, Jae-Gil Lee

In this paper, we claim that such overfitting can be avoided by "early stopping", i.e., halting the training of a deep neural network before the noisy labels are severely memorized.
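A minimal sketch of that claim follows, assuming placeholder `train_one_epoch` and `evaluate` functions and a PyTorch-style `state_dict`: training halts once validation accuracy stops improving for a few epochs, before memorization of noisy labels sets in.

```python
# Minimal early-stopping sketch: stop once validation accuracy plateaus.
# `train_one_epoch` and `evaluate` are hypothetical placeholders.
import copy

def train_with_early_stopping(model, train_one_epoch, evaluate,
                              max_epochs=100, patience=5):
    best_acc, best_state, epochs_since_best = 0.0, None, 0
    for epoch in range(max_epochs):
        train_one_epoch(model)
        acc = evaluate(model)  # accuracy on a trusted validation set
        if acc > best_acc:
            best_acc, epochs_since_best = acc, 0
            best_state = copy.deepcopy(model.state_dict())
        else:
            epochs_since_best += 1
        if epochs_since_best >= patience:
            break  # generalization has peaked; stop before memorization
    if best_state is not None:
        model.load_state_dict(best_state)  # restore the best checkpoint
    return model, best_acc
```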

MLAT: Metric Learning for kNN in Streaming Time Series

no code implementations • 23 Oct 2019 • Dongmin Park, Susik Yoon, Hwanjun Song, Jae-Gil Lee

Learning a good distance measure for distance-based classification in time series leads to significant performance improvement in many tasks.

Metric Learning
Time Series
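As a generic illustration of the metric-learning-plus-kNN recipe (not MLAT itself, and ignoring the streaming aspect), one can learn a linear metric with scikit-learn's Neighborhood Components Analysis and classify with kNN in the learned space. The toy Gaussian features below stand in for fixed-length time-series windows.

```python
# Sketch of the general idea (not MLAT): learn a linear metric with NCA,
# then run kNN classification in the learned space.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier, NeighborhoodComponentsAnalysis
from sklearn.pipeline import Pipeline

# Toy stand-in for time-series features (e.g., fixed-length windows).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 8)), rng.normal(2, 1, (50, 8))])
y = np.array([0] * 50 + [1] * 50)

model = Pipeline([
    ("nca", NeighborhoodComponentsAnalysis(random_state=0)),  # learned metric
    ("knn", KNeighborsClassifier(n_neighbors=5)),             # kNN in that space
])
model.fit(X, y)
print(model.score(X, y))  # training accuracy under the learned metric
```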

SELFIE: Refurbishing Unclean Samples for Robust Deep Learning

1 code implementation • 15 Jun 2019 • Hwanjun Song, Minseok Kim, Jae-Gil Lee

Owing to their extremely high expressive power, deep neural networks tend to completely memorize the training data, even when the labels are extremely noisy.

Learning with noisy labels
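SELFIE's refurbishing idea can be sketched at a high level: if a sample's recent predictions are highly consistent, trust them over the given (possibly noisy) label. The window size and agreement threshold below are illustrative assumptions, and this is a simplification of the full method.

```python
# Hedged sketch of label refurbishment in the spirit of SELFIE (simplified,
# not the exact algorithm): consistently predicted samples get their
# possibly-noisy label replaced by the consistent prediction.
from collections import Counter, deque

class Refurbisher:
    def __init__(self, num_samples, window=10, agreement=0.9):
        self.history = [deque(maxlen=window) for _ in range(num_samples)]
        self.agreement = agreement

    def record(self, idx, predicted_label):
        self.history[idx].append(predicted_label)

    def refurbish(self, idx, noisy_label):
        h = self.history[idx]
        if len(h) < h.maxlen:
            return noisy_label  # not enough evidence yet
        label, count = Counter(h).most_common(1)[0]
        if count / len(h) >= self.agreement:
            return label        # predictions are consistent: trust them
        return noisy_label      # otherwise keep the given label
```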

Augmenting Recurrent Neural Networks with High-Order User-Contextual Preference for Session-Based Recommendation

no code implementations • 8 May 2018 • Younghun Song, Jae-Gil Lee

The recent adoption of recurrent neural networks (RNNs) for session modeling has yielded substantial performance gains compared to previous approaches.

Session-Based Recommendations
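A simplified sketch of augmenting a session RNN with user context follows, assuming PyTorch. The paper's high-order preference modeling is reduced here to a plain concatenation of a GRU session state and a context embedding, so treat the fusion choice and all names as assumptions.

```python
# Simplified sketch (not the paper's exact model): a GRU session encoder
# whose final state is fused with a user-context embedding before scoring
# the next item.
import torch
import torch.nn as nn

class ContextualSessionRec(nn.Module):
    def __init__(self, num_items, num_contexts, emb_dim=64, hidden=128):
        super().__init__()
        self.item_emb = nn.Embedding(num_items, emb_dim)
        self.ctx_emb = nn.Embedding(num_contexts, emb_dim)
        self.gru = nn.GRU(emb_dim, hidden, batch_first=True)
        self.out = nn.Linear(hidden + emb_dim, num_items)

    def forward(self, item_seq, context_id):
        # item_seq: (batch, seq_len) item ids; context_id: (batch,) ids.
        _, h = self.gru(self.item_emb(item_seq))      # h: (1, batch, hidden)
        fused = torch.cat([h[-1], self.ctx_emb(context_id)], dim=-1)
        return self.out(fused)                        # next-item scores

model = ContextualSessionRec(num_items=1000, num_contexts=8)
scores = model(torch.randint(0, 1000, (4, 12)), torch.randint(0, 8, (4,)))
print(scores.shape)  # torch.Size([4, 1000])
```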
