Search Results for author: Jae-Gil Lee

Found 26 papers, 12 papers with code

Universal Time-Series Representation Learning: A Survey

1 code implementation • 8 Jan 2024 • Patara Trirat, Yooju Shin, Junhyeok Kang, Youngeun Nam, Jihye Na, Minyoung Bae, Joeun Kim, Byunghyun Kim, Jae-Gil Lee

Time-series data exists in every corner of real-world systems and services, ranging from satellites in the sky to wearable devices on human bodies.

Feature Engineering • Representation Learning • +1

Adaptive Shortcut Debiasing for Online Continual Learning

no code implementations • 14 Dec 2023 • Doyoung Kim, Dongmin Park, Yooju Shin, Jihwan Bang, Hwanjun Song, Jae-Gil Lee

We propose a novel framework, DropTop, that suppresses the shortcut bias in online continual learning (OCL) while adapting to the varying degree of shortcut bias incurred by continuously changing environments.

Continual Learning

Active Prompt Learning in Vision Language Models

no code implementations • 18 Nov 2023 • Jihwan Bang, Sumyeong Ahn, Jae-Gil Lee

In response to this inquiry, we observe that (1) simply applying a conventional active learning framework to pre-trained VLMs may even degrade performance compared to random selection because of the class imbalance in labeling candidates, and (2) the knowledge of VLMs can provide hints for achieving the balance before labeling.

Active Learning
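The balancing observation in this snippet can be made concrete. Below is a minimal Python sketch of class-balanced active selection guided by a VLM's zero-shot pseudo-labels; it is not the authors' exact algorithm, and the pool size, class count, budget, and uncertainty scores are illustrative assumptions standing in for real VLM (e.g., CLIP) outputs:

```python
# Hedged sketch: balance the labeling pool by pseudo-class before applying
# uncertainty-based selection. Pseudo-labels and uncertainty scores are
# synthetic stand-ins for real VLM outputs; all sizes are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_pool, n_classes, budget = 1000, 10, 50

pseudo_labels = rng.integers(0, n_classes, size=n_pool)  # VLM zero-shot guesses
uncertainty = rng.random(n_pool)                         # e.g., 1 - max softmax prob

# Select an (approximately) equal number of samples per pseudo-class,
# taking the most uncertain candidates within each class.
per_class = budget // n_classes
selected = []
for c in range(n_classes):
    idx = np.where(pseudo_labels == c)[0]
    idx = idx[np.argsort(-uncertainty[idx])]             # most uncertain first
    selected.extend(idx[:per_class].tolist())

print(f"selected {len(selected)} samples, per-class counts: "
      f"{np.bincount(pseudo_labels[selected], minlength=n_classes)}")
```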

Context-Aware Deep Time-Series Decomposition for Anomaly Detection in Businesses

1 code implementation • European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases 2023 • Youngeun Nam, Patara Trirat, Taeyoon Kim, Youngseop Lee, Jae-Gil Lee

Detecting anomalies in time series has become increasingly challenging as data collection technology develops, especially in real-world communication services, which require contextual information for precise prediction.

Anomaly Detection • Time Series

AnoViz: A Visual Inspection Tool of Anomalies in Multivariate Time Series

1 code implementation • Proceedings of the AAAI Conference on Artificial Intelligence 2023 • Patara Trirat, Youngeun Nam, Taeyoon Kim, Jae-Gil Lee

Here, we show that AnoViz streamlines the process of finding a potential cause of an anomaly with a deeper analysis of anomalous instances, giving explainability to any anomaly detector.

Time Series

MG-TAR: Multi-View Graph Convolutional Networks for Traffic Accident Risk Prediction

1 code implementation • IEEE Transactions on Intelligent Transportation Systems 2023 • Patara Trirat, Susik Yoon, Jae-Gil Lee

Due to the continuing colossal socio-economic losses caused by traffic accidents, it is of prime importance to precisely forecast the traffic accident risk to reduce future accidents.

Graph Learning • TAR

Adaptive Model Pooling for Online Deep Anomaly Detection from a Complex Evolving Data Stream

1 code implementation • 9 Jun 2022 • Susik Yoon, YoungJun Lee, Jae-Gil Lee, Byung Suk Lee

Online anomaly detection from a data stream is critical for the safety and security of many applications but is facing severe challenges due to complex and evolving data streams from IoT devices and cloud-based infrastructures.

Anomaly Detection

Meta-Learning for Online Update of Recommender Systems

1 code implementation • 19 Mar 2022 • Minseok Kim, Hwanjun Song, Yooju Shin, Dongmin Park, Kijung Shin, Jae-Gil Lee

It features an adaptive learning rate for each parameter-interaction pair, inducing a recommender to quickly learn users' up-to-date interests.

Meta-Learning • Recommendation Systems
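The "adaptive learning rate for each parameter-interaction pair" can be illustrated with a toy update rule. A minimal sketch follows, assuming a linear scorer and a fixed per-parameter rate vector standing in for the meta-learned rates; the loss and data are toy assumptions, not the paper's recommender model:

```python
# Hedged sketch: an online SGD step where each parameter gets its own
# learning rate (meta-learned in the paper, a plain vector here).
import numpy as np

rng = np.random.default_rng(1)
dim = 8
theta = rng.normal(size=dim)            # parameters touched by one interaction
eta = np.full(dim, 0.1)                 # per-parameter rates (stand-in values)

def loss_grad(theta, x, y):
    # gradient of squared error (theta.x - y)^2 for a linear scorer
    return 2.0 * (theta @ x - y) * x

# One online update per incoming (user, item) interaction.
x, y = rng.normal(size=dim), 1.0
theta -= eta * loss_grad(theta, x, y)   # element-wise rate, not one scalar
print(theta)
```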

Data Collection and Quality Challenges in Deep Learning: A Data-Centric AI Perspective

no code implementations • 13 Dec 2021 • Steven Euijong Whang, Yuji Roh, Hwanjun Song, Jae-Gil Lee

In this survey, we study the research landscape for data collection and data quality primarily for deep learning applications.

BIG-bench Machine Learning • Fairness • +2

Task-Agnostic Undesirable Feature Deactivation Using Out-of-Distribution Data

1 code implementation • NeurIPS 2021 • Dongmin Park, Hwanjun Song, Minseok Kim, Jae-Gil Lee

A deep neural network (DNN) has achieved great success in many machine learning tasks by virtue of its high expressive power.

Coherence-based Label Propagation over Time Series for Accelerated Active Learning

no code implementations • ICLR 2022 • Yooju Shin, Susik Yoon, Sundong Kim, Hwanjun Song, Jae-Gil Lee, Byung Suk Lee

Time-series data are ubiquitous these days, but the lack of labels in time-series data is regarded as a hurdle to their broad applicability.

Active Learning • Time Series • +1

DF-TAR: A Deep Fusion Network for Citywide Traffic Accident Risk Prediction with Dangerous Driving Behavior

1 code implementation • Proceedings of the Web Conference 2021 • Patara Trirat, Jae-Gil Lee

Because traffic accidents cause huge social and economic losses, it is of prime importance to precisely predict the traffic accident risk for reducing future accidents.

TAR

Accurate and Fast Federated Learning via IID and Communication-Aware Grouping

no code implementations • 9 Dec 2020 • Jin-woo Lee, Jaehoon Oh, Yooju Shin, Jae-Gil Lee, Se-Young Yun

Federated learning has emerged as a new paradigm of collaborative machine learning; however, it has also faced several challenges such as non-independent and identically distributed (non-IID) data and high communication cost.

Federated Learning

Robust Learning by Self-Transition for Handling Noisy Labels

no code implementations • 8 Dec 2020 • Hwanjun Song, Minseok Kim, Dongmin Park, Yooju Shin, Jae-Gil Lee

In the seeding phase, the network is updated using all the samples to collect a seed of clean samples.

MORPH
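The seeding phase described above pairs naturally with the common small-loss heuristic for picking likely-clean samples. A minimal sketch under that assumption follows; the model, noise rate, and 50% seed ratio are illustrative, not the paper's exact setup:

```python
# Hedged sketch of a seeding phase: train briefly on ALL samples, then
# treat the lowest-loss fraction as a seed of (likely) clean samples.
import numpy as np

rng = np.random.default_rng(2)
n, dim = 200, 5
X = rng.normal(size=(n, dim))
w_true = rng.normal(size=dim)
y = np.sign(X @ w_true)
noisy = rng.random(n) < 0.2
y[noisy] *= -1                          # inject 20% label noise

# Seeding: a few epochs of logistic-regression-style updates on all samples.
w = np.zeros(dim)
for _ in range(20):
    p = 1.0 / (1.0 + np.exp(-(X @ w) * y))      # P(correct) under the model
    w += 0.1 * ((1 - p) * y) @ X / n            # gradient step on logistic loss

# Per-sample loss; the low-loss half forms the clean seed.
losses = np.log1p(np.exp(-(X @ w) * y))
seed = np.argsort(losses)[: n // 2]
print(f"noise rate in seed: {noisy[seed].mean():.2f} vs overall {noisy.mean():.2f}")
```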

TornadoAggregate: Accurate and Scalable Federated Learning via the Ring-Based Architecture

no code implementations • 6 Dec 2020 • Jin-woo Lee, Jaehoon Oh, Sungsu Lim, Se-Young Yun, Jae-Gil Lee

Federated learning has emerged as a new paradigm of collaborative machine learning; however, many prior studies have used global aggregation along a star topology without much consideration of the communication scalability or the diurnal property arising from clients' local time variety.

Federated Learning

Learning from Noisy Labels with Deep Neural Networks: A Survey

1 code implementation • 16 Jul 2020 • Hwanjun Song, Minseok Kim, Dongmin Park, Yooju Shin, Jae-Gil Lee

Deep learning has achieved remarkable success in numerous domains with the help of large amounts of big data.

How does Early Stopping Help Generalization against Label Noise?

no code implementations • 19 Nov 2019 • Hwanjun Song, Minseok Kim, Dongmin Park, Jae-Gil Lee

In this paper, we claim that such overfitting can be avoided by "early stopping" training a deep neural network before the noisy labels are severely memorized.
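A minimal sketch of that early-stopping idea: keep the checkpoint with the best accuracy on a trusted validation set instead of training to convergence. The train_one_epoch and validate callables, patience, and epoch budget below are placeholder assumptions, not the paper's networks or datasets:

```python
# Hedged sketch: stop training (and keep the best checkpoint) before the
# network starts memorizing noisy labels.
import copy

def early_stop_training(model, train_one_epoch, validate, max_epochs=100, patience=5):
    best_acc, best_model, since_best = 0.0, copy.deepcopy(model), 0
    for epoch in range(max_epochs):
        train_one_epoch(model)            # updates model in place on noisy data
        acc = validate(model)             # accuracy on a trusted validation set
        if acc > best_acc:
            best_acc, best_model, since_best = acc, copy.deepcopy(model), 0
        else:
            since_best += 1
        if since_best >= patience:        # no recent improvement: stop early
            break
    return best_model, best_acc
```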

Carpe Diem, Seize the Samples Uncertain "At the Moment" for Adaptive Batch Selection

no code implementations • 19 Nov 2019 • Hwanjun Song, Minseok Kim, Sundong Kim, Jae-Gil Lee

Compared with existing batch selection methods, the results showed that Recency Bias reduced the test error by up to 20.97% in a fixed wall-clock training time.
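The "uncertain at the moment" criterion can be sketched as drawing batches in proportion to the entropy of each sample's recent predictions. This is an illustrative reading, not the paper's exact sampling rule; the window size, class count, and batch size are assumptions:

```python
# Hedged sketch: adaptive batch selection preferring samples whose recent
# predictions disagree (high entropy over a sliding window).
import numpy as np
from collections import deque

rng = np.random.default_rng(3)
n, n_classes, window, batch = 100, 5, 10, 16
history = [deque(maxlen=window) for _ in range(n)]  # recent predicted labels

def recency_uncertainty(i):
    h = history[i]
    if not h:
        return 1.0                        # unseen samples: treat as uncertain
    counts = np.bincount(list(h), minlength=n_classes)
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())  # entropy of recent predictions

# Simulate some recent predictions, then draw one batch by uncertainty.
for i in range(n):
    for _ in range(window):
        history[i].append(int(rng.integers(0, n_classes)))

u = np.array([recency_uncertainty(i) for i in range(n)])
batch_idx = rng.choice(n, size=batch, replace=False, p=u / u.sum())
print(batch_idx)
```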

MLAT: Metric Learning for kNN in Streaming Time Series

no code implementations • 23 Oct 2019 • Dongmin Park, Susik Yoon, Hwanjun Song, Jae-Gil Lee

Learning a good distance measure for distance-based classification in time series leads to significant performance improvement in many tasks.

Metric Learning • Time Series • +1
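A minimal sketch of metric learning for kNN: compute neighbor distances under a learned linear transform (Mahalanobis-style) rather than plain Euclidean distance. The transform below is a random stand-in for a learned matrix, and the data, k, and offline setting are toy assumptions rather than the paper's streaming setup:

```python
# Hedged sketch: kNN classification with distances ||L (x_i - x)|| under a
# (stand-in) learned linear transform L.
import numpy as np

rng = np.random.default_rng(4)
n_train, dim, k = 60, 4, 3
X_train = rng.normal(size=(n_train, dim))
y_train = rng.integers(0, 2, size=n_train)
L = rng.normal(size=(dim, dim))          # in practice, learned from data

def knn_predict(x):
    diffs = (X_train - x) @ L.T          # map differences into the metric space
    d = np.linalg.norm(diffs, axis=1)    # distance under the learned metric
    nearest = np.argsort(d)[:k]
    return int(np.bincount(y_train[nearest]).argmax())

print(knn_predict(rng.normal(size=dim)))
```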

Prestopping: How Does Early Stopping Help Generalization Against Label Noise?

no code implementations • 25 Sep 2019 • Hwanjun Song, Minseok Kim, Dongmin Park, Jae-Gil Lee

In this paper, we claim that such overfitting can be avoided by "early stopping" training a deep neural network before the noisy labels are severely memorized.

SELFIE: Refurbishing Unclean Samples for Robust Deep Learning

1 code implementation • 15 Jun 2019 • Hwanjun Song, Minseok Kim, Jae-Gil Lee

Owing to the extremely high expressive power of deep neural networks, a side effect is that they totally memorize training data even when the labels are extremely noisy.

Learning with noisy labels
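The refurbishing step can be sketched as follows: if a sample's recent predictions are consistent enough, replace its possibly noisy label with the dominant prediction. This is an illustrative reading in the spirit of SELFIE, with the history length and consistency threshold as assumed parameters:

```python
# Hedged sketch: refurbish a sample's label when its recent predictions
# agree strongly; otherwise keep the original (possibly noisy) label.
import numpy as np

def refurbish(labels, pred_history, threshold=0.8):
    """labels: (n,) noisy labels; pred_history: (n, q) recent predictions."""
    labels = labels.copy()
    n, q = pred_history.shape
    for i in range(n):
        counts = np.bincount(pred_history[i])
        top = counts.argmax()
        if counts[top] / q >= threshold:   # predictions agree -> refurbish
            labels[i] = top
    return labels

labels = np.array([0, 1, 2, 1])
pred_history = np.array([[2, 2, 2, 2, 2],   # consistent: refurbished to 2
                         [1, 1, 1, 0, 1],   # consistent enough: set to 1
                         [0, 1, 2, 0, 1],   # inconsistent: kept as-is
                         [1, 1, 1, 1, 1]])  # consistent: stays 1
print(refurbish(labels, pred_history))      # -> [2 1 2 1]
```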

Augmenting Recurrent Neural Networks with High-Order User-Contextual Preference for Session-Based Recommendation

no code implementations • 8 May 2018 • Younghun Song, Jae-Gil Lee

The recent adoption of recurrent neural networks (RNNs) for session modeling has yielded substantial performance gains compared to previous approaches.

Session-Based Recommendations
