1 code implementation • 8 Jan 2024 • Patara Trirat, Yooju Shin, Junhyeok Kang, Youngeun Nam, Jihye Na, Minyoung Bae, Joeun Kim, Byunghyun Kim, Jae-Gil Lee
Time-series data exists in every corner of real-world systems and services, ranging from satellites in the sky to wearable devices on human bodies.
no code implementations • 14 Dec 2023 • Doyoung Kim, Dongmin Park, Yooju Shin, Jihwan Bang, Hwanjun Song, Jae-Gil Lee
We propose a novel framework, DropTop, that suppresses shortcut bias in online continual learning (OCL) while adapting to the varying degree of shortcut bias incurred by a continuously changing environment.
1 code implementation • 12 Dec 2023 • Hwanjun Song, Minseok Kim, Jae-Gil Lee
Multi-label classification poses challenges due to imbalanced and noisy labels in training data.
no code implementations • 18 Nov 2023 • Doyoung Kim, Susik Yoon, Dongmin Park, YoungJun Lee, Hwanjun Song, Jihwan Bang, Jae-Gil Lee
We identify the inadequacy of universal and specific prompting in handling these dynamic shifts.
no code implementations • 18 Nov 2023 • Jihwan Bang, Sumyeong Ahn, Jae-Gil Lee
In response to this inquiry, we observe that (1) simply applying a conventional active learning framework to pre-trained VLMs may even degrade performance compared to random selection because of the class imbalance in labeling candidates, and (2) the knowledge of VLMs can provide hints for achieving the balance before labeling.
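Observation (2) above suggests using a VLM's zero-shot predictions to balance classes before labeling. The sketch below is only an illustration of that idea, not the paper's method; the function name, the uncertainty scoring, and the toy data are all assumptions.

```python
# Hedged sketch: class-balanced candidate selection guided by zero-shot
# pseudo-labels from a VLM (illustrative only; not the paper's algorithm).
from collections import defaultdict

def balanced_select(pseudo_labels, scores, budget, num_classes):
    """Pick up to `budget` candidates, spreading picks evenly across the
    classes predicted zero-shot, taking the most uncertain ones first."""
    per_class = budget // num_classes
    by_class = defaultdict(list)
    for idx, (label, score) in enumerate(zip(pseudo_labels, scores)):
        by_class[label].append((score, idx))
    chosen = []
    for label in range(num_classes):
        ranked = sorted(by_class[label])          # lowest confidence first
        chosen.extend(idx for _, idx in ranked[:per_class])
    return chosen

# toy run: 6 unlabeled candidates, 2 predicted classes, budget of 4
labels = [0, 0, 0, 1, 1, 1]
scores = [0.9, 0.2, 0.5, 0.8, 0.1, 0.6]
picked = balanced_select(labels, scores, budget=4, num_classes=2)
```

Even if the zero-shot labels are imperfect, selecting per predicted class keeps the labeled pool from collapsing onto the majority class.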
1 code implementation • European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases 2023 • Youngeun Nam, Patara Trirat, Taeyoon Kim, Youngseop Lee, Jae-Gil Lee
Detecting anomalies in time series has become increasingly challenging as data collection technology develops, especially in real-world communication services, which require contextual information for precise prediction.
1 code implementation • Proceedings of the AAAI Conference on Artificial Intelligence 2023 • Patara Trirat, Youngeun Nam, Taeyoon Kim, Jae-Gil Lee
Here, we show that AnoViz streamlines the process of finding a potential cause of an anomaly with a deeper analysis of anomalous instances, giving explainability to any anomaly detector.
1 code implementation • IEEE Transactions on Intelligent Transportation Systems 2023 • Patara Trirat, Susik Yoon, Jae-Gil Lee
Due to the continuing colossal socio-economic losses caused by traffic accidents, it is of prime importance to precisely forecast the traffic accident risk to reduce future accidents.
1 code implementation • 13 Oct 2022 • Dongmin Park, Yooju Shin, Jihwan Bang, YoungJun Lee, Hwanjun Song, Jae-Gil Lee
Unlabeled data examples awaiting annotation inevitably contain open-set noise.
1 code implementation • 9 Jun 2022 • Susik Yoon, YoungJun Lee, Jae-Gil Lee, Byung Suk Lee
Online anomaly detection from a data stream is critical for the safety and security of many applications but is facing severe challenges due to complex and evolving data streams from IoT devices and cloud-based infrastructures.
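A minimal baseline for the streaming setting described above is a detector that keeps running statistics and flags points far from the current mean. This sketch is a generic illustration of online detection, not the entry's method; the class name and thresholding rule are assumptions.

```python
# Hedged sketch: streaming anomaly detection with exponentially weighted
# running statistics (illustrative only; not the paper's algorithm).
class StreamDetector:
    def __init__(self, alpha=0.1, threshold=3.0):
        self.alpha = alpha          # smoothing factor for the running stats
        self.threshold = threshold  # flag points this many std-devs away
        self.mean = None
        self.var = 1.0

    def update(self, x):
        if self.mean is None:       # first observation initializes the mean
            self.mean = x
            return False
        z = abs(x - self.mean) / (self.var ** 0.5 + 1e-8)
        is_anomaly = z > self.threshold
        if not is_anomaly:          # adapt only to non-anomalous points
            diff = x - self.mean
            self.mean += self.alpha * diff
            self.var = (1 - self.alpha) * (self.var + self.alpha * diff * diff)
        return is_anomaly

det = StreamDetector()
stream = [1.0, 1.1, 0.9, 1.0, 9.0, 1.05]
flags = [det.update(x) for x in stream]
```

Updating the statistics only on non-flagged points keeps a single outlier from inflating the variance and masking later anomalies, at the cost of adapting more slowly to genuine distribution shifts.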
1 code implementation • 19 Mar 2022 • Minseok Kim, Hwanjun Song, Yooju Shin, Dongmin Park, Kijung Shin, Jae-Gil Lee
It features an adaptive learning rate for each parameter-interaction pair, inducing a recommender to quickly learn users' up-to-date interests.
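The per-parameter adaptive-rate idea can be sketched with an Adagrad-style update, where each parameter's effective rate shrinks with its own gradient history; the entry's meta-learned rates per parameter-interaction pair are more involved, so this is only a simplified stand-in.

```python
# Hedged sketch: per-parameter adaptive learning rates in the spirit of
# Adagrad; only an analogy for the paper's per-pair adaptive rates.
def adagrad_step(params, grads, accum, base_lr=0.1, eps=1e-8):
    """One update where each parameter gets its own effective rate:
    parameters with a history of large gradients are updated less."""
    new_params, new_accum = [], []
    for p, g, a in zip(params, grads, accum):
        a = a + g * g                        # accumulate squared gradients
        lr = base_lr / (a ** 0.5 + eps)      # per-parameter effective rate
        new_params.append(p - lr * g)
        new_accum.append(a)
    return new_params, new_accum

params, accum = [1.0, 1.0], [0.0, 0.0]
params, accum = adagrad_step(params, [0.1, 10.0], accum)  # p2 sees a huge gradient
params, accum = adagrad_step(params, [0.1, 0.1], accum)   # equal gradients now
# p1 still moves freely; p2's accumulated history damps its update
```

The point is that the same incoming gradient produces very different step sizes depending on each parameter's history, which is what lets some parameters track fast-changing interests while others stay stable.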
no code implementations • 13 Dec 2021 • Steven Euijong Whang, Yuji Roh, Hwanjun Song, Jae-Gil Lee
In this survey, we study the research landscape for data collection and data quality primarily for deep learning applications.
1 code implementation • NeurIPS 2021 • Dongmin Park, Hwanjun Song, Minseok Kim, Jae-Gil Lee
A deep neural network (DNN) has achieved great success in many machine learning tasks by virtue of its high expressive power.
no code implementations • ICLR 2022 • Yooju Shin, Susik Yoon, Sundong Kim, Hwanjun Song, Jae-Gil Lee, Byung Suk Lee
Time-series data are ubiquitous these days, but the lack of labels in time-series data is regarded as a hurdle to its broad applicability.
1 code implementation • Proceedings of the Web Conference 2021 • Patara Trirat, Jae-Gil Lee
Because traffic accidents cause huge social and economic losses, it is of prime importance to precisely predict the traffic accident risk for reducing future accidents.
no code implementations • 9 Dec 2020 • Jin-woo Lee, Jaehoon Oh, Yooju Shin, Jae-Gil Lee, Se-Young Yoon
Federated learning has emerged as a new paradigm of collaborative machine learning; however, it has also faced several challenges such as non-independent and identically distributed (non-IID) data and high communication costs.
no code implementations • 8 Dec 2020 • Hwanjun Song, Minseok Kim, Dongmin Park, Yooju Shin, Jae-Gil Lee
In the seeding phase, the network is updated using all the samples to collect a seed of clean samples.
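One common way to collect such a clean seed is the small-loss criterion from the noisy-label literature: early in training, correctly labeled samples tend to incur smaller loss than mislabeled ones. The sketch below illustrates that criterion only; the entry's actual seeding procedure may differ.

```python
# Hedged sketch: selecting a "seed" of likely-clean samples via the
# small-loss criterion (illustrative; not necessarily the paper's rule).
def select_clean_seed(losses, keep_ratio=0.5):
    """Keep the fraction of samples with the smallest training loss,
    assuming noisy labels incur larger loss early in training."""
    k = max(1, int(len(losses) * keep_ratio))
    ranked = sorted(range(len(losses)), key=lambda i: losses[i])
    return sorted(ranked[:k])

# toy per-sample losses: indices 2 and 5 look mislabeled (large loss)
losses = [0.1, 0.3, 2.5, 0.2, 0.4, 3.1]
seed = select_clean_seed(losses, keep_ratio=0.5)
```

The `keep_ratio` trades off seed purity against seed size: a smaller ratio yields a cleaner but less representative seed.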
no code implementations • 6 Dec 2020 • Jin-woo Lee, Jaehoon Oh, Sungsu Lim, Se-Young Yun, Jae-Gil Lee
Federated learning has emerged as a new paradigm of collaborative machine learning; however, many prior studies have used global aggregation along a star topology without much consideration of communication scalability or the diurnal property arising from clients' varying local times.
1 code implementation • 16 Jul 2020 • Hwanjun Song, Minseok Kim, Dongmin Park, Yooju Shin, Jae-Gil Lee
Deep learning has achieved remarkable success in numerous domains with the help of large amounts of data.
no code implementations • 19 Nov 2019 • Hwanjun Song, Minseok Kim, Dongmin Park, Jae-Gil Lee
In this paper, we claim that such overfitting can be avoided by "early stopping" training a deep neural network before the noisy labels are severely memorized.
no code implementations • 19 Nov 2019 • Hwanjun Song, Minseok Kim, Sundong Kim, Jae-Gil Lee
Compared with existing batch selection methods, the results showed that Recency Bias reduced the test error by up to 20.97% in a fixed wall-clock training time.
no code implementations • 23 Oct 2019 • Dongmin Park, Susik Yoon, Hwanjun Song, Jae-Gil Lee
Learning a good distance measure for distance-based classification in time series leads to significant performance improvement in many tasks.
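To see where a learned distance plugs into distance-based classification, consider 1-NN with a per-timestep weighted Euclidean distance; the weights are exactly what a distance-learning method would fit. This is purely illustrative and not the entry's learned measure.

```python
# Hedged sketch: 1-NN time-series classification under a parameterized
# distance; `w` is a placeholder for a learned per-timestep weighting.
def weighted_dist(a, b, w):
    """Weighted Euclidean distance between two equal-length series."""
    return sum(wi * (x - y) ** 2 for wi, x, y in zip(w, a, b)) ** 0.5

def one_nn(query, train_x, train_y, w):
    """Classify `query` by the label of its nearest training series."""
    dists = [weighted_dist(query, x, w) for x in train_x]
    return train_y[dists.index(min(dists))]

train_x = [[0.0, 0.1, 0.0], [1.0, 1.1, 1.0]]
train_y = ["flat", "high"]
# uniform weights: every timestep contributes equally
pred = one_nn([0.9, 1.0, 1.1], train_x, train_y, w=[1.0, 1.0, 1.0])
```

With uniform weights this is plain Euclidean 1-NN; a learned `w` would down-weight noisy or uninformative timesteps, which is where the performance improvement the entry mentions would come from.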
no code implementations • 25 Sep 2019 • Hwanjun Song, Minseok Kim, Dongmin Park, Jae-Gil Lee
In this paper, we claim that such overfitting can be avoided by "early stopping" training a deep neural network before the noisy labels are severely memorized.
1 code implementation • 15 Jun 2019 • Hwanjun Song, Minseok Kim, Jae-Gil Lee
Owing to their extremely high expressive power, a side effect of deep neural networks is that they completely memorize training data even when the labels are extremely noisy.
Ranked #14 on Learning with noisy labels on ANIMAL
no code implementations • ICLR 2019 • Hwanjun Song, Sundong Kim, Minseok Kim, Jae-Gil Lee
Neural networks can converge faster with help from a smarter batch selection strategy.
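One common family of smarter batch selection samples training examples in proportion to their loss, so the batch concentrates on "hard" examples. The entry's Recency Bias instead favors recently uncertain samples, which the sketch below does not implement; it only illustrates non-uniform batch construction.

```python
# Hedged sketch: loss-proportional batch sampling (one generic batch
# selection strategy; not the Recency Bias method from the paper).
import random

def sample_batch(losses, batch_size, rng):
    """Sample indices with probability proportional to each example's loss."""
    return rng.choices(range(len(losses)), weights=losses, k=batch_size)

rng = random.Random(0)
losses = [0.01, 0.01, 0.01, 5.0]   # example 3 is much harder than the rest
batch = sample_batch(losses, batch_size=5, rng=rng)
```

Sampling with replacement keeps the procedure O(n) per batch, but a hard example can dominate the batch, so practical schemes typically smooth or rank the weights rather than use raw losses.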
no code implementations • 8 May 2018 • Younghun Song, Jae-Gil Lee
The recent adoption of recurrent neural networks (RNNs) for session modeling has yielded substantial performance gains compared to previous approaches.