Search Results for author: Dongmin Park

Found 12 papers, 4 with code

Mitigating Dialogue Hallucination for Large Multi-modal Models via Adversarial Instruction Tuning

no code implementations · 15 Mar 2024 · Dongmin Park, Zhaofang Qian, Guangxing Han, Ser-Nam Lim

To measure this precisely, we first present an evaluation benchmark that extends popular multi-modal benchmark datasets with prepended hallucinatory dialogues generated by our novel Adversarial Question Generator, which automatically generates image-related yet adversarial dialogues by applying adversarial attacks to LMMs.

Hallucination · Instruction Following · +1

Adaptive Shortcut Debiasing for Online Continual Learning

no code implementations · 14 Dec 2023 · Doyoung Kim, Dongmin Park, Yooju Shin, Jihwan Bang, Hwanjun Song, Jae-Gil Lee

We propose DropTop, a novel framework that suppresses shortcut bias in online continual learning (OCL) while adapting to the varying degree of shortcut bias incurred by a continuously changing environment.

Continual Learning

Meta-Learning for Online Update of Recommender Systems

1 code implementation · 19 Mar 2022 · Minseok Kim, Hwanjun Song, Yooju Shin, Dongmin Park, Kijung Shin, Jae-Gil Lee

It features an adaptive learning rate for each parameter-interaction pair, inducing the recommender to quickly learn users' up-to-date interests.

Meta-Learning · Recommendation Systems
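The per-parameter adaptive learning rate mentioned in the snippet above can be sketched as follows. This is a hedged illustration only: the update rule (scaling each rate by gradient-sign agreement) and all function and variable names here are assumptions for exposition, not the paper's actual meta-learned update.

```python
import numpy as np

def adaptive_sgd_step(params, grads, lrs, meta_lr=0.01):
    """One SGD step with a separate learning rate per parameter.

    Illustrative sketch (not the paper's method): each parameter's
    rate is nudged up where consecutive gradients agree in sign and
    down where they oscillate, so stable directions learn faster.
    """
    new_params, new_lrs = {}, {}
    for name, p in params.items():
        g = grads[name]
        # +1 where the gradient keeps its direction, -1 where it flips.
        agreement = np.sign(g) * np.sign(lrs[name]["prev_grad"])
        lr = lrs[name]["lr"] * (1.0 + meta_lr * agreement)
        new_params[name] = p - lr * g
        new_lrs[name] = {"lr": lr, "prev_grad": g}
    return new_params, new_lrs
```

A meta-learned variant would replace the hand-coded sign-agreement rule with a small network that maps each parameter-interaction pair to its rate, which is the spirit of the paper's approach.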

Task-Agnostic Undesirable Feature Deactivation Using Out-of-Distribution Data

1 code implementation · NeurIPS 2021 · Dongmin Park, Hwanjun Song, Minseok Kim, Jae-Gil Lee

A deep neural network (DNN) has achieved great success in many machine learning tasks by virtue of its high expressive power.

Robust Learning by Self-Transition for Handling Noisy Labels

no code implementations · 8 Dec 2020 · Hwanjun Song, Minseok Kim, Dongmin Park, Yooju Shin, Jae-Gil Lee

In the seeding phase, the network is updated using all the samples to collect a seed of clean samples.

MORPH
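The seeding phase described above, collecting a seed of clean samples after training on all samples, can be sketched with the common small-loss criterion. This is an assumption-laden illustration: the selection-by-smallest-loss rule and the `seed_ratio` parameter are stand-ins, not necessarily the paper's exact seeding rule.

```python
import numpy as np

def collect_clean_seed(losses, seed_ratio=0.5):
    """Select a seed of presumably clean samples by the small-loss trick.

    Illustrative sketch: after warm-up training on all samples, the
    samples with the smallest loss are taken as the clean seed, since
    DNNs tend to fit clean labels before memorizing noisy ones.
    """
    losses = np.asarray(losses)
    k = max(1, int(len(losses) * seed_ratio))
    return np.argsort(losses)[:k]  # indices of the k smallest-loss samples
```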

Learning from Noisy Labels with Deep Neural Networks: A Survey

1 code implementation · 16 Jul 2020 · Hwanjun Song, Minseok Kim, Dongmin Park, Yooju Shin, Jae-Gil Lee

Deep learning has achieved remarkable success in numerous domains with the help of large amounts of data.

How does Early Stopping Help Generalization against Label Noise?

no code implementations · 19 Nov 2019 · Hwanjun Song, Minseok Kim, Dongmin Park, Jae-Gil Lee

In this paper, we claim that such overfitting can be avoided by "early stopping", i.e., halting the training of a deep neural network before the noisy labels are severely memorized.
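The early-stopping claim above can be sketched as a standard patience-based loop that halts once validation accuracy stops improving. This is a generic sketch under stated assumptions: `train_one_epoch` and `evaluate` are placeholder callables, and the patience rule is the common heuristic, not the paper's specific stopping criterion.

```python
def early_stopping(train_one_epoch, evaluate, max_epochs=100, patience=5):
    """Stop training when validation accuracy stops improving.

    Illustrative sketch: under label noise, DNNs tend to fit clean
    patterns first and memorize noisy labels later, so halting near
    the validation peak avoids much of the memorization.
    """
    best_acc, best_epoch, wait = -1.0, 0, 0
    for epoch in range(max_epochs):
        train_one_epoch()
        acc = evaluate()
        if acc > best_acc:
            best_acc, best_epoch, wait = acc, epoch, 0
        else:
            wait += 1
            if wait >= patience:
                break  # validation peak likely passed; memorization begins
    return best_epoch, best_acc
```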

MLAT: Metric Learning for kNN in Streaming Time Series

no code implementations · 23 Oct 2019 · Dongmin Park, Susik Yoon, Hwanjun Song, Jae-Gil Lee

Learning a good distance measure for distance-based classification in time series leads to significant performance improvement in many tasks.

Metric Learning · Time Series · +1

Prestopping: How Does Early Stopping Help Generalization Against Label Noise?

no code implementations · 25 Sep 2019 · Hwanjun Song, Minseok Kim, Dongmin Park, Jae-Gil Lee

In this paper, we claim that such overfitting can be avoided by "early stopping", i.e., halting the training of a deep neural network before the noisy labels are severely memorized.
