Search Results for author: Yeonsik Jo

Found 6 papers, 2 papers with code

Universal Noise Annotation: Unveiling the Impact of Noisy Annotation on Object Detection

1 code implementation • 21 Dec 2023 • Kwangrok Ryoo, Yeonsik Jo, Seungjun Lee, Mira Kim, Ahra Jo, Seung Hwan Kim, Seungryong Kim, Soonyoung Lee

For the object detection task with noisy labels, it is important to consider not only categorization noise, as in image classification, but also localization noise, missing annotations, and bogus bounding boxes.

Image Classification • Object +2
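
A minimal sketch of the four annotation-noise types named in the snippet above (categorization noise, localization noise, missing annotations, bogus boxes), applied to COCO-style box annotations. The function name, noise rates, and jitter ranges are illustrative assumptions, not the paper's actual Universal Noise Annotation protocol.

```python
import random

def corrupt_annotations(annotations, num_classes, rate=0.1, img_w=640, img_h=480):
    """annotations: list of dicts like {"bbox": [x, y, w, h], "category_id": int}."""
    noisy = []
    for ann in annotations:
        # Missing annotation: drop the box entirely.
        if random.random() < rate:
            continue
        ann = dict(ann)
        # Categorization noise: flip the class label, as in noisy image classification.
        if random.random() < rate:
            ann["category_id"] = random.randrange(num_classes)
        # Localization noise: jitter the box position and size.
        if random.random() < rate:
            x, y, w, h = ann["bbox"]
            ann["bbox"] = [x + random.uniform(-0.1, 0.1) * w,
                           y + random.uniform(-0.1, 0.1) * h,
                           w * random.uniform(0.8, 1.2),
                           h * random.uniform(0.8, 1.2)]
        noisy.append(ann)
    # Bogus bounding box: add a spurious box with a random label.
    if random.random() < rate:
        bw, bh = random.uniform(10, img_w / 2), random.uniform(10, img_h / 2)
        noisy.append({"bbox": [random.uniform(0, img_w - bw),
                               random.uniform(0, img_h - bh), bw, bh],
                      "category_id": random.randrange(num_classes)})
    return noisy
```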

Misalign, Contrast then Distill: Rethinking Misalignments in Language-Image Pretraining

no code implementations • 19 Dec 2023 • Bumsoo Kim, Yeonsik Jo, Jinhyung Kim, Seung Hwan Kim

Contrastive Language-Image Pretraining has emerged as a prominent approach for training vision and text encoders with uncurated image-text pairs from the web.

Image Augmentation • Metric Learning +1
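
For context on the setup the snippet refers to, here is the standard CLIP-style symmetric contrastive objective over a batch of paired image/text embeddings. This is the generic contrastive language-image pretraining loss, not the misalignment-aware objective proposed in the paper itself; the temperature value is an assumed default.

```python
import torch
import torch.nn.functional as F

def clip_contrastive_loss(image_emb, text_emb, temperature=0.07):
    image_emb = F.normalize(image_emb, dim=-1)       # (B, D)
    text_emb = F.normalize(text_emb, dim=-1)         # (B, D)
    logits = image_emb @ text_emb.t() / temperature  # (B, B) similarity matrix
    targets = torch.arange(image_emb.size(0), device=image_emb.device)
    # Matched pairs sit on the diagonal; symmetrize over both retrieval directions.
    loss_i2t = F.cross_entropy(logits, targets)
    loss_t2i = F.cross_entropy(logits.t(), targets)
    return (loss_i2t + loss_t2i) / 2
```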

Expediting Contrastive Language-Image Pretraining via Self-distilled Encoders

no code implementations • 19 Dec 2023 • Bumsoo Kim, Jinhyung Kim, Yeonsik Jo, Seung Hwan Kim

Based on the unified text embedding space, ECLIPSE compensates for the additional computational cost of the momentum image encoder by expediting the online image encoder.

Knowledge Distillation
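
A generic momentum-encoder self-distillation sketch to illustrate the online/momentum image-encoder pairing mentioned in the snippet above: the momentum encoder is an EMA copy of the online one and provides distillation targets. How ECLIPSE actually expedites the online encoder and builds the unified text embedding space is not reproduced here; the class name and cosine-distance loss are assumptions.

```python
import copy
import torch
import torch.nn.functional as F

class SelfDistilledImageEncoder(torch.nn.Module):
    def __init__(self, online_encoder, ema_decay=0.999):
        super().__init__()
        self.online = online_encoder
        self.momentum = copy.deepcopy(online_encoder)  # frozen EMA teacher
        for p in self.momentum.parameters():
            p.requires_grad_(False)
        self.ema_decay = ema_decay

    @torch.no_grad()
    def update_momentum(self):
        # EMA update: teacher weights slowly track the online encoder.
        for p_o, p_m in zip(self.online.parameters(), self.momentum.parameters()):
            p_m.mul_(self.ema_decay).add_(p_o, alpha=1 - self.ema_decay)

    def distillation_loss(self, images):
        with torch.no_grad():
            target = F.normalize(self.momentum(images), dim=-1)
        online = F.normalize(self.online(images), dim=-1)
        # Pull the online embedding toward the momentum target (cosine distance).
        return (1 - (online * target).sum(dim=-1)).mean()
```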

Misalign, Contrast then Distill: Rethinking Misalignments in Language-Image Pre-training

no code implementations • ICCV 2023 • Bumsoo Kim, Yeonsik Jo, Jinhyung Kim, Seunghwan Kim

Contrastive Language-Image Pretraining has emerged as a prominent approach for training vision and text encoders with uncurated image-text pairs from the web.

Image Augmentation • Metric Learning +1

Incremental Learning with Maximum Entropy Regularization: Rethinking Forgetting and Intransigence

no code implementations • 3 Feb 2019 • Dahyun Kim, Jihwan Bae, Yeonsik Jo, Jonghyun Choi

Incremental learning suffers from two challenging problems: forgetting of old knowledge and intransigence toward learning new knowledge.

Incremental Learning • Transfer Learning
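
An illustrative sketch of adding a maximum-entropy regularizer to a standard classification loss, in the spirit of the paper's title. The exact regularizer and how it trades off forgetting against intransigence in the paper may differ; the weight `lam` is an assumed hyperparameter.

```python
import torch
import torch.nn.functional as F

def entropy_regularized_loss(logits, labels, lam=0.1):
    ce = F.cross_entropy(logits, labels)
    probs = F.softmax(logits, dim=-1)
    log_probs = F.log_softmax(logits, dim=-1)
    entropy = -(probs * log_probs).sum(dim=-1).mean()
    # Encouraging higher predictive entropy discourages over-confident
    # predictions, so the entropy term enters with a negative sign.
    return ce - lam * entropy
```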
