1 code implementation • 19 Apr 2019 • Jangho Kim, Minsung Hyun, Inseop Chung, Nojun Kwak
We propose a learning framework named Feature Fusion Learning (FFL) that efficiently trains a powerful classifier through a fusion module which combines the feature maps generated from parallel neural networks.
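The paper does not give implementation details in this snippet, but the core idea of combining parallel networks' feature maps can be sketched minimally. Below is an illustrative NumPy stand-in (not the authors' FFL module): two feature maps are concatenated along the channel axis and projected back with a learned 1x1-convolution-style matrix; the fusion weights `w_fuse` are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def fuse(feat_a, feat_b, weight):
    """Fuse two parallel feature maps by channel concatenation followed
    by a learned 1x1 convolution (here a plain matrix multiply over the
    channel axis)."""
    stacked = np.concatenate([feat_a, feat_b], axis=0)  # (2C, H, W)
    c2, h, w = stacked.shape
    flat = stacked.reshape(c2, h * w)                   # (2C, H*W)
    fused = weight @ flat                               # (C, H*W)
    return fused.reshape(-1, h, w)

C, H, W = 4, 8, 8
fa = rng.standard_normal((C, H, W))                    # feature map from net A
fb = rng.standard_normal((C, H, W))                    # feature map from net B
w_fuse = rng.standard_normal((C, 2 * C)) / np.sqrt(2 * C)
fused = fuse(fa, fb, w_fuse)
print(fused.shape)  # (4, 8, 8)
```

The fused map has the same shape as each input, so the fused classifier can reuse the same head architecture as the parallel branches.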
3 code implementations • ICCV 2021 • Jaeyoung Yoo, Hojun Lee, Inseop Chung, Geonseok Seo, Nojun Kwak
Instead of assigning each ground truth to specific locations of the network's output, we train a network by estimating the probability density of bounding boxes in an input image using a mixture model.
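Training by density estimation means minimizing the negative log-likelihood of ground-truth boxes under a predicted mixture. A minimal NumPy sketch of such an objective, assuming a diagonal Gaussian mixture over the four box coordinates (the paper's exact parameterization may differ):

```python
import numpy as np

def mixture_nll(boxes, pis, mus, sigmas):
    """Negative log-likelihood of ground-truth boxes under a Gaussian
    mixture predicted by the network.
    boxes: (N, 4) box coordinates; pis: (K,) mixture weights;
    mus: (K, 4) component means; sigmas: (K, 4) component stds."""
    diff = boxes[:, None, :] - mus[None, :, :]              # (N, K, 4)
    log_norm = -0.5 * (diff / sigmas) ** 2 - np.log(sigmas) \
               - 0.5 * np.log(2 * np.pi)                    # per-dim log density
    comp_ll = log_norm.sum(axis=2) + np.log(pis)            # (N, K)
    m = comp_ll.max(axis=1, keepdims=True)                  # log-sum-exp
    ll = m[:, 0] + np.log(np.exp(comp_ll - m).sum(axis=1))
    return -ll.mean()

# hypothetical example: one ground-truth box, two mixture components
boxes = np.array([[0.2, 0.2, 0.6, 0.6]])
pis = np.array([0.5, 0.5])
mus = np.array([[0.25, 0.25, 0.5, 0.5], [0.7, 0.7, 0.9, 0.9]])
sigmas = np.full((2, 4), 0.2)
nll = mixture_nll(boxes, pis, mus, sigmas)
```

Because every ground-truth box contributes to the likelihood of the whole mixture, no hand-crafted assignment of boxes to output locations is needed.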
no code implementations • ICML 2020 • Inseop Chung, SeongUk Park, Jangho Kim, Nojun Kwak
By training each network to fool its corresponding discriminator, the network can learn the other network's feature-map distribution.
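The "fooling" objective can be illustrated with a toy example. This is a hedged sketch, not the paper's setup: the discriminator is reduced to a fixed hypothetical linear classifier `w`, and we show that a gradient step on the feature reduces the fooling loss `-log D(f)`.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def fooling_loss(feat, w):
    """-log D(f): low when the discriminator D (here a hypothetical
    linear classifier with weights w) believes feat came from the
    other network's feature distribution."""
    return -np.log(sigmoid(feat @ w) + 1e-8)

rng = np.random.default_rng(1)
w = rng.standard_normal(16)       # frozen toy discriminator weights
f = rng.standard_normal(16)       # feature vector from one network

# one gradient step on the feature w.r.t. the fooling loss
grad = -(1.0 - sigmoid(f @ w)) * w   # d/df of -log(sigmoid(f @ w))
f_new = f - 0.5 * grad
print(fooling_loss(f_new, w) < fooling_loss(f, w))  # True
```

In the actual method the gradient would flow into the network's weights rather than the feature itself, and the discriminator is trained jointly.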
no code implementations • 25 Feb 2021 • Inseop Chung, Daesik Kim, Nojun Kwak
We propose a novel method that tackles the problem of unsupervised domain adaptation for semantic segmentation by maximizing the cosine similarity between the source and the target domain at the feature level.
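Maximizing cosine similarity at the feature level amounts to minimizing `1 - cos(f_src, f_tgt)`. A minimal sketch of that loss on flattened feature vectors (the paper applies it per-class/per-feature in a way this toy version does not capture):

```python
import numpy as np

def cosine_alignment_loss(src_feat, tgt_feat, eps=1e-8):
    """1 - cosine similarity between source- and target-domain feature
    vectors; minimizing this loss maximizes their cosine similarity."""
    num = (src_feat * tgt_feat).sum()
    den = np.linalg.norm(src_feat) * np.linalg.norm(tgt_feat) + eps
    return 1.0 - num / den

# identical features -> loss near 0; orthogonal features -> loss of 1
print(cosine_alignment_loss(np.ones(4), np.ones(4)))
print(cosine_alignment_loss(np.array([1.0, 0.0]), np.array([0.0, 1.0])))
```

Because cosine similarity ignores feature magnitude, only the direction of source and target features is pulled together.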
no code implementations • 21 Oct 2021 • Inseop Chung, Jayeon Yoo, Nojun Kwak
It creates a set of pseudo labels for the target domain to give explicit supervision.
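A common way to create such pseudo labels (a generic sketch, not necessarily this paper's exact procedure) is to keep only high-confidence predictions and ignore the rest:

```python
import numpy as np

def pseudo_labels(logits, threshold=0.9):
    """Keep only predictions whose softmax confidence exceeds the
    threshold; low-confidence samples get label -1 (ignored in the
    supervised loss on the target domain)."""
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs = e / e.sum(axis=1, keepdims=True)
    conf = probs.max(axis=1)
    labels = probs.argmax(axis=1)
    labels[conf < threshold] = -1
    return labels

# first sample is confident (kept as class 0), second is not (ignored)
out = pseudo_labels(np.array([[5.0, 0.0, 0.0], [0.1, 0.0, 0.0]]))
print(out.tolist())  # [0, -1]
```

The confidence threshold trades label coverage against label noise on the target domain.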
no code implementations • 18 May 2022 • Jaeyoung Yoo, Hojun Lee, Seunghyeon Seo, Inseop Chung, Nojun Kwak
Recent end-to-end multi-object detectors simplify the inference pipeline by removing hand-crafted processes such as non-maximum suppression (NMS).
no code implementations • 28 Jun 2022 • Byeonggeun Kim, Seunghan Yang, Inseop Chung, Simyung Chang
We also verify our method on a standard benchmark, miniImageNet, and D-ProtoNets shows the state-of-the-art open-set detection rate in FSOSR.
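The prototype-based open-set decision underlying ProtoNets-style FSOSR can be sketched as follows. This is an illustrative baseline, not D-ProtoNets itself: class prototypes are support-set means, and a query farther than a hypothetical distance threshold from every prototype is flagged as open-set.

```python
import numpy as np

def prototypes(support, labels, n_cls):
    """Class prototypes as the mean of each class's support embeddings."""
    return np.stack([support[labels == c].mean(axis=0) for c in range(n_cls)])

def classify_open_set(query, protos, dist_threshold):
    """Nearest-prototype classification; queries farther than the
    threshold from every prototype are flagged as open-set (-1)."""
    d = np.linalg.norm(query[:, None, :] - protos[None, :, :], axis=2)
    pred = d.argmin(axis=1)
    pred[d.min(axis=1) > dist_threshold] = -1
    return pred

support = np.array([[0.0, 0.0], [0.2, 0.0], [10.0, 10.0], [10.0, 10.2]])
labels = np.array([0, 0, 1, 1])
protos = prototypes(support, labels, n_cls=2)
pred = classify_open_set(np.array([[0.0, 0.1], [50.0, 50.0]]), protos, 5.0)
print(pred.tolist())  # [0, -1]
```

The threshold on prototype distance is what turns a closed-set few-shot classifier into an open-set detector.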
no code implementations • 28 Jun 2022 • Seunghan Yang, Byeonggeun Kim, Inseop Chung, Simyung Chang
We design two personalized KWS tasks: (1) Target user Biased KWS (TB-KWS) and (2) Target user Only KWS (TO-KWS).
no code implementations • 20 Jul 2022 • Jayeon Yoo, Inseop Chung, Nojun Kwak
Most existing domain adaptive object detection methods exploit adversarial feature alignment to adapt the model to a new domain.
no code implementations • 8 Dec 2023 • Inseop Chung, KiYoon Yoo, Nojun Kwak
To handle this task, the model has to learn a generalizable representation that can be applied to unseen domains while also identifying unknown classes that were not present during training.
no code implementations • 12 Dec 2023 • Jayeon Yoo, Dongkwan Lee, Inseop Chung, Donghyun Kim, Nojun Kwak
It is a well-known fact that the performance of deep learning models deteriorates when they encounter a distribution shift at test time.
no code implementations • 2 Mar 2024 • Inseop Chung, Kyomin Hwang, Jayeon Yoo, Nojun Kwak
Continual Test-Time Adaptation (CTA) is a challenging task that aims to adapt a source pre-trained model to continually changing target domains.
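A widely used CTA objective (e.g. in TENT-style methods; this snippet is a generic sketch, not necessarily the objective of this paper) is to minimize the entropy of the model's own test-time predictions while updating only a small set of parameters such as batch-norm affines:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def prediction_entropy(logits):
    """Mean entropy of test-time predictions; lower entropy means the
    adapted model is more confident on the shifted target data."""
    p = softmax(logits)
    return -(p * np.log(p + 1e-8)).sum(axis=1).mean()

# sharper (more confident) logits yield lower entropy
print(prediction_entropy(np.array([[10.0, 0.0]])))
print(prediction_entropy(np.array([[1.0, 0.0]])))
```

Restricting the update to a few parameters is what keeps such adaptation stable as the target domain keeps changing.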