Search Results for author: Inseop Chung

Found 4 papers, 2 papers with code

Maximizing Cosine Similarity Between Spatial Features for Unsupervised Domain Adaptation in Semantic Segmentation

no code implementations • 25 Feb 2021 • Inseop Chung, Daesik Kim, Nojun Kwak

We propose a novel method that tackles the problem of unsupervised domain adaptation for semantic segmentation by maximizing the cosine similarity between the source and the target domain at the feature level.

Semantic Segmentation • Unsupervised Domain Adaptation
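
A minimal sketch of the feature-level alignment idea from the paper above, not the authors' implementation. Pooling each feature map to a single descriptor is a simplification assumed here for brevity; the paper works with spatial features, so its actual objective differs.

```python
# Sketch only: align source/target features by maximizing cosine similarity.
import torch
import torch.nn.functional as F

def cosine_alignment_loss(feat_src, feat_tgt):
    """feat_src, feat_tgt: (B, C, H, W) feature maps from a shared encoder.

    Pools each map to a (B, C) descriptor and penalizes low cosine
    similarity between the two domains (hypothetical simplification).
    Assumes matching batch sizes for the source and target mini-batches.
    """
    v_src = F.adaptive_avg_pool2d(feat_src, 1).flatten(1)  # (B, C)
    v_tgt = F.adaptive_avg_pool2d(feat_tgt, 1).flatten(1)  # (B, C)
    cos = F.cosine_similarity(v_src, v_tgt, dim=1)         # (B,)
    return (1.0 - cos).mean()  # minimizing this maximizes similarity
```

In training, such a term would typically be added to the supervised segmentation loss on the source domain with a weighting factor.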

Feature-map-level Online Adversarial Knowledge Distillation

no code implementations • ICML 2020 • Inseop Chung, SeongUk Park, Jangho Kim, Nojun Kwak

By training each network to fool its corresponding discriminator, it can learn the other network's feature map distribution.

Knowledge Distillation
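
A rough sketch of one direction of the adversarial scheme described above, not the authors' release: a small discriminator learns to tell the two peers' feature maps apart, and one peer is trained to fool it. The `FeatDiscriminator` module and loss names are assumptions for illustration.

```python
# Sketch only: feature-map-level adversarial distillation, one direction.
import torch
import torch.nn as nn

class FeatDiscriminator(nn.Module):
    """Tiny conv discriminator over (B, C, H, W) feature maps."""
    def __init__(self, channels):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, 64, 3, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 1, 3, stride=2, padding=1),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
    def forward(self, x):
        return self.net(x)  # logit: "does this map come from network 2?"

bce = nn.BCEWithLogitsLoss()

def discriminator_loss(disc, feat_net1, feat_net2):
    # D learns to separate network 1's maps (label 0) from network 2's (label 1).
    logit_2 = disc(feat_net2.detach())
    logit_1 = disc(feat_net1.detach())
    return bce(logit_2, torch.ones_like(logit_2)) + bce(logit_1, torch.zeros_like(logit_1))

def generator_loss(disc, feat_net1):
    # Network 1 is updated to fool D, pushing its feature-map distribution
    # toward network 2's; the symmetric term for network 2 is analogous.
    logit_1 = disc(feat_net1)
    return bce(logit_1, torch.ones_like(logit_1))
```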

Training Multi-Object Detector by Estimating Bounding Box Distribution for Input Image

3 code implementations • ICCV 2021 • Jaeyoung Yoo, Hojun Lee, Inseop Chung, Geonseok Seo, Nojun Kwak

Instead of assigning each ground truth to specific locations of a network's output, we train a network by estimating the probability density of bounding boxes in an input image using a mixture model.

Density Estimation • Real-Time Object Detection
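
A simplified sketch of the mixture-density training signal described in the entry above, assuming a Gaussian mixture over 4-d box coordinates for a single image; the paper's actual parameterization and component distributions may differ.

```python
# Sketch only: negative log-likelihood of ground-truth boxes under a
# predicted Gaussian mixture over box coordinates.
import torch
import torch.nn.functional as F

def box_mixture_nll(pi_logits, mu, log_sigma, gt_boxes):
    """pi_logits: (K,)   mixture weight logits
       mu:        (K, 4) component means, e.g. (cx, cy, w, h)
       log_sigma: (K, 4) component log standard deviations
       gt_boxes:  (N, 4) ground-truth boxes for one image
    """
    log_pi = F.log_softmax(pi_logits, dim=0)                  # (K,)
    comp = torch.distributions.Normal(mu, log_sigma.exp())    # (K, 4)
    # log prob of each gt box under each component: (N, K)
    log_prob = comp.log_prob(gt_boxes.unsqueeze(1)).sum(-1)
    # marginalize over components, average over ground-truth boxes
    return -torch.logsumexp(log_pi + log_prob, dim=1).mean()
```

Training against such a density avoids hand-crafted assignment of ground truths to specific output locations, which is the point the abstract makes.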

Feature Fusion for Online Mutual Knowledge Distillation

1 code implementation • 19 Apr 2019 • Jangho Kim, Minsung Hyun, Inseop Chung, Nojun Kwak

We propose a learning framework named Feature Fusion Learning (FFL) that efficiently trains a powerful classifier through a fusion module which combines the feature maps generated from parallel neural networks.

Knowledge Distillation
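
A minimal sketch of the fusion-plus-mutual-distillation idea in FFL, with an assumed 1x1-conv fusion module and a standard temperature-scaled KL term; module and function names here are illustrative, not the released code.

```python
# Sketch only: fuse feature maps from two parallel sub-networks and
# distill mutually between sub-network and fused-classifier logits.
import torch
import torch.nn as nn
import torch.nn.functional as F

class FusionClassifier(nn.Module):
    def __init__(self, c1, c2, num_classes):
        super().__init__()
        self.fuse = nn.Conv2d(c1 + c2, c1, kernel_size=1)  # 1x1 fusion conv
        self.head = nn.Linear(c1, num_classes)
    def forward(self, f1, f2):
        # f1: (B, c1, H, W), f2: (B, c2, H, W) from the parallel networks
        fused = F.relu(self.fuse(torch.cat([f1, f2], dim=1)))
        pooled = F.adaptive_avg_pool2d(fused, 1).flatten(1)
        return self.head(pooled)

def mutual_kl(student_logits, teacher_logits, T=3.0):
    # Softened KL from teacher to student (teacher detached); in online
    # mutual distillation this term is applied in both directions.
    p_t = F.softmax(teacher_logits.detach() / T, dim=1)
    log_p_s = F.log_softmax(student_logits / T, dim=1)
    return F.kl_div(log_p_s, p_t, reduction="batchmean") * (T * T)
```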
