Search Results for author: Seonhoon Kim

Found 5 papers, 1 paper with code

Self-Distilled Self-Supervised Representation Learning

no code implementations · 25 Nov 2021 · Jiho Jang, Seonhoon Kim, KiYoon Yoo, Jangho Kim, Nojun Kwak

State-of-the-art frameworks in self-supervised learning have recently shown that fully utilizing transformer-based models can lead to a performance boost compared to conventional CNN models.

Representation Learning · Self-Supervised Learning

Self-supervised pre-training and contrastive representation learning for multiple-choice video QA

no code implementations · 17 Sep 2020 · Seonhoon Kim, Seohyeong Jeong, Eunbyul Kim, Inho Kang, Nojun Kwak

In this paper, we propose novel training schemes for multiple-choice video question answering, with a self-supervised pre-training stage and supervised contrastive learning in the main stage as an auxiliary task.

Auxiliary Learning · Contrastive Learning · +3
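The auxiliary supervised contrastive objective mentioned in the abstract can be sketched in a few lines. This is an illustrative NumPy toy, not the paper's implementation: the function name `sup_con_loss`, the temperature `tau`, and the toy embeddings are all assumptions for demonstration.

```python
import numpy as np

def sup_con_loss(z, labels, tau=0.1):
    """Minimal sketch of a supervised contrastive loss: embeddings
    sharing a label are pulled together, all others pushed apart.
    (Hypothetical helper for illustration only.)"""
    z = z / np.linalg.norm(z, axis=1, keepdims=True)  # L2-normalize
    sim = z @ z.T / tau                               # scaled similarities
    n = len(labels)
    loss = 0.0
    for i in range(n):
        pos = [j for j in range(n) if j != i and labels[j] == labels[i]]
        if not pos:
            continue                                  # no positive pair
        logits = np.delete(sim[i], i)                 # drop self-similarity
        log_denom = np.log(np.exp(logits).sum())      # log-sum-exp over rest
        idx = [j if j < i else j - 1 for j in pos]    # re-index after delete
        loss += -np.mean(logits[idx] - log_denom)
    return loss / n

# Toy batch: two pairs of same-class embeddings
z = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]])
labels = [0, 0, 1, 1]
print(sup_con_loss(z, labels))  # small positive value: pairs are well separated
```

Because each anchor's positive similarity is only one term inside the log-sum-exp denominator, the loss is strictly positive whenever negatives are present, and shrinks as same-label embeddings cluster.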

Textbook Question Answering with Multi-modal Context Graph Understanding and Self-supervised Open-set Comprehension

no code implementations · ACL 2019 · Daesik Kim, Seonhoon Kim, Nojun Kwak

Moreover, ablation studies validate that both incorporating f-GCN to extract knowledge from multi-modal contexts and our newly proposed self-supervised learning process are effective for TQA problems.

Open Set Learning · Question Answering · +2

Semantic Sentence Matching with Densely-connected Recurrent and Co-attentive Information

no code implementations · 29 May 2018 · Seonhoon Kim, Inho Kang, Nojun Kwak

Inspired by DenseNet, a densely connected convolutional network, we propose a densely-connected co-attentive recurrent neural network, each layer of which uses concatenated information of attentive features as well as hidden features of all the preceding recurrent layers.

Natural Language Inference · Paraphrase Identification · +1
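The DenseNet-style connectivity described in the abstract (each layer consuming the concatenation of all preceding layers' features) can be sketched as follows. This is a minimal NumPy illustration under stated assumptions: the function name, layer count, random weights, and the use of a plain affine-plus-tanh stand-in for the recurrent/co-attentive layer are all hypothetical, not the paper's architecture.

```python
import numpy as np

def dense_recurrent_stack(x, num_layers=3, hidden=8, seed=0):
    """Sketch of DenseNet-style dense connectivity for a layer stack:
    each layer takes the concatenation of the input embeddings and
    every preceding layer's output (attention omitted for brevity)."""
    rng = np.random.default_rng(seed)
    features = [x]                                    # all feature maps so far
    for _ in range(num_layers):
        inp = np.concatenate(features, axis=-1)       # dense connection
        w = rng.standard_normal((inp.shape[-1], hidden))
        h = np.tanh(inp @ w)                          # stand-in for an RNN layer
        features.append(h)
    return np.concatenate(features, axis=-1)          # final concatenated features

x = np.ones((5, 4))                 # 5 tokens, 4-dim embeddings
out = dense_recurrent_stack(x)
print(out.shape)                    # (5, 28): width grows as 4 + 3 * 8
```

The key point the sketch captures is that feature width grows with depth because earlier representations are carried forward by concatenation rather than being overwritten, which is what distinguishes dense connectivity from an ordinary stacked RNN.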
