Search Results for author: Jinyoung Yeo

Found 10 papers, 1 paper with code

Mind the Gap! Injecting Commonsense Knowledge for Abstractive Dialogue Summarization

no code implementations · 2 Sep 2022 · Seungone Kim, Se June Joo, Hyungjoo Chae, Chaehyeong Kim, Seung-won Hwang, Jinyoung Yeo

In this paper, we propose to leverage a unique characteristic of dialogues, the commonsense knowledge shared across participants, to resolve the difficulties in summarizing them.

Abstractive Dialogue Summarization · Multi-Task Learning

Modularized Transfer Learning with Multiple Knowledge Graphs for Zero-shot Commonsense Reasoning

no code implementations · NAACL 2022 · Yu Jin Kim, Beong-woo Kwak, Youngwook Kim, Reinald Kim Amplayo, Seung-won Hwang, Jinyoung Yeo

Towards this goal, we propose to mitigate the loss of knowledge caused by interference among the different knowledge sources, by developing a modular variant of knowledge aggregation as a new zero-shot commonsense reasoning framework.

Knowledge Graphs · Transfer Learning

Dual Task Framework for Improving Persona-grounded Dialogue Dataset

no code implementations · 11 Feb 2022 · Minju Kim, Beong-woo Kwak, Youngwook Kim, Hong-in Lee, Seung-won Hwang, Jinyoung Yeo

This paper introduces a simple yet effective data-centric approach for the task of improving persona-conditioned dialogue agents.

TrustAL: Trustworthy Active Learning using Knowledge Distillation

no code implementations · 26 Jan 2022 · Beong-woo Kwak, Youngwook Kim, Yu Jin Kim, Seung-won Hwang, Jinyoung Yeo

A traditional view of data acquisition is that, through iterations, knowledge from human labels and models is implicitly distilled, monotonically increasing accuracy and label consistency.

Active Learning · Knowledge Distillation
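The TrustAL entry above combines active learning with knowledge distillation. As an illustrative aside (not the paper's implementation), the standard distillation objective is a KL divergence between temperature-softened teacher and student predictions, which can be sketched as:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL(teacher || student) over softened distributions: the usual
    knowledge-distillation objective. Higher temperature exposes more
    of the teacher's 'dark knowledge' about non-target classes."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
```

The loss is zero when the student exactly matches the teacher and grows as their softened distributions diverge; function names and the temperature default here are illustrative choices, not taken from the paper.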

Meta-path Free Semi-supervised Learning for Heterogeneous Networks

no code implementations · 18 Oct 2020 · Shin-woo Park, Byung Jun Bae, Jinyoung Yeo, Seung-won Hwang

Graph neural networks (GNNs) have been widely used in representation learning on graphs and achieved superior performance in tasks such as node classification.

Node Classification · Representation Learning
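For context on the node-classification setting mentioned above, a single GNN layer typically aggregates each node's neighborhood features and applies a learned transform. A minimal, dependency-free sketch of one such layer (mean aggregation with a self-loop, then a linear map and ReLU; an illustrative simplification, not the paper's model) looks like:

```python
def gcn_layer(adj, features, weight):
    """One graph-convolution step on an adjacency matrix `adj`
    (list of 0/1 rows), node `features` (n x d), and a d x d'
    `weight` matrix: mean-aggregate each node's neighbors plus
    itself, then apply the linear transform followed by ReLU."""
    n = len(adj)
    d = len(features[0])
    out = []
    for i in range(n):
        neigh = [j for j in range(n) if adj[i][j]] + [i]  # add self-loop
        agg = [sum(features[j][k] for j in neigh) / len(neigh)
               for k in range(d)]
        row = [max(0.0, sum(agg[k] * weight[k][c] for k in range(d)))
               for c in range(len(weight[0]))]
        out.append(row)
    return out
```

Stacking such layers lets each node's representation incorporate increasingly distant neighbors, which is what makes GNNs effective for node classification.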

Soft Representation Learning for Sparse Transfer

no code implementations · ACL 2019 · Haeju Park, Jinyoung Yeo, Gengyu Wang, Seung-won Hwang

Transfer learning is effective for improving the performance of related tasks; multi-task learning (MTL) and cross-lingual learning (CLL) are important instances.

Multi-Task Learning · Representation Learning
