Search Results for author: Seonil Son

Found 4 papers, 3 papers with code

Align-to-Distill: Trainable Attention Alignment for Knowledge Distillation in Neural Machine Translation

1 code implementation • 3 Mar 2024 • Heegon Jin, Seonil Son, Jemin Park, Youngseok Kim, Hyungjong Noh, Yeonsoo Lee

The Attention Alignment Module in A2D performs a dense head-by-head comparison between student and teacher attention heads across layers, turning the combinatorial mapping heuristics into a learning problem.
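The dense head-by-head comparison described above can be illustrated with a minimal sketch. This is a hypothetical reconstruction, not the authors' implementation: the function name, the use of mean squared error as the per-pair divergence, and the softmax-weighted alignment matrix are all illustrative assumptions; only the idea of scoring every student–teacher head pair and learning the mapping comes from the abstract.

```python
import numpy as np

def attention_alignment_loss(student_attn, teacher_attn, align_logits):
    """Hedged sketch of a dense head-by-head attention comparison.

    student_attn: (S, L, L) attention maps for S student heads over length-L sequences
    teacher_attn: (T, L, L) attention maps for T teacher heads
    align_logits: (S, T) learnable scores pairing student heads with teacher heads
    """
    # Divergence between every (student head, teacher head) pair.
    # MSE is an illustrative choice of divergence, not the paper's.
    pair_div = ((student_attn[:, None] - teacher_attn[None, :]) ** 2).mean(axis=(-2, -1))  # (S, T)

    # Soft alignment weights: each student head distributes attention over
    # teacher heads, so the head mapping is learned rather than hand-picked.
    exp = np.exp(align_logits - align_logits.max(axis=1, keepdims=True))
    weights = exp / exp.sum(axis=1, keepdims=True)  # row-wise softmax, (S, T)

    # Weighted sum over teacher heads, averaged over student heads.
    return (weights * pair_div).sum(axis=1).mean()
```

In this sketch, `align_logits` would be trained jointly with the distillation objective, replacing fixed layer- or head-matching heuristics.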

Knowledge Distillation • Machine Translation

HaRiM$^+$: Evaluating Summary Quality with Hallucination Risk

2 code implementations • 22 Nov 2022 • Seonil Son, Junsoo Park, Jeong-in Hwang, Junghwa Lee, Hyungjong Noh, Yeonsoo Lee

One of the challenges of developing a summarization model arises from the difficulty in measuring the factual inconsistency of the generated text.

Automated Writing Evaluation • Hallucination +1

Learning to Write with Coherence From Negative Examples

no code implementations • 22 Sep 2022 • Seonil Son, Jaeseo Lim, Youwon Jang, Jaeyoung Lee, Byoung-Tak Zhang

We compare our approach with Unlikelihood (UL) training on a text continuation task over commonsense natural language inference (NLI) corpora, to show which method better models coherence by avoiding unlikely continuations.

Natural Language Inference • Sentence +1

GLAC Net: GLocal Attention Cascading Networks for Multi-image Cued Story Generation

2 code implementations • 28 May 2018 • Taehyeong Kim, Min-Oh Heo, Seonil Son, Kyoung-Wha Park, Byoung-Tak Zhang

The task of multi-image cued story generation, exemplified by the visual storytelling dataset (VIST) challenge, is to compose multiple coherent sentences from a given sequence of images.

Ranked #30 on Visual Storytelling on VIST (METEOR metric)

Sentence • Visual Storytelling
