no code implementations • 23 Feb 2024 • Jongyoon Song, Nohil Park, Bongkyu Hwang, Jaewoong Yun, Seongho Joe, Youngjune L. Gwon, Sungroh Yoon
Abstractive summarization models often generate factually inconsistent content, particularly when the parametric knowledge of the model conflicts with the knowledge in the input document.
no code implementations • 20 Apr 2023 • Hyunjin Choi, Hyunjae Lee, Seongho Joe, Youngjune L. Gwon
Embeddings for a particular modality of data occupy a high-dimensional space of their own, but they can be semantically aligned to another modality's space by a simple mapping, without training a deep neural net.
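The "simple mapping" idea can be sketched as a closed-form linear alignment between two embedding spaces. The sketch below is illustrative only: the data is synthetic, and the least-squares map is an assumption about what such a mapping could look like, not the paper's actual method.

```python
import numpy as np

# Illustrative sketch (not the paper's implementation): align one
# modality's embeddings to another's with a closed-form least-squares
# linear map W, i.e. no deep-net training. All data here is synthetic.
rng = np.random.default_rng(0)
d_img, d_txt, n = 64, 32, 500

# Synthetic paired embeddings: the "text" embeddings are a noisy linear
# function of the "image" embeddings.
X = rng.normal(size=(n, d_img))                      # image-modality embeddings
W_true = rng.normal(size=(d_img, d_txt))
Y = X @ W_true + 0.01 * rng.normal(size=(n, d_txt))  # text-modality embeddings

# Fit the mapping by ordinary least squares: W = argmin ||XW - Y||^2.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Mapped image embeddings now live in the text embedding space.
Y_hat = X @ W
err = np.linalg.norm(Y_hat - Y) / np.linalg.norm(Y)
print(f"relative alignment error: {err:.4f}")
```

With paired embeddings available, the map is a single matrix solve, which is what makes training-free cross-modal alignment feasible.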
no code implementations • 1 Sep 2022 • Hyunjae Lee, Jaewoong Yun, Hyunjin Choi, Seongho Joe, Youngjune L. Gwon
We build and fine-tune an abstractive dialogue summarization model on a shared encoder-decoder architecture using the enhanced BERT.
no code implementations • 12 Dec 2018 • Miriam Cha, Youngjune L. Gwon, H. T. Kung
Instead of selecting random training examples, we perform negative sampling based on the semantic distance from a positive example in the class.
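Distance-based negative sampling, as described above, can be sketched in a few lines. This is a hypothetical illustration with synthetic vectors; the distance measure (Euclidean here) and the hard-negative choice are assumptions, not the authors' exact procedure.

```python
import numpy as np

# Hypothetical sketch: pick a negative example by its semantic distance
# to a positive example, instead of sampling uniformly at random.
# "Semantic distance" here is Euclidean distance between embeddings.
rng = np.random.default_rng(0)

positive = rng.normal(size=16)           # anchor/positive embedding
candidates = rng.normal(size=(100, 16))  # pool of candidate negatives

# Distance of each candidate to the positive example.
dists = np.linalg.norm(candidates - positive, axis=1)

# Hard negative: the semantically closest candidate, which typically
# yields a more informative training signal than a random pick.
hard_negative_idx = int(np.argmin(dists))
hard_negative = candidates[hard_negative_idx]
print(hard_negative_idx, float(dists[hard_negative_idx]))
```

In practice the selection could also be stochastic (e.g. sampling with probability inversely proportional to distance) rather than a hard argmin.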