1 code implementation • ACL 2022 • Yizhu Liu, Qi Jia, Kenny Zhu
In this paper, we propose a length-aware attention mechanism (LAAM) to adapt the encoding of the source based on the desired length.
1 code implementation • NAACL 2022 • Yizhu Liu, Qi Jia, Kenny Zhu
In this paper, we propose a new automatic reference-free evaluation metric that compares the semantic distributions of the source document and the summary using pretrained language models, and accounts for the summary compression ratio.
1 code implementation • 2 Oct 2024 • Qi Jia, Xiang Yue, Shanshan Huang, Ziheng Qin, Yizhu Liu, Bill Yuchen Lin, Yang You
This type of artifact possesses the unique characteristic that identical information can be readily expressed in both text and images, making it a significant proxy for analyzing modern LLMs' and MLLMs' capabilities in modality-agnostic vision understanding.
no code implementations • 3 Apr 2024 • Yizhu Liu, Ran Tao, Shengyu Guo, Yifan Yang
To tackle the above two problems, we first take the query concatenated with the query-based summary, along with the query-agnostic document summary, as the input to the topic relevance model, which helps the model learn the degree of relevance between the query and the core topic of the document.
1 code implementation • 18 Oct 2023 • Qi Jia, Siyu Ren, Yizhu Liu, Kenny Q. Zhu
Despite tremendous improvements in natural language generation, summarization models still suffer from the unfaithfulness issue.
1 code implementation • 21 Nov 2022 • Qi Jia, Yizhu Liu, Haifeng Tang, Kenny Q. Zhu
Curriculum learning has shown promising improvements in multiple domains by training machine learning models from easy samples to hard ones.
no code implementations • 18 Oct 2022 • Qi Jia, Yizhu Liu, Siyu Ren, Kenny Q. Zhu
Abstractive dialogue summarization aims to generate a concise and fluent summary covering the salient information in a dialogue among two or more interlocutors.
1 code implementation • Findings (NAACL) 2022 • Qi Jia, Yizhu Liu, Haifeng Tang, Kenny Q. Zhu
Previous dialogue summarization techniques adapt large language models pretrained on narrative text by injecting dialogue-specific features into the models.
1 code implementation • EMNLP 2020 • Qi Jia, Yizhu Liu, Siyu Ren, Kenny Q. Zhu, Haifeng Tang
In this paper, we propose a dialogue extraction algorithm to transform a dialogue history into threads based on their dependency relations.
1 code implementation • EMNLP 2018 • Yizhu Liu, Zhiyi Luo, Kenny Zhu
Convolutional neural networks (CNNs) have met great success in abstractive summarization, but they cannot effectively generate summaries of desired lengths.