1 code implementation • EMNLP 2021 • Pengfei Yu, Heng Ji, Prem Natarajan
We focus on lifelong event detection as an exemplar case and propose a new problem formulation that is also generalizable to other IE tasks.
1 code implementation • ACL 2022 • Manling Li, Revanth Gangi Reddy, Ziqi Wang, Yi-shyuan Chiang, Tuan Lai, Pengfei Yu, Zixuan Zhang, Heng Ji
To tackle the challenge of accurate and timely communication regarding the COVID-19 pandemic, we present a COVID-19 Claim Radar to automatically extract supporting and refuting claims on a daily basis.
1 code implementation • NAACL (ACL) 2022 • Xinya Du, Zixuan Zhang, Sha Li, Pengfei Yu, Hongwei Wang, Tuan Lai, Xudong Lin, Ziqi Wang, Iris Liu, Ben Zhou, Haoyang Wen, Manling Li, Darryl Hannan, Jie Lei, Hyounghun Kim, Rotem Dror, Haoyu Wang, Michael Regan, Qi Zeng, Qing Lyu, Charles Yu, Carl Edwards, Xiaomeng Jin, Yizhu Jiao, Ghazaleh Kazeminejad, Zhenhailong Wang, Chris Callison-Burch, Mohit Bansal, Carl Vondrick, Jiawei Han, Dan Roth, Shih-Fu Chang, Martha Palmer, Heng Ji
We introduce RESIN-11, a new schema-guided event extraction and prediction framework that can be applied to a wide variety of newsworthy scenarios.
no code implementations • 17 Feb 2024 • Jiateng Liu, Pengfei Yu, Yuji Zhang, Sha Li, Zixuan Zhang, Heng Ji
The dynamic nature of real-world information necessitates efficient knowledge editing (KE) in large language models (LLMs) for knowledge updating.
1 code implementation • 15 Feb 2024 • ZiHao Wang, Shuyu Li, Tao Zhang, Qi Wang, Pengfei Yu, Jinyang Luo, Yan Liu, Ming Xi, Kejun Zhang
To this end, we present MuChin, the first open-source music description benchmark in Chinese colloquial language, designed to evaluate the performance of multimodal LLMs in understanding and describing music.
no code implementations • 1 Nov 2023 • Bohui Shen, Wei Zhang, Xubiao Liu, Pengfei Yu, Shirui Jiang, Xinchong Shi, Xiangsong Zhang, Xiaoyu Zhou, Weirui Zhang, Bingxuan Li, Qiegen Liu
Meanwhile, the invertible network iteratively estimates the resultant DOPA PET data and compares it to the reference DOPA PET data.
no code implementations • 31 Oct 2023 • Sha Li, Chi Han, Pengfei Yu, Carl Edwards, Manling Li, Xingyao Wang, Yi R. Fung, Charles Yu, Joel R. Tetreault, Eduard H. Hovy, Heng Ji
The recent explosion of performance of large language models (LLMs) has changed the field of Natural Language Processing (NLP) more abruptly and seismically than any other shift in the field's 80-year history.
no code implementations • 29 May 2023 • Pengfei Yu, Heng Ji
To evaluate and address this core challenge, we propose a new formulation of the information updating task that requires only an unstructured updating corpus, and measures performance by generalization to question-answer pairs pertaining to the updated information.
no code implementations • 19 May 2023 • Tianci Xue, Ziqi Wang, Zhenhailong Wang, Chi Han, Pengfei Yu, Heng Ji
To detect factual inconsistency, RCoT first asks LLMs to reconstruct the problem based on generated solutions.
no code implementations • 13 Sep 2022 • ZiHao Wang, Qihao Liang, Kejun Zhang, Yuxing Wang, Chen Zhang, Pengfei Yu, Yongsheng Feng, Wenbo Liu, Yikai Wang, Yuntai Bao, Yiheng Yang
In this paper, we propose SongDriver, a real-time music accompaniment generation system without logical latency or exposure bias.
1 code implementation • 9 Nov 2020 • Xiaodan Hu, Pengfei Yu, Kevin Knight, Heng Ji, Bo Li, Honghui Shi
Experiments show that our approach can accurately illustrate 78% of textual attributes, which also helps MUSE capture the subject in a more creative and expressive way.
no code implementations • 18 Feb 2019 • Yueyao Yu, Pengfei Yu, Wenye Li
Deep learning models are vulnerable to adversarial examples, which poses an indisputable threat to their applications.
1 code implementation • EMNLP 2018 • Xu Han, Hao Zhu, Pengfei Yu, ZiYun Wang, Yuan YAO, Zhiyuan Liu, Maosong Sun
The relation of each sentence is first recognized by distant supervision methods, and then filtered by crowdworkers.
1 code implementation • EMNLP 2018 • Xu Han, Pengfei Yu, Zhiyuan Liu, Maosong Sun, Peng Li
In this paper, we aim to incorporate the hierarchical information of relations for distantly supervised relation extraction and propose a novel hierarchical attention scheme.
no code implementations • 9 Nov 2017 • Yinpei Dai, Zhijian Ou, Dawei Ren, Pengfei Yu
The above observations motivate us to enrich current representation of dialog states and collect a brand new dialog dataset about movies, based upon which we build a new DST, called enriched DST (EDST), for flexible accessing movie information.