no code implementations • 11 Mar 2025 • Zhen Tan, Jun Yan, I-Hung Hsu, Rujun Han, Zifeng Wang, Long T. Le, Yiwen Song, Yanfei Chen, Hamid Palangi, George Lee, Anand Iyer, Tianlong Chen, Huan Liu, Chen-Yu Lee, Tomas Pfister
Large Language Models (LLMs) have made significant progress in open-ended dialogue, yet their inability to retain and retrieve relevant information from long-term interactions limits their effectiveness in applications requiring sustained personalization.
no code implementations • 29 Nov 2024 • Justin Chih-Yao Chen, Zifeng Wang, Hamid Palangi, Rujun Han, Sayna Ebrahimi, Long Le, Vincent Perot, Swaroop Mishra, Mohit Bansal, Chen-Yu Lee, Tomas Pfister
Reverse thinking plays a crucial role in human reasoning.
no code implementations • 15 Oct 2024 • Wenda Xu, Rujun Han, Zifeng Wang, Long T. Le, Dhruv Madeka, Lei Li, William Yang Wang, Rishabh Agarwal, Chen-Yu Lee, Tomas Pfister
To address these limitations, we introduce Speculative Knowledge Distillation (SKD), a novel approach that leverages cooperation between student and teacher models to generate high-quality training data on-the-fly while aligning with the student's inference-time distribution.
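The abstract suggests an interleaved loop in which the student proposes tokens and the teacher vets them, in the spirit of speculative decoding. Below is a minimal sketch of that loop under my own assumptions: the top-k acceptance rule, the function names, and the hyperparameters are illustrative, not the paper's actual algorithm or API.

```python
import torch

def skd_generate(student, teacher, prompt_ids, max_new_tokens=32, top_k=25):
    """Generate one distillation sequence on the fly.

    `student` and `teacher` are stand-in callables mapping a 1-D
    LongTensor of token ids to next-token logits of shape [vocab_size].
    """
    ids = prompt_ids.clone()
    for _ in range(max_new_tokens):
        s_logits = student(ids)
        t_logits = teacher(ids)
        # The student proposes the next token from its own distribution...
        proposal = torch.multinomial(torch.softmax(s_logits, dim=-1), 1)
        # ...and the teacher keeps it only if it falls inside the
        # teacher's top-k; otherwise the teacher resamples a replacement.
        if proposal.item() in torch.topk(t_logits, top_k).indices.tolist():
            next_id = proposal
        else:
            next_id = torch.multinomial(torch.softmax(t_logits, dim=-1), 1)
        ids = torch.cat([ids, next_id.view(1)])
    return ids

# Toy demo with random-logit stand-ins for real language models.
vocab_size = 100
student = lambda ids: torch.randn(vocab_size)
teacher = lambda ids: torch.randn(vocab_size)
sequence = skd_generate(student, teacher, torch.tensor([1, 2, 3]))
```

Because each sequence is sampled close to the student's own distribution, the resulting training data matches what the student will actually produce at inference time, which is the mismatch the abstract says SKD targets.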
1 code implementation • 31 Jul 2024 • Zhengxuan Wu, Yuhao Zhang, Peng Qi, Yumo Xu, Rujun Han, Yian Zhang, Jifan Chen, Bonan Min, Zhiheng Huang
Surprisingly, we find that less is more: training ReSet with substantially smaller (three-fold less) yet higher-quality data yields superior results.
1 code implementation • 19 Jul 2024 • Rujun Han, Yuhao Zhang, Peng Qi, Yumo Xu, Jenyuan Wang, Lan Liu, William Yang Wang, Bonan Min, Vittorio Castelli
Question answering based on retrieval augmented generation (RAG-QA) is an important research topic in NLP and has a wide range of real-world applications.
1 code implementation • 12 May 2023 • Sarik Ghazarian, Yijia Shao, Rujun Han, Aram Galstyan, Nanyun Peng
We take a first step by focusing on event commonsense, which considers events and their relations and is crucial to both dialogue and general commonsense reasoning.
2 code implementations • 16 Oct 2022 • Hong Chen, Rujun Han, Te-Lin Wu, Hideki Nakayama, Nanyun Peng
This task requires machines to 1) understand long text inputs and 2) produce a globally consistent image sequence that illustrates the contents of the story.
1 code implementation • NAACL 2022 • Rujun Han, Hong Chen, Yufei Tian, Nanyun Peng
Stories or narratives are composed of sequences of events.
1 code implementation • EMNLP 2021 • Rujun Han, I-Hung Hsu, Jiao Sun, Julia Baylon, Qiang Ning, Dan Roth, Nanyun Peng
While these tasks partially evaluate machines' ability to understand narratives, human-like reading comprehension requires processing event-based information beyond arguments and temporal reasoning.
no code implementations • EACL 2021 • Rujun Han, Luca Soldaini, Alessandro Moschitti
In this work, we present an approach to efficiently incorporate contextual information into answer sentence selection (AS2) models (a minimal sketch follows below).
Machine Reading Comprehension • Open-Domain Question Answering
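As a rough illustration of what "incorporating contextual information" can look like in AS2, here is a hedged sketch: each candidate answer sentence is scored together with its neighboring sentences. The window construction, the `[SEP]` layout, and the dummy keyword scorer are all assumptions for illustration, not the paper's model.

```python
def build_input(question, sentences, idx, window=1):
    """Candidate sentence plus `window` sentences of local context on each side."""
    lo, hi = max(0, idx - window), min(len(sentences), idx + window + 1)
    context = " ".join(sentences[lo:hi])
    return f"{question} [SEP] {sentences[idx]} [SEP] {context}"

def score(text):
    # Dummy stand-in for a transformer cross-encoder: counts keyword hits.
    return text.lower().count("moon")

def rank_candidates(question, sentences, window=1):
    inputs = [build_input(question, sentences, i, window)
              for i in range(len(sentences))]
    return sorted(zip(sentences, map(score, inputs)),
                  key=lambda pair: pair[1], reverse=True)

sentences = ["The mission launched in 1969.",
             "It landed on the Moon four days later.",
             "The crew returned safely."]
print(rank_candidates("When did the crew reach the Moon?", sentences))
```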
1 code implementation • NAACL 2021 • Mingyu Derek Ma, Jiao Sun, Mu Yang, Kung-Hsiang Huang, Nuan Wen, Shikhar Singh, Rujun Han, Nanyun Peng
We present EventPlus, a temporal event understanding pipeline that integrates various state-of-the-art event understanding components including event trigger and type detection, event argument detection, event duration and temporal relation extraction.
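To make the pipeline's shape concrete, here is an illustrative composition of the four stages the abstract lists. Every function below is a stub standing in for a trained component; none of the names come from the EventPlus codebase.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Event:
    trigger: str
    etype: str
    arguments: dict = field(default_factory=dict)
    duration: Optional[str] = None

def detect_triggers(text):            # event trigger + type detection
    return [Event("met", "Contact.Meet"), Event("left", "Movement.Transport")]

def attach_arguments(text, events):   # event argument detection
    for ev in events:
        ev.arguments = {"participants": ["Alice", "Bob"]}
    return events

def estimate_durations(text, events): # event duration prediction
    for ev in events:
        ev.duration = "minutes"
    return events

def temporal_relations(events):       # pairwise temporal relation extraction
    return [(a.trigger, "BEFORE", b.trigger) for a, b in zip(events, events[1:])]

def pipeline(text):
    events = estimate_durations(text, attach_arguments(text, detect_triggers(text)))
    return events, temporal_relations(events)

events, relations = pipeline("Alice met Bob, then left for the station.")
print(relations)  # [('met', 'BEFORE', 'left')]
```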
2 code implementations • EMNLP 2021 • Rujun Han, Xiang Ren, Nanyun Peng
While pre-trained language models (PTLMs) have achieved noticeable success on many NLP tasks, they still struggle for tasks that require event temporal reasoning, which is essential for event-centric applications.
Ranked #1 on Question Answering on Torque
2 code implementations • 16 Dec 2020 • Yichao Zhou, Yu Yan, Rujun Han, J. Harry Caufield, Kai-Wei Chang, Yizhou Sun, Peipei Ping, Wei Wang
There has been a steady need in the medical community to precisely extract the temporal relations between clinical events.
1 code implementation • EMNLP 2020 • Rujun Han, Yichao Zhou, Nanyun Peng
Extracting event temporal relations is a critical task for information extraction and plays an important role in natural language understanding.
no code implementations • EMNLP 2020 • Qiang Ning, Hao Wu, Rujun Han, Nanyun Peng, Matt Gardner, Dan Roth
A critical part of reading is being able to understand the temporal relationships between events described in a passage of text, even when those relationships are not explicitly stated.
Ranked #2 on Question Answering on Torque
1 code implementation • CoNLL 2019 • Rujun Han, I-Hung Hsu, Mu Yang, Aram Galstyan, Ralph Weischedel, Nanyun Peng
We propose a novel deep structured learning framework for event temporal relation extraction.
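Structured learning for temporal relations typically couples local pair scores with global consistency constraints such as transitivity. The toy below brute-forces that constrained inference over three hand-scored event pairs; it is a shape sketch under my own assumptions, not the paper's scoring model or solver.

```python
import itertools

LABELS = ["BEFORE", "AFTER", "VAGUE"]

# Assumed local scores for pairs (A,B), (B,C), (A,C); in the real
# framework these come from a neural scoring model.
scores = {
    ("A", "B"): {"BEFORE": 2.0, "AFTER": 0.1, "VAGUE": 0.3},
    ("B", "C"): {"BEFORE": 1.5, "AFTER": 0.2, "VAGUE": 0.4},
    ("A", "C"): {"BEFORE": 0.2, "AFTER": 1.0, "VAGUE": 0.9},
}

def consistent(assignment):
    # Transitivity: A<B and B<C forces A<C (and symmetrically for AFTER).
    for x in ("BEFORE", "AFTER"):
        if (assignment[("A", "B")] == x and assignment[("B", "C")] == x
                and assignment[("A", "C")] != x):
            return False
    return True

best, best_score = None, float("-inf")
for combo in itertools.product(LABELS, repeat=3):
    assignment = dict(zip(scores, combo))
    if not consistent(assignment):
        continue
    total = sum(scores[pair][label] for pair, label in assignment.items())
    if total > best_score:
        best, best_score = assignment, total

print(best)  # greedy per-pair choice would pick AFTER for (A,C) and violate
             # transitivity; constrained inference flips it to BEFORE
```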
no code implementations • IJCNLP 2019 • Rujun Han, Qiang Ning, Nanyun Peng
We propose a joint event and temporal relation extraction model with shared representation learning and structured prediction.
Event Extraction • Joint Event and Temporal Relation Extraction
no code implementations • 26 Apr 2019 • Rujun Han, Mengyue Liang, Bashar Alhafni, Nanyun Peng
In this work, we establish strong baselines for event temporal relation extraction on two under-explored story narrative datasets: Richer Event Description (RED) and Causal and Temporal Relation Scheme (CaTeRS).
no code implementations • EMNLP 2018 • Rujun Han, Michael Gill, Arthur Spirling, Kyunghyun Cho
Conventional word embedding models do not leverage information from document meta-data, and they do not model uncertainty.
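One generic way to close both gaps the abstract names, sketched here purely as an assumption rather than the paper's actual construction, is to give each word a Gaussian embedding whose mean is shifted by a learned meta-data offset; the variance then supports uncertainty-aware comparisons across groups.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8

base_mean = {"tax": rng.normal(size=dim)}                # shared word mean
meta_offset = {"party=A": rng.normal(scale=0.1, size=dim),
               "party=B": rng.normal(scale=0.1, size=dim)}
log_std = {"tax": np.full(dim, -1.0)}                    # learned log-std

def sample_embedding(word, meta, n=1000):
    """Draw samples from the word's meta-data-conditioned Gaussian."""
    mean = base_mean[word] + meta_offset[meta]
    std = np.exp(log_std[word])
    return rng.normal(mean, std, size=(n, dim))

# Modeled uncertainty makes group comparisons testable rather than anecdotal:
a = sample_embedding("tax", "party=A")
b = sample_embedding("tax", "party=B")
print(np.linalg.norm(a.mean(0) - b.mean(0)))  # shift in usage across groups
```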