1 code implementation • 16 Dec 2022 • Kai Xiong, Xiao Ding, Zhongyang Li, Li Du, Bing Qin, Yi Zheng, Baoxing Huai
Causal chain reasoning (CCR) is an essential ability for many decision-making AI systems, which requires the model to build reliable causal chains by connecting causal pairs.
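The paper's CCR models are neural, but the core idea of connecting causal pairs into chains can be illustrated with a purely symbolic toy sketch (the event names below are hypothetical examples, not from the paper):

```python
from collections import defaultdict

# Toy illustration: given directed cause -> effect pairs, enumerate
# causal chains by depth-first traversal of the induced graph.
pairs = [("drought", "crop failure"),
         ("crop failure", "food shortage"),
         ("food shortage", "price rise")]

graph = defaultdict(list)
for cause, effect in pairs:
    graph[cause].append(effect)

def chains(node, path=None):
    """Return all maximal causal chains starting at `node`."""
    path = (path or []) + [node]
    if node not in graph:          # no further effects: chain ends here
        return [path]
    out = []
    for nxt in graph[node]:
        out.extend(chains(nxt, path))
    return out

# chains("drought")
# -> [["drought", "crop failure", "food shortage", "price rise"]]
```

A reliable CCR system additionally has to score each hop, since chaining two individually plausible causal pairs does not guarantee a plausible chain.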
no code implementations • 23 Sep 2022 • Zhongyang Li, Sichen Yang
We study the community detection problem on a Gaussian mixture model, in which (1) vertices are divided into $k\geq 2$ distinct communities that are not necessarily equally-sized; (2) the Gaussian perturbations for different entries in the observation matrix are not necessarily independent or identically distributed.
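As a minimal numerical illustration of this setup (an easy, well-separated instance with k = 2 unequal communities, not the paper's exact regime or method), one can recover the communities spectrally from a noisy observation matrix:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative instance: n vertices in k = 2 unequal communities; each
# row of the observation matrix is a community mean plus Gaussian noise.
n, p = 200, 50
sizes = [120, 80]                        # not equally sized
labels = np.repeat([0, 1], sizes)
means = np.vstack([np.full(p, 0.5), np.full(p, -0.5)])
X = means[labels] + rng.normal(scale=1.0, size=(n, p))

# Spectral recovery: in this easy regime the sign of the top eigenvector
# of the centered Gram matrix separates the two communities.
Xc = X - X.mean(axis=0)
G = Xc @ Xc.T
_, vecs = np.linalg.eigh(G)              # eigenvalues in ascending order
pred = (vecs[:, -1] > 0).astype(int)

# Account for label switching when measuring accuracy.
acc = max(np.mean(pred == labels), np.mean(pred != labels))
```

The interesting questions in the paper concern harder regimes, e.g. dependent or non-identically distributed perturbations, where such simple spectral arguments need refinement.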
no code implementations • 21 Jul 2021 • Zhongyang Li, Xiao Ding, Ting Liu, J. Edward Hu, Benjamin Van Durme
We present a conditional text generation framework that posits sentential expressions of possible causes and effects.
no code implementations • 21 Jul 2021 • Zhongyang Li, Xiao Ding, Kuo Liao, Bing Qin, Ting Liu
Recent work has shown success in incorporating pre-trained models like BERT to improve NLP systems.
no code implementations • SEMEVAL 2020 • Xiao Ding, Dingkui Hao, Yuewei Zhang, Kuo Liao, Zhongyang Li, Bing Qin, Ting Liu
In this task, we focus on detecting causation, especially counterfactuals, from texts.
no code implementations • 20 Nov 2020 • Zhongyang Li, Fei Lu
In the learning of systems of interacting particles or agents, the coercivity condition ensures identifiability of the interaction functions, providing the foundation for learning them by nonparametric regression.
no code implementations • 29 Aug 2020 • Zhongyang Li
We study the community detection problem on a Gaussian mixture model, in which vertices are divided into $k\geq 2$ distinct communities.
no code implementations • IJCNLP 2019 • Li Du, Xiao Ding, Ting Liu, Zhongyang Li
Understanding events and event-centered commonsense reasoning are crucial for natural language processing (NLP).
1 code implementation • IJCNLP 2019 • Xiao Ding, Kuo Liao, Ting Liu, Zhongyang Li, Junwen Duan
Prior work has proposed effective methods to learn event representations that capture syntactic and semantic information from text corpora, demonstrating their effectiveness for downstream tasks such as script event prediction.
no code implementations • 18 Jul 2019 • Xiao Ding, Zhongyang Li, Ting Liu, Kuo Liao
The evolution and development of events follow basic principles that cause events to happen sequentially.
no code implementations • ACL 2019 • Zhongyang Li, Tongfei Chen, Benjamin Van Durme
Researchers illustrate improvements in contextual encoding strategies via resultant performance on a battery of shared Natural Language Understanding (NLU) tasks.
1 code implementation • 17 May 2019 • Zhongyang Li, Xiao Ding, Ting Liu
In this study, we investigate a transferable BERT (TransBERT) training framework that transfers not only general language knowledge from large-scale unlabeled data but also specific kinds of knowledge from various semantically related supervised tasks to a target task.
no code implementations • COLING 2018 • Zhongyang Li, Xiao Ding, Ting Liu
In this paper, we propose an adversarial-training-augmented Seq2Seq model to generate reasonable and diverse story endings given a story context.
1 code implementation • 14 May 2018 • Zhongyang Li, Xiao Ding, Ting Liu
Script event prediction requires a model to predict the subsequent event given an existing event context.