1 code implementation • spnlp (ACL) 2022 • Guirong Fu, Zhao Meng, Zhen Han, Zifeng Ding, Yunpu Ma, Matthias Schubert, Volker Tresp, Roger Wattenhofer
In this paper, we tackle the temporal knowledge graph completion task by proposing TempCaps, a capsule network-based embedding model for temporal knowledge graph completion.
no code implementations • EMNLP 2021 • Zhen Han, Zifeng Ding, Yunpu Ma, Yujia Gu, Volker Tresp
In addition, a novel graph transition layer is applied to capture the transitions on the dynamic graph, i.e., edge formation and dissolution.
no code implementations • 4 Apr 2024 • Shuo Chen, Zhen Han, Bailan He, Zifeng Ding, Wenqian Yu, Philip Torr, Volker Tresp, Jindong Gu
Various jailbreak attacks have been proposed to red-team Large Language Models (LLMs), revealing the vulnerability of their safeguards.
no code implementations • 22 Feb 2024 • Zefeng Wang, Zhen Han, Shuo Chen, Fan Xue, Zifeng Ding, Xun Xiao, Volker Tresp, Philip Torr, Jindong Gu
Our research evaluates the adversarial robustness of MLLMs when employing CoT reasoning, finding that CoT marginally improves adversarial robustness against existing attack methods.
no code implementations • 27 Nov 2023 • Yan Xia, Letian Shi, Zifeng Ding, João F. Henriques, Daniel Cremers
We tackle the problem of 3D point cloud localization based on a few natural language descriptions and introduce a novel neural network, Text2Loc, that fully interprets the semantic relationships between points and text.
1 code implementation • 15 Nov 2023 • Zifeng Ding, Heling Cai, Jingpei Wu, Yunpu Ma, Ruotong Liao, Bo Xiong, Volker Tresp
We first input the text descriptions of KG relations into large language models (LLMs) for generating relation representations, and then introduce them into embedding-based TKGF methods.
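The pipeline described here — encoding each relation's text description with an LLM and injecting the resulting vector into an embedding-based TKGF scorer — can be sketched as follows. The encoder below is a deterministic stand-in for a real LLM, and the TransE-style time-aware score is an illustrative assumption, not the paper's exact model.

```python
import hashlib
import numpy as np

def encode_relation(description: str, dim: int = 8) -> np.ndarray:
    """Stand-in for an LLM text encoder: deterministically maps a relation's
    text description to a unit vector. A real system would embed the
    description with a large language model instead."""
    seed = int.from_bytes(hashlib.sha256(description.encode()).digest()[:4], "big")
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(dim)
    return v / np.linalg.norm(v)

def time_aware_score(head: np.ndarray, rel: np.ndarray,
                     tail: np.ndarray, time: np.ndarray) -> float:
    """Illustrative TransE-style score: facts where head + rel + time
    lands near tail receive higher (less negative) scores."""
    return -float(np.linalg.norm(head + rel + time - tail))

# Relation embeddings come from text descriptions rather than random init.
rel = encode_relation("the team a player is a member of")
head = np.zeros(8)
time = np.zeros(8)
good_tail = head + rel + time           # perfectly aligned tail entity
bad_tail = good_tail + 1.0              # perturbed tail entity
assert time_aware_score(head, rel, good_tail, time) > \
       time_aware_score(head, rel, bad_tail, time)
```

In a trained model these text-derived vectors would typically serve as (or initialize) the relation parameters of the embedding method, so unseen relations can be scored from their descriptions alone.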
no code implementations • 14 Jul 2023 • Zifeng Ding, Jingcheng Wu, Jingpei Wu, Yan Xia, Volker Tresp
We develop two new benchmark hyper-relational TKG (HTKG) datasets, i.e., Wiki-hy and YAGO-hy, and propose an HTKG reasoning model that efficiently models both temporal facts and qualifiers.
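A hyper-relational temporal fact pairs a primary (subject, relation, object, time) quadruple with qualifier key-value pairs that refine it. The schema below is an illustrative sketch of that data structure, not the actual Wiki-hy/YAGO-hy file format.

```python
from dataclasses import dataclass
from typing import Dict, Tuple

@dataclass(frozen=True)
class HyperRelationalFact:
    """A temporal fact (subject, relation, object, time) augmented with
    qualifier key-value pairs; the exact schema here is an assumption."""
    subject: str
    relation: str
    object: str
    time: int                                     # e.g. a year
    qualifiers: Tuple[Tuple[str, str], ...] = ()  # hashable key-value pairs

    def qualifier_dict(self) -> Dict[str, str]:
        return dict(self.qualifiers)

# A primary quadruple plus one qualifier refining it:
fact = HyperRelationalFact(
    subject="Barack Obama",
    relation="educated at",
    object="Harvard Law School",
    time=1991,
    qualifiers=(("academic degree", "Juris Doctor"),),
)
assert fact.qualifier_dict()["academic degree"] == "Juris Doctor"
```

A reasoning model over such facts must score the quadruple while also conditioning on the qualifiers, which is what distinguishes HTKG reasoning from ordinary TKG link prediction.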
1 code implementation • 2 Apr 2023 • Zifeng Ding, Jingpei Wu, Zongyue Li, Yunpu Ma, Volker Tresp
Most previous TKGC methods only consider predicting missing links among the entities seen in the training set, and they perform poorly on link prediction involving newly-emerged, unseen entities.
no code implementations • 15 Nov 2022 • Zifeng Ding, Jingpei Wu, Bailan He, Yunpu Ma, Zhen Han, Volker Tresp
A similar problem exists in temporal knowledge graphs (TKGs), yet no previous temporal knowledge graph completion (TKGC) method has been developed for modeling newly-emerged entities.
1 code implementation • 12 Aug 2022 • Zifeng Ding, Zongyue Li, Ruoxia Qi, Jingpei Wu, Bailan He, Yunpu Ma, Zhao Meng, Shuo Chen, Ruotong Liao, Zhen Han, Volker Tresp
To this end, we propose ForecastTKGQA, a TKGQA model that employs a TKG forecasting module for future inference, to answer all three types of questions.
no code implementations • 21 May 2022 • Zifeng Ding, Bailan He, Yunpu Ma, Zhen Han, Volker Tresp
In this paper, we follow previous work on few-shot relational learning over static KGs and extend two fundamental TKG reasoning tasks, i.e., interpolated and extrapolated link prediction, to the one-shot setting.
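The two tasks differ in how query timestamps relate to the training data: interpolated link prediction fills in missing facts within the observed time range, while extrapolated link prediction forecasts facts after it. A minimal sketch of the extrapolation split, under the assumption that facts are (subject, relation, object, timestamp) quadruples:

```python
from typing import List, Tuple

Quad = Tuple[str, str, str, int]  # (subject, relation, object, timestamp)

def extrapolation_split(quads: List[Quad], cutoff: int):
    """Extrapolated link prediction: train on facts up to a time cutoff and
    query facts strictly after it (forecasting). Interpolated link prediction
    instead allows query timestamps inside the training range. Illustrative
    helper, not the paper's code."""
    train = [q for q in quads if q[3] <= cutoff]
    test = [q for q in quads if q[3] > cutoff]
    return train, test

quads = [
    ("A", "visits", "B", 2014),
    ("A", "meets",  "C", 2015),
    ("B", "visits", "C", 2016),
    ("C", "meets",  "A", 2017),
]
train, test = extrapolation_split(quads, cutoff=2015)
assert all(q[3] <= 2015 for q in train)
assert all(q[3] > 2015 for q in test)
```

In the one-shot setting studied here, the model would additionally see only a single support quadruple for each sparse relation before answering queries about it.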
no code implementations • 17 Mar 2022 • Zhen Han, Ruotong Liao, Jindong Gu, Yao Zhang, Zifeng Ding, Yujia Gu, Heinz Köppl, Hinrich Schütze, Volker Tresp
Since conventional knowledge embedding models cannot take full advantage of abundant textual information, there have been extensive research efforts to enhance knowledge embeddings with text.
no code implementations • 14 Dec 2021 • Zifeng Ding, Yunpu Ma, Bailan He, Volker Tresp
Knowledge graphs contain rich knowledge about various entities and the relational information among them, while temporal knowledge graphs (TKGs) describe and model the interactions of the entities over time.
1 code implementation • 13 Jan 2021 • Zhen Han, Zifeng Ding, Yunpu Ma, Yujia Gu, Volker Tresp
In addition, a novel graph transition layer is applied to capture the transitions on the dynamic graph, i.e., edge formation and dissolution.