no code implementations • EMNLP 2021 • Mojtaba Nayyeri, Chengjin Xu, Franca Hoffmann, Mirza Mohtashim Alam, Jens Lehmann, Sahar Vahdati
Many KGEs use Euclidean geometry, which renders them incapable of preserving complex structures and consequently leads the models to wrong inferences.
1 code implementation • EMNLP 2021 • Chengjin Xu, Fenglong Su, Jens Lehmann
Entity alignment aims to identify equivalent entity pairs between different knowledge graphs (KGs).
1 code implementation • 4 Mar 2022 • Chengjin Xu, Fenglong Su, Jens Lehmann
Entity alignment aims to identify equivalent entity pairs between different knowledge graphs (KGs).
no code implementations • 18 Feb 2022 • Chengjin Xu, Mojtaba Nayyeri, Yung-Yu Chen, Jens Lehmann
In this work, we strive to move beyond the complex or hypercomplex space for KGE and propose a novel geometric algebra based embedding approach, GeomE, which uses multivector representations and the geometric product to model entities and relations.
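As an illustration of the multivector machinery behind a GeomE-style model — a minimal sketch of the 2D geometric product, not the paper's actual scoring function or implementation:

```python
# Minimal sketch of the geometric product in 2D geometric algebra.
# A multivector is (a0, a1, a2, a3) = a0 + a1*e1 + a2*e2 + a3*e12,
# with e1^2 = e2^2 = 1 and e12 = e1*e2, so e12^2 = -1.
# Illustrative only; GeomE's exact scoring function is not reproduced here.

def geometric_product(a, b):
    a0, a1, a2, a3 = a
    b0, b1, b2, b3 = b
    return (
        a0*b0 + a1*b1 + a2*b2 - a3*b3,  # scalar part
        a0*b1 + a1*b0 - a2*b3 + a3*b2,  # e1 part
        a0*b2 + a2*b0 + a1*b3 - a3*b1,  # e2 part
        a0*b3 + a3*b0 + a1*b2 - a2*b1,  # e12 (bivector) part
    )

# e1 * e2 = e12
print(geometric_product((0, 1, 0, 0), (0, 0, 1, 0)))  # (0, 0, 0, 1)
```

Unlike a plain inner product, the geometric product is non-commutative (e1*e2 = -e2*e1), which is what lets such models capture asymmetric and inverse relation patterns.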
no code implementations • 29 Sep 2021 • Chengjin Xu, Fenglong Su, Jens Lehmann
Embedding-based representation learning approaches for knowledge graphs (KGs) have been mostly designed for static data.
no code implementations • NAACL 2021 • Chengjin Xu, Yung-Yu Chen, Mojtaba Nayyeri, Jens Lehmann
Representation learning approaches for knowledge graphs have been mostly designed for static data.
no code implementations • 11 Apr 2021 • Chengjin Xu, Mojtaba Nayyeri, Sahar Vahdati, Jens Lehmann
For example, instead of training a model once with a large embedding size of 1200, we repeat the training of the model 6 times in parallel with an embedding size of 200 and then combine the 6 separate models for testing, while the overall number of adjustable parameters (6*200=1200) and the total memory footprint remain the same.
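The ensemble idea above can be sketched as follows; the model internals and the score-combination rule (summation) are illustrative assumptions, not the paper's exact setup:

```python
import numpy as np

# Hypothetical stand-in for one low-dimensional KGE model: a random
# embedding table scored DistMult-style. The internals are irrelevant
# to the ensembling idea being illustrated.
class TinyKGE:
    def __init__(self, n_entities, n_relations, dim, seed):
        r = np.random.default_rng(seed)
        self.ent = r.normal(size=(n_entities, dim))
        self.rel = r.normal(size=(n_relations, dim))

    def n_params(self):
        return self.ent.size + self.rel.size

    def score(self, h, r, t):
        return float(np.sum(self.ent[h] * self.rel[r] * self.ent[t]))

# Instead of one model with dim=1200, train 6 models with dim=200
# (independently, hence trivially parallel), then combine at test time.
models = [TinyKGE(n_entities=10, n_relations=5, dim=200, seed=s)
          for s in range(6)]

def ensemble_score(h, r, t):
    return sum(m.score(h, r, t) for m in models)

# Parameter budget matches a single dim=1200 model: 6 * 200 = 1200.
big = TinyKGE(n_entities=10, n_relations=5, dim=1200, seed=99)
print(sum(m.n_params() for m in models) == big.n_params())  # True
```

Because the 6 runs are independent, they can be trained concurrently, trading one long high-dimensional run for several short low-dimensional ones at the same memory budget.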
no code implementations • 13 Oct 2020 • Mojtaba Nayyeri, Chengjin Xu, Jens Lehmann, Sahar Vahdati
To this end, we represent each relation (edge) in a KG as a vector field on a smooth Riemannian manifold.
no code implementations • COLING 2020 • Chengjin Xu, Mojtaba Nayyeri, Yung-Yu Chen, Jens Lehmann
Knowledge graph (KG) embedding aims at embedding entities and relations in a KG into a low-dimensional latent representation space.
2 code implementations • COLING 2020 • Chengjin Xu, Mojtaba Nayyeri, Fouad Alkhoury, Hamed Shariat Yazdi, Jens Lehmann
We show our proposed model overcomes the limitations of existing KG embedding models and TKG embedding models and is able to learn and infer various relation patterns over time.
1 code implementation • 18 Nov 2019 • Chengjin Xu, Mojtaba Nayyeri, Fouad Alkhoury, Hamed Shariat Yazdi, Jens Lehmann
Moreover, considering the temporal uncertainty during the evolution of entity/relation representations over time, we map the representations of temporal KGs into the space of multi-dimensional Gaussian distributions.
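A common way to compare such Gaussian representations — sketched here as an assumption about the general approach, not the paper's exact formulation — is the KL divergence between diagonal Gaussians, which can serve as a (dis)similarity score:

```python
import numpy as np

def kl_diag_gaussians(mu1, var1, mu2, var2):
    """KL( N(mu1, diag(var1)) || N(mu2, diag(var2)) ).

    Illustrative scoring helper for Gaussian embeddings; the actual
    model's score function may differ.
    """
    mu1, var1 = np.asarray(mu1, float), np.asarray(var1, float)
    mu2, var2 = np.asarray(mu2, float), np.asarray(var2, float)
    d = mu1.size
    return 0.5 * (
        np.sum(var1 / var2)                    # tr(Sigma2^-1 Sigma1)
        + np.sum((mu2 - mu1) ** 2 / var2)      # Mahalanobis term
        - d
        + np.sum(np.log(var2) - np.log(var1))  # log-determinant ratio
    )

# Identical distributions have zero divergence.
print(kl_diag_gaussians([0.0, 0.0], [1.0, 1.0], [0.0, 0.0], [1.0, 1.0]))
```

The variance terms let the model express temporal uncertainty directly: an entity whose meaning drifts over time can be given a wider covariance rather than a single point embedding.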
no code implementations • 25 Sep 2019 • Mojtaba Nayyeri, Chengjin Xu, Yadollah Yaghoobzadeh, Hamed Shariat Yazdi, Jens Lehmann
We show that by a proper selection of the loss function for training the TransE model, the main limitations of the model are mitigated.
no code implementations • 2 Sep 2019 • Mojtaba Nayyeri, Chengjin Xu, Yadollah Yaghoobzadeh, Hamed Shariat Yazdi, Jens Lehmann
We show that by a proper selection of the loss function for training the TransE model, the main limitations of the model are mitigated.
no code implementations • 20 Aug 2019 • Mojtaba Nayyeri, Chengjin Xu, Jens Lehmann, Hamed Shariat Yazdi
We prove that LogicENN can learn every ground truth of encoded rules in a knowledge graph.
Ranked #16 on Link Prediction on FB15k