1 code implementation • ACL 2022 • Yongqi Zhang, Zhanke Zhou, Quanming Yao, Yong Li
Based on this analysis, we propose KGTuner, an efficient two-stage search algorithm that explores HP configurations on a small subgraph in the first stage and transfers the top-performing configurations for fine-tuning on the large full graph in the second stage.
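The two-stage idea can be sketched as follows. This is a minimal illustration, not KGTuner's actual implementation: the configuration format and the two evaluation callbacks are hypothetical stand-ins for subgraph-level and full-graph-level training runs.

```python
def two_stage_search(configs, eval_on_subgraph, eval_on_full_graph, top_k=5):
    """Sketch of a two-stage hyper-parameter search.

    Stage 1: score every configuration cheaply on a small subgraph.
    Stage 2: re-evaluate only the top-k candidates on the full graph.
    Both evaluation callbacks are illustrative placeholders.
    """
    # Stage 1: cheap evaluation of all configurations on the subgraph
    scored = [(eval_on_subgraph(cfg), cfg) for cfg in configs]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    candidates = [cfg for _, cfg in scored[:top_k]]
    # Stage 2: expensive evaluation of only the surviving candidates
    return max(candidates, key=eval_on_full_graph)
```

The point of the split is that the subgraph evaluations are cheap enough to cover the whole search space, while the full-graph budget is spent on only `top_k` promising configurations.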
1 code implementation • 21 Mar 2024 • Guangyi Liu, Quanming Yao, Yongqi Zhang, Lei Chen
Recommendation systems, now widely deployed on various platforms, recommend relevant items to users based on their preferences.
1 code implementation • 15 Mar 2024 • Zhanke Zhou, Yongqi Zhang, Jiangchao Yao, Quanming Yao, Bo Han
To deduce new facts on a knowledge graph (KG), a link predictor learns from the graph structure and collects local evidence to find the answer to a given query.
1 code implementation • 13 Mar 2024 • Fangqi Zhu, Yongqi Zhang, Lei Chen, Bing Qin, Ruifeng Xu
Adverse drug-drug interactions (DDIs) can compromise the effectiveness of concurrent drug administration, posing a significant challenge in healthcare.
1 code implementation • 15 Nov 2023 • Yongqi Zhang, Quanming Yao, Ling Yue, Xian Wu, Ziheng Zhang, Zhenxi Lin, Yefeng Zheng
Emerging drugs offer possibilities for treating and alleviating diseases, and accurately predicting their drug-drug interactions (DDIs) with computational methods can improve patient care and contribute to efficient drug development.
no code implementations • 20 Oct 2023 • Hansi Yang, Yongqi Zhang, Quanming Yao, James Kwok
We also propose a regularizer to align the model with the graph structure.
2 code implementations • 13 Oct 2023 • Ling Yue, Yongqi Zhang, Quanming Yao, Yong Li, Xian Wu, Ziheng Zhang, Zhenxi Lin, Yefeng Zheng
Knowledge graph (KG) embedding is a fundamental task in natural language processing, and various methods have been proposed to explore semantic patterns in distinctive ways.
Ranked #1 on Link Property Prediction on ogbl-biokg
no code implementations • 22 Mar 2023 • Haiquan Qiu, Yongqi Zhang, Yong Li, Quanming Yao
Our results first show that GNNs can capture logical rules from graded modal logic, providing a new theoretical tool for analyzing the expressiveness of GNNs for KG reasoning. They further show that a query labeling trick makes it easier for GNNs to capture logical rules, explaining why SOTA methods are mainly based on the labeling trick.
no code implementations • 24 Jul 2022 • Hansi Yang, Yongqi Zhang, Quanming Yao
This scoring function, called AutoWeird, uses only the tail entity and the relation in a triplet to compute its plausibility score.
Ranked #2 on Link Property Prediction on ogbl-wikikg2
2 code implementations • 30 May 2022 • Yongqi Zhang, Zhanke Zhou, Quanming Yao, Xiaowen Chu, Bo Han
An important design component of GNN-based KG reasoning methods is the propagation path, which contains the set of entities involved in each propagation step.
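A propagation path can be pictured as a sequence of entity sets grown step by step from the query entity. The sketch below is only an illustration of that notion (a plain breadth-first frontier expansion over an adjacency map); the paper's contribution is precisely about designing this path more carefully than naive full expansion.

```python
def propagation_path(adjacency, query_entity, steps):
    """Illustrative sketch: the entities involved at each propagation
    step, grown by expanding all neighbors of the previous frontier.
    `adjacency` maps an entity to its neighbor list (assumed format).
    """
    frontier = {query_entity}
    path = [frontier]
    for _ in range(steps):
        # every neighbor of the current frontier joins the next step
        frontier = {nbr for e in frontier for nbr in adjacency.get(e, ())}
        path.append(frontier)
    return path
```

Full expansion like this grows quickly on dense graphs, which is why pruning or sampling the per-step entity sets matters for efficiency.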
2 code implementations • 5 May 2022 • Yongqi Zhang, Zhanke Zhou, Quanming Yao, Yong Li
While hyper-parameters (HPs) are important for knowledge graph (KG) learning, existing methods fail to search them efficiently.
3 code implementations • 13 Aug 2021 • Yongqi Zhang, Quanming Yao
In this paper, we introduce a novel relational structure, i.e., the relational directed graph (r-digraph), which is composed of overlapping relational paths, to capture the KG's local evidence.
3 code implementations • 1 Jul 2021 • Yongqi Zhang, Quanming Yao, James Tin-Yau Kwok
We first set up a search space for AutoBLM by analyzing existing scoring functions.
Ranked #4 on Link Property Prediction on ogbl-biokg
3 code implementations • 22 Apr 2021 • Shimin Di, Quanming Yao, Yongqi Zhang, Lei Chen
The scoring function, which measures the plausibility of triplets in knowledge graphs (KGs), is key to the performance of KG embedding, and its design is an important problem in the literature.
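As a concrete instance of such a scoring function, the classic DistMult score is the sum of element-wise products of the head, relation, and tail embeddings. This is one well-known example from the literature, not the searched form proposed in the paper above.

```python
def distmult_score(head, relation, tail):
    """DistMult scoring function over embedding vectors (lists of
    floats): score(h, r, t) = sum_i h_i * r_i * t_i.
    Higher scores indicate more plausible triplets.
    """
    return sum(h * r * t for h, r, t in zip(head, relation, tail))
```

Methods like the one above search over families of such bilinear forms rather than fixing one by hand.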
no code implementations • 16 Nov 2020 • Yongqi Zhang, Hui Zhang, Quanming Yao, Jun Wan
Thus, inspired by the observation that the classifier is more robust to noisy labels while the representation is much more fragile, and by recent advances in self-supervised representation learning (SSRL), we design a new method, i.e., CS$^3$NL, to obtain representations via SSRL without labels and train the classifier directly with noisy labels.
1 code implementation • 24 Oct 2020 • Yongqi Zhang, Quanming Yao, Lei Chen
In this paper, motivated by the observation that negative triplets with large gradients are important but rare, we propose to directly keep track of them with a cache.
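The cache idea can be sketched as keeping only the highest-importance negatives seen so far. In this toy sketch the model's gradient magnitude is replaced by a generic `importance` callback, and the update rule is a simple top-k merge; both are illustrative assumptions, not the paper's actual cache scheme.

```python
import heapq


def update_cache(cache, sampled_negatives, importance, cache_size=50):
    """Toy sketch of a cache of important negative triplets.

    Merges freshly sampled negatives into the cache and keeps only the
    `cache_size` items with the largest `importance` score (a stand-in
    for the gradient magnitude used in practice).
    """
    return heapq.nlargest(cache_size, cache + sampled_negatives, key=importance)
```

Training would then draw negatives from this cache instead of sampling uniformly, so large-gradient negatives are revisited rather than lost.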
4 code implementations • NeurIPS 2020 • Yongqi Zhang, Quanming Yao, Lei Chen
In this work, based on the relational paths, which are composed of a sequence of triplets, we define the Interstellar as a recurrent neural architecture search problem for the short-term and long-term information along the paths.
3 code implementations • 26 Apr 2019 • Yongqi Zhang, Quanming Yao, Wenyuan Dai, Lei Chen
The algorithm is further sped up by a filter and a predictor, which avoid repeatedly training SFs with the same expressive ability and help remove bad candidates during the search before model training.
Ranked #2 on Link Prediction on FB15k
6 code implementations • 16 Dec 2018 • Yongqi Zhang, Quanming Yao, Yingxia Shao, Lei Chen
Negative sampling, which samples negative triplets from non-observed ones in the training data, is an important step in KG embedding.
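A standard way to generate such negatives is to corrupt an observed triplet by replacing its head or tail with a random entity, rejecting corruptions that are themselves observed. This sketch shows that baseline uniform scheme only; the paper above is about sampling negatives more effectively than uniformly.

```python
import random


def corrupt_triplet(triplet, entities, observed, rng=None):
    """Generate one negative triplet by replacing the head or tail of an
    observed (head, relation, tail) triplet with a random entity,
    resampling until the corruption is not in the observed set.
    Illustrative uniform-sampling baseline.
    """
    rng = rng or random.Random(0)
    head, relation, tail = triplet
    while True:
        entity = rng.choice(entities)
        # corrupt either the head or the tail with equal probability
        if rng.random() < 0.5:
            negative = (entity, relation, tail)
        else:
            negative = (head, relation, entity)
        if negative not in observed:
            return negative
```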
Ranked #5 on Link Prediction on FB15k
1 code implementation • 31 Oct 2018 • Zhenqian Shen, Yongqi Zhang, Lanning Wei, Huan Zhao, Quanming Yao
Machine learning (ML) methods have been developing rapidly, but configuring and selecting proper methods to achieve a desired performance is increasingly difficult and tedious.
no code implementations • 18 May 2018 • Yongqi Zhang
To learn the complex relationship between the two domains, we introduce an additional variable to control the variations in our one-to-many mapping.