1 code implementation • 5 Oct 2024 • Menglin Yang, Aosong Feng, Bo Xiong, Jiahong Liu, Irwin King, Rex Ying
Through extensive experiments, we demonstrate that HypLoRA significantly enhances the performance of LLMs on reasoning tasks, particularly for complex reasoning problems.
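The entry names HypLoRA but does not spell out its mechanism, so the sketch below shows one plausible reading: a frozen linear layer whose low-rank adaptation is combined with the pretrained output on the Poincare ball rather than by plain addition. The class name, the choice of the Poincare model, and the Mobius-addition combination rule are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

def expmap0(v, c=1.0):
    # Exponential map at the origin of the Poincare ball (curvature -c).
    n = v.norm(dim=-1, keepdim=True).clamp_min(1e-7)
    return torch.tanh(c ** 0.5 * n) * v / (c ** 0.5 * n)

def logmap0(x, c=1.0):
    # Logarithmic map back to the tangent space at the origin.
    n = x.norm(dim=-1, keepdim=True).clamp_min(1e-7)
    return torch.atanh((c ** 0.5 * n).clamp(max=1 - 1e-5)) * x / (c ** 0.5 * n)

def mobius_add(x, y, c=1.0):
    # Mobius addition on the Poincare ball (curvature -c).
    xy = (x * y).sum(dim=-1, keepdim=True)
    x2 = (x * x).sum(dim=-1, keepdim=True)
    y2 = (y * y).sum(dim=-1, keepdim=True)
    num = (1 + 2 * c * xy + c * y2) * x + (1 - c * x2) * y
    den = 1 + 2 * c * xy + c ** 2 * x2 * y2
    return num / den.clamp_min(1e-7)

class HyperbolicLoRALinear(nn.Module):
    """Illustrative sketch: frozen linear layer plus a low-rank update,
    with the update and the pretrained output combined on the manifold."""
    def __init__(self, base: nn.Linear, r=8, c=1.0, alpha=16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False                    # freeze pretrained weights
        self.A = nn.Parameter(torch.randn(base.in_features, r) * 0.01)
        self.B = nn.Parameter(torch.zeros(r, base.out_features))
        self.c, self.scale = c, alpha / r

    def forward(self, x):
        h0 = self.base(x)                              # frozen pretrained projection
        delta = x @ self.A @ self.B * self.scale       # low-rank LoRA direction
        # combine on the Poincare ball instead of plain addition, then map back
        out_hyp = mobius_add(expmap0(h0, self.c), expmap0(delta, self.c), self.c)
        return logmap0(out_hyp, self.c)
```

In practice such an adapter would typically wrap the attention projection layers of a pretrained model; that usage, too, is an assumption here rather than something stated in the entry.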
no code implementations • 3 Jul 2024 • Yu Huang, Min Zhou, Menglin Yang, Zhen Wang, Muhan Zhang, Jie Wang, Hong Xie, Hao Wang, Defu Lian, Enhong Chen
Recent advancements in graph learning have revolutionized the way we understand and analyze data with complex structures.
2 code implementations • 1 Jul 2024 • Menglin Yang, Harshit Verma, Delvin Ce Zhang, Jiahong Liu, Irwin King, Rex Ying
Our experimental results confirm the effectiveness and efficiency of Hypformer across various datasets, demonstrating its potential as an effective and scalable solution for large-scale data representation and large models.
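The entry emphasizes scalability to large-scale data; a common route to linear-time attention is the kernelized formulation sketched below. This is a generic sketch of that trick only: Hypformer's hyperbolic-specific operations are not reproduced, and this is not claimed to be the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def linear_attention(q, k, v, eps=1e-6):
    """O(N) attention via a positive feature map (elu(x) + 1): aggregate
    key-value products once, then query against the aggregate."""
    q = F.elu(q) + 1.0
    k = F.elu(k) + 1.0
    kv = torch.einsum('bnd,bne->bde', k, v)                 # sum_n k_n v_n^T
    z = 1.0 / (torch.einsum('bnd,bd->bn', q, k.sum(dim=1)) + eps)
    return torch.einsum('bnd,bde,bn->bne', q, kv, z)        # normalized outputs
```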
1 code implementation • 17 Jun 2024 • Jiasheng Zhang, Jialin Chen, Menglin Yang, Aosong Feng, Shuang Liang, Jie Shao, Rex Ying
Moreover, we conduct extensive benchmark experiments on DTGB, evaluating 7 popular dynamic graph learning algorithms and their variants adapted to text attributes with LLM embeddings, along with 6 powerful large language models (LLMs).
1 code implementation • 15 Jun 2023 • Menglin Yang, Min Zhou, Rex Ying, Yankai Chen, Irwin King
To address this, we propose a simple yet effective method, hyperbolic informed embedding (HIE), which incorporates cost-free hierarchical information deduced from the hyperbolic distance of a node to the origin (i.e., the induced hyperbolic norm) to advance existing hyperbolic learning models.
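The hierarchical signal the entry refers to is the hyperbolic distance from a node embedding to the origin. A minimal sketch of that quantity in the Poincare ball is given below; the choice of model and the curvature handling are assumptions for illustration, not the paper's exact setup.

```python
import torch

def hyperbolic_norm(x, c=1.0, eps=1e-7):
    """Distance from a point in the Poincare ball (curvature -c) to the origin,
    i.e. the induced hyperbolic norm; points near the origin tend to correspond
    to higher levels of a latent hierarchy."""
    sqrt_c = c ** 0.5
    norm = x.norm(dim=-1).clamp(max=(1 - eps) / sqrt_c)   # stay strictly inside the ball
    return (2.0 / sqrt_c) * torch.atanh(sqrt_c * norm)
```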
no code implementations • 8 May 2023 • Yankai Chen, Yifei Zhang, Menglin Yang, Zixing Song, Chen Ma, Irwin King
Maximizing user-item engagement based on vectorized embeddings is a standard procedure in recent recommender models.
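As a concrete illustration of what maximizing engagement over vectorized embeddings typically means, here is a textbook dot-product score with a BPR-style ranking loss; it is a generic sketch, not this paper's model.

```python
import torch

def bpr_loss(user_emb, pos_item_emb, neg_item_emb):
    """Bayesian Personalized Ranking: push the score of an observed (positive)
    user-item pair above that of an unobserved (negative) pair."""
    pos_scores = (user_emb * pos_item_emb).sum(dim=-1)   # dot-product engagement score
    neg_scores = (user_emb * neg_item_emb).sum(dim=-1)
    return -torch.log(torch.sigmoid(pos_scores - neg_scores) + 1e-10).mean()
```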
1 code implementation • 4 Dec 2022 • Menglin Yang, Min Zhou, Lujia Pan, Irwin King
Tree-like structures, encompassing hierarchical structures and power-law distributions, are extensively prevalent in real-world applications, including recommendation systems, ecosystems, financial networks, and social networks.
no code implementations • 8 Nov 2022 • Min Zhou, Menglin Yang, Lujia Pan, Irwin King
We first give a brief introduction to graph representation learning as well as some preliminary Riemannian and hyperbolic geometry.
1 code implementation • 19 Jul 2022 • Menglin Yang, Zhihao Li, Min Zhou, Jiahong Liu, Irwin King
The results reveal that (1) tail items receive more emphasis in hyperbolic space than in Euclidean space, but there is still ample room for improvement; (2) head items receive only modest attention in hyperbolic space, which could be considerably improved; and (3) nonetheless, hyperbolic models show more competitive performance than Euclidean models.
no code implementations • 27 Apr 2022 • Jiahong Liu, Min Zhou, Philippe Fournier-Viger, Menglin Yang, Lujia Pan, Mourad Nouioua
Graphs are a popular data type found in many domains. However, there are generally two limitations that hinder the practical use of existing methods: (1) they have multiple parameters that are hard to set but greatly influence results, and (2) they generally focus on identifying complex subgraphs while ignoring relationships between the attributes of nodes.
1 code implementation • 18 Apr 2022 • Bisheng Li, Min Zhou, Shengzhong Zhang, Menglin Yang, Defu Lian, Zengfeng Huang
Regarding missing-link inference in diverse networks, we revisit link prediction techniques and identify the importance of both structural and attribute information.
1 code implementation • 18 Apr 2022 • Menglin Yang, Min Zhou, Jiahong Liu, Defu Lian, Irwin King
With its negative curvature and metric properties, hyperbolic space offers ample room for learning embeddings and is well suited to data with tree-like structures.
1 code implementation • 16 Apr 2022 • Min Zhou, Bisheng Li, Menglin Yang, Lujia Pan
Link prediction is a key problem for network-structured data, attracting considerable research efforts owing to its diverse applications.
1 code implementation • 28 Feb 2022 • Menglin Yang, Min Zhou, Zhihao Li, Jiahong Liu, Lujia Pan, Hui Xiong, Irwin King
Graph neural networks generalize conventional neural networks to graph-structured data and have received widespread attention due to their impressive representation ability.
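For readers unfamiliar with how graph neural networks generalize standard layers, a minimal GCN-style layer (normalized neighborhood aggregation followed by a shared linear transform) is sketched below; this is the standard textbook formulation, not this paper's specific architecture.

```python
import torch
import torch.nn as nn

class SimpleGCNLayer(nn.Module):
    """One graph convolution: symmetrically normalized neighborhood
    aggregation followed by a shared linear transform and nonlinearity."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        # adj: dense (N, N) adjacency; add self-loops, then normalize D^-1/2 A D^-1/2
        a = adj + torch.eye(adj.size(0), device=adj.device)
        deg_inv_sqrt = a.sum(dim=-1).clamp_min(1e-12).pow(-0.5)
        a_norm = deg_inv_sqrt.unsqueeze(-1) * a * deg_inv_sqrt.unsqueeze(0)
        return torch.relu(self.linear(a_norm @ x))
```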
no code implementations • 21 Jan 2022 • Jiahong Liu, Menglin Yang, Min Zhou, Shanshan Feng, Philippe Fournier-Viger
Inspired by recent advances in self-supervised learning, in this study we attempt to enhance the representation power of hyperbolic graph models by drawing on the advantages of contrastive learning.
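The entry combines hyperbolic graph models with contrastive learning; a generic InfoNCE-style objective over two augmented views is sketched below. The cosine-similarity logits are a simplifying assumption (a hyperbolic distance could be substituted), and this is not claimed to be the paper's exact loss.

```python
import torch
import torch.nn.functional as F

def info_nce(z1, z2, temperature=0.2):
    """Contrastive loss over two augmented views of the same nodes: each node's
    view-1 embedding should be closest to its own view-2 embedding."""
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temperature                    # cosine similarities
    labels = torch.arange(z1.size(0), device=z1.device)   # positives on the diagonal
    return F.cross_entropy(logits, labels)
```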
no code implementations • 14 Aug 2021 • Yankai Chen, Menglin Yang, Yingxue Zhang, Mengchen Zhao, Ziqiao Meng, Jianye Hao, Irwin King
Aiming to alleviate data sparsity and cold-start problems of traditional recommender systems, incorporating knowledge graphs (KGs) to supplement auxiliary information has recently gained considerable attention.
1 code implementation • 8 Jul 2021 • Menglin Yang, Min Zhou, Marcus Kalander, Zengfeng Huang, Irwin King
To explore these properties of a complex temporal network, we propose a hyperbolic temporal graph network (HTGN) that fully takes advantage of the exponential capacity and hierarchical awareness of hyperbolic geometry.
1 code implementation • 27 Feb 2021 • Menglin Yang, Ziqiao Meng, Irwin King
In fact, this smoothing technique not only encourages must-link node pairs to move closer but also pushes cannot-link pairs to shrink together, which can cause severe feature shrinking or oversmoothing, especially when graph convolutions are stacked over multiple layers or steps.
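The shrinking effect described here can be seen in a tiny experiment: repeatedly applying normalized neighborhood averaging drives node features toward a nearly common value. The random graph and feature values below are made up purely for illustration.

```python
import numpy as np

# Toy illustration of oversmoothing: a small random graph, random features,
# and repeated normalized neighborhood averaging (the smoothing step).
rng = np.random.default_rng(0)
n = 8
adj = (rng.random((n, n)) < 0.4).astype(float)
adj = np.maximum(adj, adj.T)            # make the graph undirected
np.fill_diagonal(adj, 1.0)              # add self-loops

deg_inv_sqrt = 1.0 / np.sqrt(adj.sum(axis=1))
smooth = deg_inv_sqrt[:, None] * adj * deg_inv_sqrt[None, :]

x = rng.normal(size=(n, 4))
for steps in [1, 2, 4, 8, 16, 32]:
    x_k = np.linalg.matrix_power(smooth, steps) @ x
    print(steps, float(x_k.std()))      # feature diversity across nodes collapses
```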