Search Results for author: Yuhua Li

Found 6 papers, 4 papers with code

A Survey on Self-Supervised Pre-Training of Graph Foundation Models: A Knowledge-Based Perspective

1 code implementation · 24 Mar 2024 · Ziwen Zhao, Yuhua Li, Yixiong Zou, Ruixuan Li, Rui Zhang

Graph self-supervised learning is now a go-to method for pre-training graph foundation models, including graph neural networks, graph transformers, and more recent large language model (LLM)-based graph models.

Language Modelling · Large Language Model · +1

Flatten Long-Range Loss Landscapes for Cross-Domain Few-Shot Learning

no code implementations · 1 Mar 2024 · Yixiong Zou, Yicong Liu, Yiman Hu, Yuhua Li, Ruixuan Li

To enhance the transferability and facilitate fine-tuning, we introduce a simple yet effective approach to achieve long-range flattening of the minima in the loss landscape.

Cross-Domain Few-Shot Learning

Masked Graph Autoencoder with Non-discrete Bandwidths

1 code implementation · 6 Feb 2024 · Ziwen Zhao, Yuhua Li, Yixiong Zou, Jiliang Tang, Ruixuan Li

Inspired by these understandings, we explore non-discrete edge masks, which are sampled from a continuous and dispersive probability distribution instead of the discrete Bernoulli distribution.

Blocking · Link Prediction · +2
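The contrast between discrete Bernoulli masking and the continuous, dispersive masks described above can be sketched as follows. This is a minimal illustration only: the Beta distribution and the helper names are assumptions for demonstration, not the paper's actual formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def bernoulli_edge_mask(num_edges, p=0.5):
    # Conventional discrete masking: each edge is either fully kept (1)
    # or fully dropped (0), sampled from a Bernoulli distribution.
    return rng.binomial(1, p, size=num_edges).astype(float)

def non_discrete_edge_mask(num_edges, alpha=0.5, beta=0.5):
    # Illustrative continuous alternative: sample edge "bandwidths" in (0, 1)
    # from a dispersive Beta distribution, so every edge is partially
    # attenuated rather than binarized. (The paper's exact distribution
    # may differ; Beta(0.5, 0.5) is chosen here only because it spreads
    # mass toward both ends of the interval.)
    return rng.beta(alpha, beta, size=num_edges)

edge_weights = np.ones(8)  # toy unweighted graph with 8 edges
discrete = edge_weights * bernoulli_edge_mask(8)        # values in {0, 1}
continuous = edge_weights * non_discrete_edge_mask(8)   # values in (0, 1)
```

The practical difference is that the discrete mask removes information from masked edges entirely, while the continuous mask rescales each edge, keeping a graded signal for reconstruction.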

CSGCL: Community-Strength-Enhanced Graph Contrastive Learning

1 code implementation · 8 May 2023 · Han Chen, Ziwen Zhao, Yuhua Li, Yixiong Zou, Ruixuan Li, Rui Zhang

Graph Contrastive Learning (GCL) is an effective way to learn generalized graph representations in a self-supervised manner, and has grown rapidly in recent years.

Attribute · Contrastive Learning · +3

Margin-Based Few-Shot Class-Incremental Learning with Class-Level Overfitting Mitigation

1 code implementation · 10 Oct 2022 · Yixiong Zou, Shanghang Zhang, Yuhua Li, Ruixuan Li

Few-shot class-incremental learning (FSCIL) is designed to incrementally recognize novel classes from only a few training samples after (pre-)training on base classes with sufficient samples; it therefore targets both base-class performance and novel-class generalization.

Few-Shot Class-Incremental Learning · Incremental Learning

Gradient Scheduling with Global Momentum for Non-IID Data Distributed Asynchronous Training

no code implementations · 21 Feb 2019 · Chengjie Li, Ruixuan Li, Haozhao Wang, Yuhua Li, Pan Zhou, Song Guo, Keqin Li

Distributed asynchronous offline training has received widespread attention in recent years because of its high performance on large-scale data and complex models.

Scheduling
