no code implementations • 21 Dec 2023 • YiXuan Wang, Shuangyin Li, Shimin Di, Lei Chen
The single-cell RNA sequencing (scRNA-seq) technology enables researchers to study complex biological systems and diseases with high resolution.
no code implementations • 17 Dec 2023 • Jiachuan Wang, Shimin Di, Lei Chen, Charles Wang Wai Ng
We validate our conjecture that monosemanticity affects performance at different model scales on a variety of neural networks and benchmark datasets across areas including language, image, and physics simulation tasks.
1 code implementation • 14 Aug 2023 • Zhili Wang, Shimin Di, Lei Chen, Xiaofang Zhou
Given a pre-trained GNN, we propose to search to fine-tune pre-trained graph neural networks for graph-level tasks (S2PGNN), which adaptively designs a suitable fine-tuning framework for the given labeled data on the downstream task.
1 code implementation • ICCV 2023 • Jiachuan Wang, Shimin Di, Lei Chen, Charles Wang Wai Ng
However, such a method is highly sensitive to the standard deviation \sigma_n of the noise injected into clean images, where \sigma_n is inaccessible without access to the clean images themselves.
1 code implementation • NeurIPS 2021 • Zhili Wang, Shimin Di, Lei Chen
However, existing AutoGNN works mainly model and leverage the link information in graphs implicitly, which is not well regularized for the link prediction task and limits the performance of AutoGNN on other graph tasks.
no code implementations • 29 Sep 2021 • Shimin Di, Lei Chen
In this paper, we first unify a search space of message functions that enables both structures and operators to be searchable.
3 code implementations • 22 Apr 2021 • Shimin Di, Quanming Yao, Yongqi Zhang, Lei Chen
The scoring function, which measures the plausibility of triplets in knowledge graphs (KGs), is key to ensuring strong performance of KG embedding, and its design is an important problem in the literature.
1 code implementation • 21 Apr 2021 • Shimin Di, Quanming Yao, Lei Chen
Recently, tensor decomposition methods have been introduced for N-ary relational data and have become the state of the art in embedding learning.