Search Results for author: Zimu Li

Found 7 papers, 3 papers with code

Technical Report: The Graph Spectral Token -- Enhancing Graph Transformers with Spectral Information

2 code implementations • 8 Apr 2024 • Zihan Pengmei, Zimu Li

Graph Transformers have emerged as a powerful alternative to Message-Passing Graph Neural Networks (MP-GNNs) to address limitations such as over-squashing of information exchange.

Inductive Bias
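
The snippet above does not spell out how the spectral token is constructed. Purely as an illustration (assuming "spectral information" means the eigenvalues of the normalized graph Laplacian, with arbitrary dimensions), the NumPy sketch below condenses a graph's spectrum into a single token vector that could be prepended to a transformer's node-token sequence; it is not the paper's implementation.

```python
# Minimal sketch (not the paper's implementation): summarize a graph's
# Laplacian spectrum into a single "spectral token" vector that a graph
# transformer could prepend to its node tokens.
import numpy as np

def spectral_token(adj: np.ndarray, k: int = 8) -> np.ndarray:
    """Return a length-k vector of the smallest normalized-Laplacian eigenvalues."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(np.clip(deg, 1e-12, None)))
    lap = np.eye(adj.shape[0]) - d_inv_sqrt @ adj @ d_inv_sqrt  # normalized Laplacian
    eigvals = np.linalg.eigvalsh(lap)                           # ascending order
    token = np.zeros(k)
    token[: min(k, eigvals.size)] = eigvals[:k]                 # pad/truncate to fixed size
    return token

# Toy 4-node path graph with placeholder node features
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
node_tokens = np.random.randn(4, 8)
tokens = np.vstack([spectral_token(adj, k=8), node_tokens])  # [spectral token; node tokens]
print(tokens.shape)  # (5, 8)
```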

Transformers are efficient hierarchical chemical graph learners

1 code implementation • 2 Oct 2023 • Zihan Pengmei, Zimu Li, Chih-chan Tien, Risi Kondor, Aaron R. Dinner

We demonstrate SubFormer on benchmarks for predicting molecular properties from chemical structures and show that it is competitive with state-of-the-art graph transformers at a fraction of the computational cost, with training times on the order of minutes on a consumer-grade graphics card.

Graph Representation Learning
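
The snippet reports benchmark results rather than the architecture itself. Purely to illustrate the general idea of a hierarchical chemical graph learner (pool atom features into a few substructure tokens, then run a standard transformer over the short sequence), here is a minimal PyTorch sketch; the pooling rule, dimensions, and module choices are assumptions for illustration, not details of SubFormer.

```python
# Generic sketch of a hierarchical graph learner (NOT SubFormer's actual design):
# pool node features into a few substructure tokens, then apply a standard
# transformer encoder over the short token sequence.
import torch
import torch.nn as nn

num_nodes, d = 20, 32
node_feats = torch.randn(num_nodes, d)

# Hypothetical hard assignment of atoms to 5 substructures (a real model would
# derive this from chemistry, e.g. rings or fragments).
assign = torch.arange(num_nodes) % 5
cluster_tokens = torch.stack(
    [node_feats[assign == c].mean(dim=0) for c in range(5)]
)

encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=d, nhead=4, batch_first=True),
    num_layers=2,
)
out = encoder(cluster_tokens.unsqueeze(0))  # (1, 5, d); pool `out` for a property prediction
print(out.shape)
```

Working on a handful of substructure tokens instead of all atoms is what keeps the sequence short and the attention cost low; how SubFormer actually forms and uses those tokens is not taken from this snippet.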

R-Mixup: Riemannian Mixup for Biological Networks

no code implementations • 5 Jun 2023 • Xuan Kan, Zimu Li, Hejie Cui, Yue Yu, Ran Xu, Shaojun Yu, Zilong Zhang, Ying Guo, Carl Yang

Biological networks are commonly used in biomedical and healthcare domains to effectively model the structure of complex biological systems with interactions linking biological entities.

Data Augmentation
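
The snippet above is motivational, so to make the "Riemannian mixup" idea concrete: vanilla mixup interpolates two samples linearly, whereas a Riemannian variant for SPD connectivity matrices can interpolate along a geodesic instead. The sketch below uses the log-Euclidean geodesic purely as an assumed choice for illustration; the metric actually used by R-Mixup is not stated in this snippet.

```python
# Illustrative sketch of mixup on SPD matrices (e.g. correlation-based biological
# networks): interpolate along a log-Euclidean geodesic instead of the straight
# line used by vanilla mixup. Assumed metric; not the paper's exact construction.
import numpy as np

def sym_logm(spd: np.ndarray) -> np.ndarray:
    w, v = np.linalg.eigh(spd)
    return (v * np.log(w)) @ v.T          # V diag(log w) V^T

def sym_expm(sym: np.ndarray) -> np.ndarray:
    w, v = np.linalg.eigh(sym)
    return (v * np.exp(w)) @ v.T          # V diag(exp w) V^T

def random_spd(n: int, rng: np.random.Generator) -> np.ndarray:
    m = rng.standard_normal((n, n))
    return m @ m.T + n * np.eye(n)        # symmetric positive definite

rng = np.random.default_rng(0)
A, B = random_spd(5, rng), random_spd(5, rng)
lam = rng.beta(1.0, 1.0)                  # mixup coefficient

euclidean_mix = lam * A + (1 - lam) * B                                  # vanilla mixup
riemannian_mix = sym_expm(lam * sym_logm(A) + (1 - lam) * sym_logm(B))   # geodesic mixup

print(np.linalg.eigvalsh(riemannian_mix).min() > 0)  # True: result stays SPD
```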

Unifying O(3) Equivariant Neural Networks Design with Tensor-Network Formalism

no code implementations • 14 Nov 2022 • Zimu Li, Zihan Pengmei, Han Zheng, Erik Thiede, Junyu Liu, Risi Kondor

Equivariant graph neural networks are a standard approach to such problems, with one of the most successful methods employing tensor products between various tensors that transform under the spatial group.

Tensor Networks
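
As a concrete illustration of the property the abstract alludes to (tensor products of quantities that transform under the spatial group), the generic NumPy check below verifies that the tensor (Kronecker) product of two vectors is rotation-equivariant: rotating each factor by R equals acting on the product with R ⊗ R. This is a standard identity, not code from the paper.

```python
# Equivariance of the tensor product under rotations: rotating each factor by R
# is the same as acting on the product with R ⊗ R. Generic check, not paper code.
import numpy as np

rng = np.random.default_rng(0)

def random_rotation(rng: np.random.Generator) -> np.ndarray:
    """Random 3x3 rotation matrix (element of SO(3)) via QR decomposition."""
    q, r = np.linalg.qr(rng.standard_normal((3, 3)))
    q *= np.where(np.diag(r) < 0, -1.0, 1.0)   # fix column signs
    if np.linalg.det(q) < 0:
        q[:, 0] *= -1                          # ensure determinant +1
    return q

R = random_rotation(rng)
u, v = rng.standard_normal(3), rng.standard_normal(3)

lhs = np.kron(R, R) @ np.kron(u, v)   # act on the rank-2 tensor u ⊗ v
rhs = np.kron(R @ u, R @ v)           # rotate each factor, then take the product
print(np.allclose(lhs, rhs))          # True: the tensor product is rotation-equivariant
```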

On the Super-exponential Quantum Speedup of Equivariant Quantum Machine Learning Algorithms with SU($d$) Symmetry

no code implementations • 15 Jul 2022 • Han Zheng, Zimu Li, Junyu Liu, Sergii Strelchuk, Risi Kondor

We introduce a framework of equivariant convolutional algorithms tailored to a range of machine-learning tasks on physical systems with arbitrary SU($d$) symmetries.

BIG-bench Machine Learning • Quantum Machine Learning
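
To make the symmetry constraint concrete for the qubit case ($d = 2$): the two-qubit SWAP gate commutes with any collective rotation U ⊗ U, so gates generated by SWAP respect the global symmetry and can serve as equivariant building blocks. The NumPy check below is a generic illustration under that $d = 2$ assumption, not the paper's algorithm.

```python
# Generic illustration for d = 2: the two-qubit SWAP gate commutes with any
# collective single-qubit rotation U ⊗ U, so gates generated by SWAP respect
# the global SU(2) symmetry. Not code from the paper.
import numpy as np
from scipy.linalg import expm

SWAP = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]], dtype=complex)

rng = np.random.default_rng(0)
h = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))
h = (h + h.conj().T) / 2                 # random 2x2 Hermitian generator
U = expm(-1j * h)                        # random single-qubit unitary
UU = np.kron(U, U)                       # collective action on both qubits

print(np.allclose(SWAP @ UU, UU @ SWAP))   # True: [SWAP, U ⊗ U] = 0

# Hence a "convolution" gate generated by SWAP is also equivariant:
gate = expm(-1j * 0.37 * SWAP)
print(np.allclose(gate @ UU, UU @ gate))   # True
```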

Speeding up Learning Quantum States through Group Equivariant Convolutional Quantum Ansätze

1 code implementation • 14 Dec 2021 • Han Zheng, Zimu Li, Junyu Liu, Sergii Strelchuk, Risi Kondor

We develop a theoretical framework for $S_n$-equivariant convolutional quantum circuits with SU($d$) symmetry, building on and significantly generalizing Jordan's Permutational Quantum Computing (PQC) formalism based on Schur-Weyl duality connecting both SU($d$) and $S_n$ actions on qudits.

BIG-bench Machine Learning • Quantum Machine Learning
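
The Schur-Weyl fact behind this abstract is that permutations of the qudits commute with the collective action V^{⊗n}, so circuits generated by permutation (SWAP) operators are automatically SU($d$)-equivariant. The sketch below checks this for $n = 3$ qubits with a small layer built from SWAP generators; the layer structure and angles are arbitrary illustrations, not the paper's ansatz.

```python
# Generic check of the Schur-Weyl commutation: for n = 3 qubits, SWAPs on
# adjacent pairs commute with the collective action V ⊗ V ⊗ V, so circuits
# generated by them are equivariant. Illustration only, not the paper's circuits.
import numpy as np
from scipy.linalg import expm

SWAP = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

swap_12 = np.kron(SWAP, I2)   # permute qubits 1,2 of a 3-qubit register
swap_23 = np.kron(I2, SWAP)   # permute qubits 2,3

rng = np.random.default_rng(1)
h = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))
h = (h + h.conj().T) / 2
V = expm(-1j * h)                         # random single-qubit unitary
V3 = np.kron(np.kron(V, V), V)            # collective (diagonal) action

# A small "equivariant layer": alternating exponentials of the SWAP generators.
layer = expm(-1j * 0.3 * swap_12) @ expm(-1j * 0.7 * swap_23)
print(np.allclose(layer @ V3, V3 @ layer))   # True: the layer commutes with V ⊗ V ⊗ V
```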

Towards a variational Jordan-Lee-Preskill quantum algorithm

no code implementations • 12 Sep 2021 • Junyu Liu, Zimu Li, Han Zheng, Xiao Yuan, Jinzhao Sun

Rapid developments of quantum information technology show promising opportunities for simulating quantum field theory in near-term quantum devices.

Computational Efficiency
