no code implementations • 9 Oct 2024 • Hamed Sajadinia, Ali Aghdaei, Zhuo Feng
State-of-the-art hypergraph partitioners utilize a multilevel paradigm to construct progressively coarser hypergraphs across multiple layers, guiding cut refinements at each level of the hierarchy.
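As a rough illustration of the multilevel paradigm only (not this paper's partitioner), the sketch below coarsens an ordinary weighted graph standing in for a hypergraph by contracting matchings, bisects the coarsest level, and refines the projected cut on the way back up; the matching and Kernighan-Lin refinement heuristics are generic placeholders.

```python
# Minimal sketch of the multilevel coarsen -> partition -> uncoarsen/refine scheme.
# All heuristics here are simplified stand-ins, not the paper's method.
import networkx as nx
from networkx.algorithms.community import kernighan_lin_bisection

def coarsen(G):
    """One coarsening level: contract a maximal matching of edges."""
    mapping = {}
    for u, v in nx.maximal_matching(G):
        mapping[u] = mapping[v] = u               # merge v into its matched partner u
    for u in G.nodes:
        mapping.setdefault(u, u)                  # unmatched nodes map to themselves
    H = nx.Graph()
    H.add_nodes_from(set(mapping.values()))
    for a, b, d in G.edges(data=True):
        ca, cb = mapping[a], mapping[b]
        if ca != cb:
            w = d.get("weight", 1) + (H[ca][cb]["weight"] if H.has_edge(ca, cb) else 0)
            H.add_edge(ca, cb, weight=w)          # accumulate weights of parallel edges
    return H, mapping

def multilevel_bisect(G, coarsest_size=8):
    """Coarsen hierarchically, bisect the coarsest graph, then uncoarsen and refine."""
    G = G.copy()
    nx.set_edge_attributes(G, 1, "weight")        # ensure every edge carries a weight
    levels = []                                   # (fine graph, fine->coarse mapping) per level
    while G.number_of_nodes() > coarsest_size:
        H, mapping = coarsen(G)
        if H.number_of_nodes() == G.number_of_nodes():
            break                                 # nothing left to contract
        levels.append((G, mapping))
        G = H
    part = kernighan_lin_bisection(G, weight="weight")        # initial cut on coarsest graph
    for G_fine, mapping in reversed(levels):                  # project the cut up and refine it
        projected = ({u for u in G_fine if mapping[u] in part[0]},
                     {u for u in G_fine if mapping[u] in part[1]})
        part = kernighan_lin_bisection(G_fine, partition=projected, weight="weight")
    return part

G = nx.random_regular_graph(4, 64, seed=0)
A, B = multilevel_bisect(G)
print(len(A), len(B), nx.cut_size(G, A, B))
```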
no code implementations • 10 Jul 2024 • John Anticev, Ali Aghdaei, Wuxinlin Cheng, Zhuo Feng
SGM-PINN is a graph-based importance sampling framework to improve the training efficacy of Physics-Informed Neural Networks (PINNs) on parameterized problems.
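For intuition only, the sketch below shows generic importance sampling of collocation points: points are drawn with probability proportional to a per-point score instead of uniformly, and the sampled losses are reweighted to stay unbiased. The residual-based score and the name `pde_residual` are hypothetical placeholders; SGM-PINN's actual graph-based scoring is not reproduced here.

```python
# Generic importance sampling of training points, with a placeholder score function.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=10_000)            # candidate collocation points

def pde_residual(x):
    """Placeholder score; a real PINN would evaluate its PDE residual here."""
    return np.abs(np.sin(8 * np.pi * x)) + 1e-3   # pretend error concentrates in ripples

scores = pde_residual(x)
probs = scores / scores.sum()

idx = rng.choice(len(x), size=256, replace=False, p=probs)   # importance-sampled mini-batch
batch = x[idx]
loss_weights = 1.0 / (len(x) * probs[idx])        # 1/(N*p_i) keeps the batch loss unbiased
print(batch[:5], loss_weights[:5])
```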
no code implementations • 15 Jun 2024 • Soumen Sikder Shuvo, Ali Aghdaei, Zhuo Feng
This paper presents a spectral framework for assessing the generalization and stability of Graph Neural Networks (GNNs) by introducing a Graph Geodesic Distance (GGD) metric.
no code implementations • 27 Feb 2024 • Corby Rosset, Ho-Lam Chung, Guanghui Qin, Ethan C. Chau, Zhuo Feng, Ahmed Awadallah, Jennifer Neville, Nikhil Rao
We show that users spend a lot of "effort" on these questions in terms of signals like clicks and session length, and that they are also challenging for GPT-4.
1 code implementation • 26 Feb 2024 • Ali Aghdaei, Zhuo Feng
This work presents inGRASS, a novel algorithm designed for incremental spectral sparsification of large undirected graphs.
no code implementations • 13 Feb 2024 • Wuxinlin Cheng, Chenhui Deng, Ali Aghdaei, Zhiru Zhang, Zhuo Feng
Modern graph neural networks (GNNs) can be sensitive to changes in the input graph structure and node features, potentially resulting in unpredictable behavior and degraded performance.
no code implementations • 5 Jan 2024 • Xiaoxue Han, Zhuo Feng, Yue Ning
Continual learning on graphs tackles the problem of training a graph neural network (GNN) where graph data arrive in a streaming fashion and the model tends to forget knowledge from previous tasks when updating with new data.
no code implementations • 9 Feb 2023 • Ying Zhang, Zhiqiang Zhao, Zhuo Feng
This work introduces a highly scalable spectral graph densification framework (SGL) for learning resistor networks from linear measurements, such as node voltages and currents.
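For readers unfamiliar with the measurement model, here is a small worked example of what a "linear measurement" of a resistor network looks like: with conductance Laplacian L, injected nodal currents i and node voltages v satisfy L v = i. The 4-node network and values are illustrative only; SGL's learning procedure itself is not shown.

```python
# The linear measurement model behind resistor-network learning: L v = i.
import numpy as np

# conductances (1/ohm) on the edges of a toy 4-node network
edges = {(0, 1): 2.0, (1, 2): 1.0, (2, 3): 0.5, (0, 3): 1.5}

L = np.zeros((4, 4))
for (a, b), g in edges.items():
    L[a, a] += g; L[b, b] += g
    L[a, b] -= g; L[b, a] -= g

i = np.array([1.0, 0.0, 0.0, -1.0])   # inject 1 A at node 0, extract it at node 3
v = np.linalg.pinv(L) @ i             # voltages up to an additive constant (L is singular)

print(np.allclose(L @ v, i))          # True: this (v, i) pair is one linear measurement
print(v - v.min())                    # voltage profile referenced to the lowest node
```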
1 code implementation • 26 Oct 2022 • Ali Aghdaei, Zhuo Feng
This paper introduces a scalable algorithmic framework (HyperEF) for spectral coarsening (decomposition) of large-scale hypergraphs by exploiting hyperedge effective resistances.
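A simplified, non-scalable illustration of the resistance-driven idea is sketched below on an ordinary graph standing in for a hypergraph expansion: effective resistances are computed exactly via the Laplacian pseudoinverse (HyperEF instead estimates them efficiently for hyperedges), and the lowest-resistance edges are contracted first.

```python
# Resistance-guided coarsening on a small graph; exact pseudoinverse for clarity only.
import numpy as np
import networkx as nx

G = nx.karate_club_graph()
L = nx.laplacian_matrix(G).toarray().astype(float)
Lp = np.linalg.pinv(L)

def effective_resistance(u, v):
    e = np.zeros(len(G)); e[u], e[v] = 1.0, -1.0
    return float(e @ Lp @ e)

# low effective resistance ~ redundant connectivity, so contract those edges first
ranked = sorted(G.edges, key=lambda e: effective_resistance(*e))
coarse = G.copy()
for u, v in ranked[: len(ranked) // 4]:
    if coarse.has_node(u) and coarse.has_node(v):
        coarse = nx.contracted_nodes(coarse, u, v, self_loops=False)

print(G.number_of_nodes(), "->", coarse.number_of_nodes())
```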
1 code implementation • 30 Jan 2022 • Chenhui Deng, Xiuyu Li, Zhuo Feng, Zhiru Zhang
Graph neural networks (GNNs) have been increasingly deployed in various applications that involve learning on non-Euclidean data.
no code implementations • 29 Sep 2021 • Chenhui Deng, Xiuyu Li, Zhuo Feng, Zhiru Zhang
In this paper, we propose GARNET, a scalable spectral method to boost the adversarial robustness of GNN models for both homophilic and heterophilic graphs.
1 code implementation • 17 Aug 2021 • Ali Aghdaei, Zhiqiang Zhao, Zhuo Feng
To address the ever-increasing computational challenges, graph coarsening can potentially be applied to preprocess a given hypergraph by aggressively aggregating its vertices (nodes).
no code implementations • 16 Apr 2021 • Zhuo Feng
Through extensive experiments for a variety of real-world test cases, we show that the proposed approach is highly scalable for learning ultra-sparse resistor networks without sacrificing solution quality.
2 code implementations • 7 Feb 2021 • Wuxinlin Cheng, Chenhui Deng, Zhiqiang Zhao, Yaohui Cai, Zhiru Zhang, Zhuo Feng
A black-box spectral method is introduced for evaluating the adversarial robustness of a given machine learning (ML) model.
no code implementations • 1 Jan 2021 • Ying Zhang, Zhiqiang Zhao, Zhuo Feng
For the first time, we prove the existence of linear-sized spectral sparsifiers for general directed graphs, and we introduce a practically efficient, unified spectral graph sparsification approach that can sparsify real-world, large-scale directed and undirected graphs with guaranteed preservation of the original graph spectra.
no code implementations • 1 Jan 2021 • Zhuo Feng, Yongyu Wang, Zhiqiang Zhao
Graph learning plays an important role in many data mining and machine learning tasks, such as manifold learning, data representation and analysis, dimensionality reduction, data clustering, and visualization.
no code implementations • 17 Aug 2020 • Ying Zhang, Zhiqiang Zhao, Zhuo Feng
Recent spectral graph sparsification techniques have shown promising performance in accelerating many numerical and graph algorithms, such as iterative solvers for large sparse linear systems, spectral partitioning of undirected graphs, vectorless verification of power/thermal grids, and representation learning of large graphs.
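As background, the sketch below applies classical effective-resistance sampling in the spirit of Spielman and Srivastava, rather than any method from the papers listed here: edges are sampled with probability proportional to their leverage scores, reweighted, and the low end of the Laplacian spectrum is compared before and after. The sample count q is a heuristic choice.

```python
# Effective-resistance (leverage-score) sampling of a graph, then a spectrum check.
import numpy as np
import networkx as nx

rng = np.random.default_rng(1)
G = nx.gnm_random_graph(200, 2000, seed=1)
L = nx.laplacian_matrix(G).toarray().astype(float)
Lp = np.linalg.pinv(L)

edges = list(G.edges)
# leverage score of each edge = w_e * R_eff(e); all weights are 1 here
scores = np.array([Lp[u, u] + Lp[v, v] - 2 * Lp[u, v] for u, v in edges])
p = scores / scores.sum()
q = 6 * len(G)                                        # number of samples (heuristic)

H = nx.Graph(); H.add_nodes_from(G)
for idx in rng.choice(len(edges), size=q, p=p):       # sample with replacement, reweight
    u, v = edges[idx]
    w = 1.0 / (q * p[idx])
    H.add_edge(u, v, weight=H[u][v]["weight"] + w if H.has_edge(u, v) else w)

LH = nx.laplacian_matrix(H, weight="weight").toarray()
print(G.number_of_edges(), "->", H.number_of_edges())
# algebraic connectivity before vs. after (approximately preserved)
print(np.linalg.eigvalsh(L)[1], np.linalg.eigvalsh(LH)[1])
```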
no code implementations • 23 Nov 2019 • Yongyu Wang, Zhiqiang Zhao, Zhuo Feng
Learning meaningful graphs from data plays an important role in many data mining and machine learning tasks, such as data representation and analysis, dimensionality reduction, data clustering, and visualization.
1 code implementation • ICLR 2020 • Chenhui Deng, Zhiqiang Zhao, Yongyu Wang, Zhiru Zhang, Zhuo Feng
GraphZoom first performs graph fusion to generate a new graph that effectively encodes the topology of the original graph and the node attribute information.
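A heavily simplified illustration of the fusion step is sketched below: the structural adjacency is combined with a k-NN graph built from node attributes so the fused graph reflects both topology and features. The kNN construction and the 0.5 mixing weight are illustrative placeholders, not GraphZoom's actual fusion rule.

```python
# Fusing a graph's adjacency with a feature-derived kNN graph (illustrative only).
import numpy as np
import networkx as nx
from sklearn.neighbors import kneighbors_graph

rng = np.random.default_rng(0)
G = nx.karate_club_graph()
X = rng.normal(size=(G.number_of_nodes(), 16))        # stand-in node attributes

A_topo = nx.to_numpy_array(G)                          # structural adjacency
A_feat = kneighbors_graph(X, n_neighbors=5, mode="connectivity").toarray()
A_feat = np.maximum(A_feat, A_feat.T)                  # symmetrize the kNN graph

A_fused = A_topo + 0.5 * A_feat                        # fused, attribute-aware graph
G_fused = nx.from_numpy_array(A_fused)
print(G.number_of_edges(), "->", G_fused.number_of_edges())
```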
no code implementations • ACM 2017 • Zhuo Feng
In recent years, spectral graph sparsification techniques that can compute ultra-sparse graph proxies have been extensively studied for accelerating various numerical and graph-related applications.
no code implementations • 12 Oct 2017 • Yongyu Wang, Zhuo Feng
The eigendecomposition of nearest-neighbor (NN) graph Laplacian matrices is the main computational bottleneck in spectral clustering.
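To show where that bottleneck sits, here is a textbook spectral-clustering pipeline (not the paper's accelerated method); the dense eigendecomposition call is the step whose cost dominates as the dataset grows.

```python
# Standard spectral clustering: kNN graph -> Laplacian -> eigenvectors -> k-means.
import numpy as np
from scipy.sparse import csgraph
from sklearn.neighbors import kneighbors_graph
from sklearn.cluster import KMeans
from sklearn.datasets import make_moons

X, _ = make_moons(n_samples=500, noise=0.05, random_state=0)

A = kneighbors_graph(X, n_neighbors=10, mode="connectivity")
A = A.maximum(A.T)                                    # symmetric nearest-neighbor graph
L = csgraph.laplacian(A, normed=True)

k = 2
# the bottleneck: eigendecomposition of the NN-graph Laplacian
vals, vecs = np.linalg.eigh(L.toarray())
embedding = vecs[:, :k]                               # bottom-k eigenvectors
labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(embedding)
print(np.bincount(labels))
```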