no code implementations • 5 Jan 2024 • Luana Ruiz, Luiz F. O. Chamon, Alejandro Ribeiro
This technical note addresses an issue [arXiv:2310.14683] with the proof (but not the statement) of [arXiv:2003.05030, Proposition 4].
no code implementations • 17 Nov 2023 • Thien Le, Luana Ruiz, Stefanie Jegelka
We prove a Poincaré inequality for graphon signals and show that complements of node subsets satisfying this inequality are unique sampling sets for Paley-Wiener spaces of graphon signals.
no code implementations • 17 Oct 2023 • Yeganeh Alimohammadi, Luana Ruiz, Amin Saberi
We propose a theoretical framework for training Graph Neural Networks (GNNs) on large input graphs via training on small, fixed-size sampled subgraphs.
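The core idea of training on small sampled subgraphs can be sketched as follows. Uniform node sampling with induced subgraphs is an illustrative choice here, not necessarily the sampling scheme analyzed in the paper:

```python
import numpy as np

def sample_induced_subgraph(A, k, rng):
    """Uniformly sample k nodes and return the induced subgraph's adjacency."""
    nodes = rng.choice(A.shape[0], size=k, replace=False)
    return A[np.ix_(nodes, nodes)], nodes

# Toy large graph: symmetric Erdos-Renyi with n = 200 nodes, no self-loops.
rng = np.random.default_rng(0)
n = 200
A = (rng.random((n, n)) < 0.05).astype(float)
A = np.triu(A, 1)
A = A + A.T

# Training would iterate: draw a small fixed-size subgraph, take a gradient
# step of the GNN loss on it, repeat with a fresh sample.
A_sub, nodes = sample_induced_subgraph(A, k=32, rng=rng)
```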
no code implementations • 29 May 2023 • Zhiyang Wang, Luana Ruiz, Alejandro Ribeiro
This paper studies the relationship between a graph neural network (GNN) and a manifold neural network (MNN) when the graph is constructed from a set of points sampled from the manifold, thus encoding geometric information.
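A common way to encode the manifold's geometry in a graph is a Gaussian-kernel construction on the sampled points; the sketch below uses that construction for illustration (the kernel and normalization in the paper may differ):

```python
import numpy as np

def geometric_graph(X, eps):
    """Dense Gaussian-kernel adjacency from points X sampled on a manifold:
    W_ij = exp(-||x_i - x_j||^2 / eps), with a zero diagonal."""
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise sq. distances
    W = np.exp(-D2 / eps)
    np.fill_diagonal(W, 0.0)
    return W

# Points sampled from the unit circle, a 1-D manifold embedded in R^2.
rng = np.random.default_rng(1)
theta = rng.uniform(0, 2 * np.pi, size=100)
X = np.stack([np.cos(theta), np.sin(theta)], axis=1)
W = geometric_graph(X, eps=0.1)
```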
no code implementations • 25 Jan 2023 • Sanjukta Krishnagopal, Luana Ruiz
We use graphons to define limit objects -- graphon NNs for GNNs and graphon NTKs for GNTKs -- and prove that, on a sequence of graphs, the GNTKs converge to the graphon NTK.
no code implementations • 20 Nov 2022 • Zhiyang Wang, Luana Ruiz, Alejandro Ribeiro
The increasing availability of geometric data has motivated the need for information processing over non-Euclidean domains modeled as manifolds.
1 code implementation • 6 Nov 2022 • Luana Ruiz, Ningyuan Huang, Soledad Villar
In this work we propose a random graph model that can produce graphs at different levels of sparsity.
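One standard way to obtain graphs at controllable sparsity from a graphon is to scale the edge probabilities by a sparsity factor; the sketch below uses that construction for illustration, and the paper's model may differ in its details:

```python
import numpy as np

def sample_graph(W, n, rho, rng):
    """Sample an n-node graph from graphon W with sparsity factor rho.

    Edges: A_ij ~ Bernoulli(min(1, rho * W(u_i, u_j))), u_i ~ Uniform[0, 1].
    Smaller rho yields sparser graphs."""
    u = rng.uniform(size=n)
    P = np.minimum(1.0, rho * W(u[:, None], u[None, :]))
    A = (rng.random((n, n)) < P).astype(float)
    A = np.triu(A, 1)       # keep one triangle, drop self-loops...
    return A + A.T          # ...then symmetrize

W = lambda x, y: x * y      # a simple separable graphon
rng = np.random.default_rng(2)
A_dense = sample_graph(W, n=300, rho=1.0, rng=rng)
A_sparse = sample_graph(W, n=300, rho=0.1, rng=rng)
```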
no code implementations • 27 Oct 2022 • Juan Cervino, Luana Ruiz, Alejandro Ribeiro
In this paper, we propose to learn GNNs on very large graphs by leveraging the limit object of a sequence of growing graphs, the graphon.
no code implementations • 1 Oct 2022 • Zhiyang Wang, Luana Ruiz, Alejandro Ribeiro
Deep neural network architectures have proved to be a powerful tool for solving problems involving data residing on manifolds.
no code implementations • 9 Dec 2021 • Luana Ruiz, Luiz F. O. Chamon, Alejandro Ribeiro
In this paper, we study the problem of training GNNs on graphs of moderate size and transferring them to large-scale graphs.
no code implementations • 10 Oct 2021 • Zhiyang Wang, Luana Ruiz, Alejandro Ribeiro
Hence, in this paper, we analyze the stability properties of convolutional neural networks on manifolds to understand the stability of GNNs on large graphs.
no code implementations • 10 Oct 2021 • Zhiyang Wang, Luana Ruiz, Mark Eisen, Alejandro Ribeiro
We consider the problem of resource allocation in large-scale wireless networks.
no code implementations • 8 Oct 2021 • Luana Ruiz, Joshua Ainslie, Santiago Ontañón
Deep learning models generalize well to in-distribution data but struggle to generalize compositionally, i.e., to combine a set of learned primitives to solve more complex tasks.
no code implementations • 7 Oct 2021 • Juan Cervino, Luana Ruiz, Alejandro Ribeiro
Graph Neural Networks (GNN) rely on graph convolutions to learn features from network data.
no code implementations • 7 Jun 2021 • Zhiyang Wang, Luana Ruiz, Alejandro Ribeiro
The most important practical consequence of this analysis is to shed light on the behavior of graph filters and GNNs in large-scale graphs.
no code implementations • 7 Jun 2021 • Juan Cervino, Luana Ruiz, Alejandro Ribeiro
Graph neural networks (GNNs) use graph convolutions to exploit network invariances and learn meaningful feature representations from network data.
no code implementations • 3 Mar 2021 • Zhiyang Wang, Luana Ruiz, Alejandro Ribeiro
We further construct a manifold neural network architecture with these filters.
no code implementations • 27 Oct 2020 • Luana Ruiz, Fernando Gama, Alejandro Ribeiro, Elvin Isufi
In this work, we approach GCNNs from a state-space perspective revealing that the graph convolutional module is a minimalistic linear state-space model, in which the state update matrix is the graph shift operator.
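The state-space reading of the graph convolution can be made concrete: the filter output y = Σ_k h_k S^k x is computed by a linear recursion whose state update matrix is the shift operator S. A minimal sketch:

```python
import numpy as np

def graph_conv_state_space(S, x, h):
    """Graph convolution y = sum_k h[k] S^k x, computed as a linear
    state-space recursion z_{k+1} = S z_k with readout weights h."""
    z = x.copy()              # state z_0 = x
    y = h[0] * z
    for hk in h[1:]:
        z = S @ z             # state update: the shift operator propagates the state
        y = y + hk * z        # readout accumulates weighted states
    return y

# Small example: path graph on 4 nodes, three filter taps.
S = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
x = np.array([1.0, 0.0, 0.0, 0.0])
y = graph_conv_state_space(S, x, h=[0.5, 0.25, 0.125])
```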
no code implementations • 23 Oct 2020 • Luana Ruiz, Zhiyang Wang, Alejandro Ribeiro
We then extend this analysis by interpreting the graphon neural network as a generating model for GNNs on deterministic and stochastic graphs instantiated from the original and perturbed graphons.
1 code implementation • 14 Sep 2020 • Bianca Iancu, Luana Ruiz, Alejandro Ribeiro, Elvin Isufi
Activation functions are crucial in graph neural networks (GNNs) as they allow defining a nonlinear family of functions to capture the relationship between the input graph data and their representations.
no code implementations • 4 Aug 2020 • Luana Ruiz, Fernando Gama, Alejandro Ribeiro
They are presented here as generalizations of convolutional neural networks (CNNs) in which individual layers contain banks of graph convolutional filters instead of banks of classical convolutional filters.
no code implementations • NeurIPS 2020 • Luana Ruiz, Luiz F. O. Chamon, Alejandro Ribeiro
These graph convolutions combine information from adjacent nodes using coefficients that are shared across all nodes.
no code implementations • 10 Mar 2020 • Luana Ruiz, Luiz F. O. Chamon, Alejandro Ribeiro
Graphons are infinite-dimensional objects that represent the limit of convergent sequences of graphs as their number of nodes goes to infinity.
no code implementations • 3 Mar 2020 • Alejandro Parada-Mayorga, Luana Ruiz, Alejandro Ribeiro
In this work, we propose a new strategy for pooling and sampling on GNNs using graphons which preserves the spectral properties of the graph.
1 code implementation • 3 Feb 2020 • Luana Ruiz, Fernando Gama, Alejandro Ribeiro
Graph processes exhibit a temporal structure determined by the sequence index and a spatial structure determined by the graph support.
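The two structures can be illustrated with a simple linear diffusion on a graph (an illustrative process, not the specific model studied in the paper): the time index t carries the temporal structure, the shift operator S the spatial one.

```python
import numpy as np

def simulate_graph_process(S, x0, T, sigma, rng):
    """Simulate x_t = S x_{t-1} + w_t for T steps; w_t is i.i.d. Gaussian noise."""
    X = [x0]
    for _ in range(T):
        X.append(S @ X[-1] + sigma * rng.standard_normal(len(x0)))
    return np.stack(X)          # shape (T + 1, n): time along axis 0, nodes along axis 1

# Scaled path graph on 3 nodes; the scaling keeps the dynamics stable.
S = 0.5 * np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
rng = np.random.default_rng(4)
traj = simulate_graph_process(S, x0=np.ones(3), T=10, sigma=0.1, rng=rng)
```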
no code implementations • 29 Mar 2019 • Luana Ruiz, Fernando Gama, Antonio G. Marques, Alejandro Ribeiro
Graph neural networks (GNNs) are information processing architectures tailored to these graph signals and made of stacked layers that compose graph convolutional filters with nonlinear activation functions.
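The layered structure described above, graph convolutional filters composed with pointwise nonlinearities, can be sketched as follows (a generic single-shift GNN layer, with ReLU as an illustrative nonlinearity):

```python
import numpy as np

def gnn_layer(S, X, H):
    """One GNN layer: relu( sum_k S^k X H[k] ).

    S: (n, n) graph shift operator; X: (n, f_in) node features;
    H: list of (f_in, f_out) filter-tap matrices, shared across all nodes.
    """
    Z = X
    Y = Z @ H[0]
    for Hk in H[1:]:
        Z = S @ Z               # diffuse features one more hop
        Y = Y + Z @ Hk          # mix diffused features with shared taps
    return np.maximum(Y, 0.0)   # pointwise nonlinearity (ReLU)

# Two stacked layers on a 3-node complete graph with 2 input features.
S = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], dtype=float)
rng = np.random.default_rng(3)
X = rng.standard_normal((3, 2))
H1 = [rng.standard_normal((2, 4)) for _ in range(2)]
H2 = [rng.standard_normal((4, 1)) for _ in range(2)]
out = gnn_layer(S, gnn_layer(S, X, H1), H2)
```

Because the filter taps are scalar-per-feature matrices rather than node-indexed weights, the same layer applies unchanged to graphs of any size.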
1 code implementation • 5 Mar 2019 • Luana Ruiz, Fernando Gama, Alejandro Ribeiro
Graph processes model a number of important problems such as identifying the epicenter of an earthquake or predicting weather.
Ranked #11 on Node Classification on CiteSeer (0.5%)
no code implementations • 29 Oct 2018 • Luana Ruiz, Fernando Gama, Antonio G. Marques, Alejandro Ribeiro
Graph neural networks (GNNs) have been shown to replicate convolutional neural networks' (CNNs) superior performance in many problems involving graphs.