1 code implementation • 4 Jun 2024 • Charlie B. Tan, Inés García-Redondo, Qiquan Wang, Michael M. Bronstein, Anthea Monod
Our work forms a basis for a deeper investigation of the causal relationships between fractal geometry, topological data analysis, and neural network optimization.
1 code implementation • 23 May 2024 • T. Konstantin Rusch, Nathan Kirk, Michael M. Bronstein, Christiane Lemieux, Daniela Rus
In fact, MPMC points are empirically shown to be optimal or near-optimal with respect to discrepancy in low dimensions and for small numbers of points, i.e., in the settings where the optimal discrepancy can be determined.
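As a hedged one-dimensional illustration of the quantity involved (our own sketch, not the MPMC model): the star discrepancy of a sorted point set in [0, 1] has a standard closed form, D*(P) = max_i max(i/n - x_i, x_i - (i-1)/n), and the centred regular grid attains the optimal value 1/(2n).

```python
# Sketch: 1-D star discrepancy via the closed form for sorted points.
def star_discrepancy_1d(points):
    xs = sorted(points)
    n = len(xs)
    # enumerate is 0-based, so x_(i+1) pairs with (i+1)/n and i/n.
    return max(max((i + 1) / n - x, x - i / n) for i, x in enumerate(xs))

n = 8
# Centred regular grid {1/2n, 3/2n, ...}: attains the optimum 1/(2n).
optimal = [(2 * i + 1) / (2 * n) for i in range(n)]
# All points clustered in [0, 1/2): a high-discrepancy set.
clustered = [i / (2 * n) for i in range(n)]

d_opt = star_discrepancy_1d(optimal)    # = 1/16
d_bad = star_discrepancy_1d(clustered)  # much larger
```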
1 code implementation • 22 May 2024 • Nian Liu, Xiaoxin He, Thomas Laurent, Francesco Di Giovanni, Michael M. Bronstein, Xavier Bresson
Spectral graph convolution, an important tool for data filtering on graphs, relies on two essential decisions: selecting spectral bases for signal transformation and parameterizing the kernel for frequency analysis.
no code implementations • 13 Feb 2024 • Chen Lin, Liheng Ma, Yiyang Chen, Wanli Ouyang, Michael M. Bronstein, Philip H. S. Torr
\textbf{Secondly}, we propose the {\em Continuous Unified Ricci Curvature} (\textbf{CURC}), an extension of the celebrated {\em Ollivier-Ricci Curvature} to directed and weighted graphs.
no code implementations • 6 Feb 2024 • Xingyue Huang, Miguel Romero Orth, Pablo Barceló, Michael M. Bronstein, İsmail İlkan Ceylan
Link prediction with knowledge graphs has been thoroughly studied in graph machine learning, leading to a rich landscape of graph neural network architectures with successful applications.
no code implementations • 9 Jan 2024 • Gbètondji J-S Dovonon, Michael M. Bronstein, Matt J. Kusner
We verify this empirically and observe that the sign of layer normalization weights can influence this effect.
no code implementations • 6 Jun 2023 • Francesco Di Giovanni, T. Konstantin Rusch, Michael M. Bronstein, Andreea Deac, Marc Lackenby, Siddhartha Mishra, Petar Veličković
In this paper, we provide a rigorous analysis to determine which function classes of node features can be learned by an MPNN of a given capacity.
no code implementations • 20 Mar 2023 • T. Konstantin Rusch, Michael M. Bronstein, Siddhartha Mishra
Node features of graph neural networks (GNNs) tend to become more similar with the increase of the network depth.
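The convergence of node features described above (oversmoothing) can be seen already in the linear core of many GNN layers. A minimal illustration, assuming simple mean aggregation over closed neighbourhoods (our sketch, not the paper's framework):

```python
# Repeated neighbourhood averaging drives node features toward a
# common value: the oversmoothing phenomenon.
def average_step(features, adj):
    """One round of mean aggregation over each node's closed neighbourhood."""
    new = []
    for i, _ in enumerate(features):
        neigh = [i] + adj[i]
        new.append(sum(features[j] for j in neigh) / len(neigh))
    return new

# Path graph 0-1-2-3 with scalar node features.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
x = [1.0, 0.0, 0.0, -1.0]
for _ in range(50):
    x = average_step(x, adj)

spread = max(x) - min(x)  # shrinks toward 0 as "depth" grows
```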
1 code implementation • 2 Oct 2022 • T. Konstantin Rusch, Benjamin P. Chamberlain, Michael W. Mahoney, Michael M. Bronstein, Siddhartha Mishra
We present Gradient Gating (G$^2$), a novel framework for improving the performance of Graph Neural Networks (GNNs).
Ranked #3 on Node Classification on arXiv-year
1 code implementation • 30 Sep 2022 • Benjamin Paul Chamberlain, Sergey Shirobokov, Emanuele Rossi, Fabrizio Frasca, Thomas Markovich, Nils Hammerla, Michael M. Bronstein, Max Hansmire
Our experiments show that BUDDY also outperforms SGNNs on standard LP benchmarks while being highly scalable and faster than ELPH.
2 code implementations • 22 Jun 2022 • Fabrizio Frasca, Beatrice Bevilacqua, Michael M. Bronstein, Haggai Maron
Subgraph GNNs are a recent class of expressive Graph Neural Networks (GNNs) which model graphs as collections of subgraphs.
2 code implementations • 22 Jun 2022 • Francesco Di Giovanni, James Rowbottom, Benjamin P. Chamberlain, Thomas Markovich, Michael M. Bronstein
We do so by showing that linear graph convolutions with symmetric weights minimize a multi-particle energy that generalizes the Dirichlet energy; in this setting, the weight matrices induce edge-wise attraction (repulsion) through their positive (negative) eigenvalues, thereby controlling whether the features are being smoothed or sharpened.
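The Dirichlet energy mentioned above has a simple concrete form: it sums squared feature differences across edges, and a gradient-descent (diffusion) step on it smooths the features. A hedged sketch of that baseline quantity (our illustration, not the paper's generalized multi-particle energy):

```python
# Graph Dirichlet energy and one smoothing (gradient-descent) step on it.
def dirichlet_energy(x, edges):
    """E(x) = sum over edges (x_u - x_v)^2."""
    return sum((x[u] - x[v]) ** 2 for u, v in edges)

edges = [(0, 1), (1, 2), (2, 3)]
x = [1.0, 0.0, 0.0, -1.0]

# Gradient of E at x, then one descent step with step size 0.1.
grad = [0.0] * len(x)
for u, v in edges:
    grad[u] += 2 * (x[u] - x[v])
    grad[v] += 2 * (x[v] - x[u])
x_smooth = [xi - 0.1 * g for xi, g in zip(x, grad)]

before = dirichlet_energy(x, edges)
after = dirichlet_energy(x_smooth, edges)  # strictly smaller: smoothing
```

Flipping the sign of the step would instead increase the energy, i.e., sharpen the features, mirroring the attraction/repulsion dichotomy described in the snippet.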
no code implementations • 16 Jun 2022 • Emanuele Rossi, Federico Monti, Yan Leng, Michael M. Bronstein, Xiaowen Dong
We adopt a transformer-like architecture which correctly accounts for the symmetries of the problem and learns a mapping from the equilibrium actions to the network structure of the game without explicit knowledge of the utility function.
1 code implementation • 30 Apr 2022 • Ahmed A. A. Elhag, Gabriele Corso, Hannes Stärk, Michael M. Bronstein
Traditional Graph Neural Networks (GNNs) rely on message passing, which amounts to permutation-invariant local aggregation of neighbour features.
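The permutation invariance of the local aggregation can be made concrete with a toy sum aggregator (a simplified sketch of our own, not the paper's architecture): reordering a node's neighbours leaves the aggregated message unchanged.

```python
# Sum aggregation over neighbour features: invariant to neighbour order.
def aggregate(neigh_features):
    """Coordinate-wise sum of the neighbour feature vectors."""
    dim = len(neigh_features[0])
    return [sum(f[d] for f in neigh_features) for d in range(dim)]

neighbours = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
shuffled = [neighbours[2], neighbours[0], neighbours[1]]

assert aggregate(neighbours) == aggregate(shuffled)  # order does not matter
```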
no code implementations • 1 Apr 2022 • Kamilia Mullakaeva, Luca Cosmo, Anees Kazi, Seyed-Ahmad Ahmadi, Nassir Navab, Michael M. Bronstein
In this work, we propose Graph-in-Graph (GiG), a neural network architecture for protein classification and brain imaging applications that exploits the graph representation of the input data samples and their latent relation.
1 code implementation • 9 Feb 2022 • Cristian Bodnar, Francesco Di Giovanni, Benjamin Paul Chamberlain, Pietro Liò, Michael M. Bronstein
In this paper, we use cellular sheaf theory to show that the underlying geometry of the graph is deeply linked with the performance of GNNs in heterophilic settings and their oversmoothing behaviour.
1 code implementation • 7 Feb 2022 • Paul Bertin, Jarrid Rector-Brooks, Deepak Sharma, Thomas Gaudelet, Andrew Anighoro, Torsten Gross, Francisco Martinez-Pena, Eileen L. Tang, Suraj M S, Cristian Regep, Jeremy Hayter, Maksym Korablyov, Nicholas Valiante, Almer van der Sloot, Mike Tyers, Charles Roberts, Michael M. Bronstein, Luke L. Lairson, Jake P. Taylor-King, Yoshua Bengio
For large libraries of small molecules, exhaustive combinatorial chemical screens become infeasible to perform when considering a range of disease models, assay conditions, and dose ranges.
1 code implementation • 4 Feb 2022 • T. Konstantin Rusch, Benjamin P. Chamberlain, James Rowbottom, Siddhartha Mishra, Michael M. Bronstein
This demonstrates that the proposed framework mitigates the oversmoothing problem.
3 code implementations • ICLR 2022 • Jake Topping, Francesco Di Giovanni, Benjamin Paul Chamberlain, Xiaowen Dong, Michael M. Bronstein
Most graph neural networks (GNNs) use the message passing paradigm, in which node features are propagated on the input graph.
Ranked #41 on Node Classification on Citeseer
1 code implementation • ICLR 2022 • Beatrice Bevilacqua, Fabrizio Frasca, Derek Lim, Balasubramaniam Srinivasan, Chen Cai, Gopinath Balamurugan, Michael M. Bronstein, Haggai Maron
Thus, we propose to represent each graph as a set of subgraphs derived by some predefined policy, and to process it using a suitable equivariant architecture.
no code implementations • 29 Sep 2021 • Emanuele Rossi, Federico Monti, Yan Leng, Michael M. Bronstein, Xiaowen Dong
Strategic interactions between a group of individuals or organisations can be modelled as games played on networks, where a player's payoff depends not only on their actions but also on those of their neighbors.
1 code implementation • NeurIPS 2021 • Giorgos Bouritsas, Andreas Loukas, Nikolaos Karalias, Michael M. Bronstein
Can we use machine learning to compress graph data?
1 code implementation • NeurIPS Workshop DLDE 2021 • Benjamin Paul Chamberlain, James Rowbottom, Maria Gorinova, Stefan Webb, Emanuele Rossi, Michael M. Bronstein
We present Graph Neural Diffusion (GRAND) that approaches deep learning on graphs as a continuous diffusion process and treats Graph Neural Networks (GNNs) as discretisations of an underlying PDE.
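The PDE viewpoint can be sketched in its simplest form (our own simplification, not the GRAND model itself): treat feature propagation as the graph heat equation dx/dt = -Lx and discretise it with explicit Euler steps, so each step plays the role of a linear "layer".

```python
# Graph heat diffusion discretised with explicit Euler steps.
def laplacian_matvec(x, edges):
    """Apply the combinatorial graph Laplacian L = D - A to a signal x."""
    out = [0.0] * len(x)
    for u, v in edges:
        out[u] += x[u] - x[v]
        out[v] += x[v] - x[u]
    return out

def euler_diffusion(x, edges, tau=0.1, steps=20):
    """Explicit Euler: x <- x - tau * L x, repeated `steps` times."""
    for _ in range(steps):
        lx = laplacian_matvec(x, edges)
        x = [xi - tau * li for xi, li in zip(x, lx)]
    return x

edges = [(0, 1), (1, 2), (2, 3)]
x = euler_diffusion([4.0, 0.0, 0.0, 0.0], edges)
# Total "heat" sum(x) is conserved; the signal spreads toward the mean.
```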
1 code implementation • CVPR 2021 • Freyr Sverrisson, Jean Feydy, Bruno E. Correia, Michael M. Bronstein
These results will considerably ease the deployment of deep learning methods in protein science and open the door for end-to-end differentiable approaches in protein modeling tasks such as function prediction and design.
6 code implementations • 27 Apr 2021 • Michael M. Bronstein, Joan Bruna, Taco Cohen, Petar Veličković
The last decade has witnessed an experimental revolution in data science and machine learning, epitomised by deep learning methods.
no code implementations • 17 Apr 2021 • Jacob Andreas, Gašper Beguš, Michael M. Bronstein, Roee Diamant, Denley Delaney, Shane Gero, Shafi Goldwasser, David F. Gruber, Sarah de Haas, Peter Malkin, Roger Payne, Giovanni Petri, Daniela Rus, Pratyusha Sharma, Dan Tchernov, Pernille Tønnesen, Antonio Torralba, Daniel Vogt, Robert J. Wood
We posit that machine learning will be the cornerstone of future collection, processing, and analysis of multimodal streams of data in animal communication studies, including bioacoustic, behavioral, biological, and environmental data.
no code implementations • 16 Dec 2020 • Mehdi Bahri, Eimear O'Sullivan, Shunwang Gong, Feng Liu, Xiaoming Liu, Michael M. Bronstein, Stefanos Zafeiriou
Compared to the previous state-of-the-art learning algorithms for non-rigid registration of face scans, SMF only requires the raw data to be rigidly aligned (with scaling) with a pre-defined face template.
no code implementations • 9 Dec 2020 • Thomas Gaudelet, Ben Day, Arian R. Jamasb, Jyothish Soman, Cristian Regep, Gertrude Liu, Jeremy B. R. Hayter, Richard Vickers, Charles Roberts, Jian Tang, David Roblin, Tom L. Blundell, Michael M. Bronstein, Jake P. Taylor-King
Graph Machine Learning (GML) is receiving growing interest within the pharmaceutical and biotechnology industries for its ability to model biomolecular structures and the functional relationships between them, and to integrate multi-omic datasets, amongst other data types.
no code implementations • 24 Sep 2020 • Benjamin P. Chamberlain, Emanuele Rossi, Dan Shiebler, Suvash Sedhain, Michael M. Bronstein
We show that applying constrained hyperparameter optimization using only a 10% sample of the data still yields a 91% average improvement in hit rate over the default parameters when applied to the full datasets.
2 code implementations • 16 Jun 2020 • Giorgos Bouritsas, Fabrizio Frasca, Stefanos Zafeiriou, Michael M. Bronstein
It has been shown that the expressive power of standard GNNs is bounded by the Weisfeiler-Leman (WL) graph isomorphism test, from which they inherit proven limitations such as the inability to detect and count graph substructures.
Ranked #2 on Graph Regression on ZINC 100k
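The Weisfeiler-Leman test that bounds standard GNN expressivity admits a compact sketch (a simplified version of our own, not the paper's substructure-counting method): iteratively refine node colours by hashing each node's colour together with the multiset of its neighbours' colours; graphs ending with different colour histograms are certainly non-isomorphic.

```python
# 1-WL colour refinement on adjacency-list graphs.
def wl_refine(adj, rounds=3):
    colours = {v: 0 for v in adj}  # uniform initial colouring
    for _ in range(rounds):
        # Signature = own colour + sorted multiset of neighbour colours.
        signatures = {
            v: (colours[v], tuple(sorted(colours[u] for u in adj[v])))
            for v in adj
        }
        # Relabel signatures with compact integer colours.
        palette = {s: i for i, s in enumerate(sorted(set(signatures.values())))}
        colours = {v: palette[signatures[v]] for v in adj}
    return sorted(colours.values())

# A triangle and a 3-node path get different colour histograms.
triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
path = {0: [1], 1: [0, 2], 2: [1]}
assert wl_refine(triangle) != wl_refine(path)
```

The converse fails: 1-WL assigns identical histograms to some non-isomorphic graphs (e.g., two triangles vs. a 6-cycle), which is exactly the limitation the snippet refers to.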
no code implementations • CVPR 2020 • Shunwang Gong, Mehdi Bahri, Michael M. Bronstein, Stefanos Zafeiriou
Graph convolution operators bring the advantages of deep learning to a variety of graph and mesh processing tasks previously deemed out of reach.
1 code implementation • 14 Sep 2019 • Fabrizio Frasca, Diego Galeano, Guadalupe Gonzalez, Ivan Laponogov, Kirill Veselkov, Alberto Paccanaro, Michael M. Bronstein
Here, we propose an interpretable model that learns disease self-representations for drug repositioning.
no code implementations • 30 Jul 2019 • Ron Levie, Wei Huang, Lorenzo Bucci, Michael M. Bronstein, Gitta Kutyniok
Transferability, which is a certain type of generalization capability, can be loosely defined as follows: if two graphs describe the same phenomenon, then a single filter or ConvNet should have similar repercussions on both graphs.
4 code implementations • 10 Feb 2019 • Federico Monti, Fabrizio Frasca, Davide Eynard, Damon Mannion, Michael M. Bronstein
One of the main reasons is that often the interpretation of the news requires the knowledge of political or social context or 'common sense', which current NLP algorithms are still missing.
1 code implementation • CVPR 2019 • Luca Cosmo, Mikhail Panine, Arianna Rampini, Maks Ovsjanikov, Michael M. Bronstein, Emanuele Rodolà
The question whether one can recover the shape of a geometric object from its Laplacian spectrum ('hear the shape of the drum') is a classical problem in spectral geometry with a broad range of implications and applications.
no code implementations • 19 Sep 2018 • Stefan C. Schonsheck, Michael M. Bronstein, Rongjie Lai
In this paper, we propose a variational model to align the Laplace-Beltrami (LB) eigensystems of two non-isometric genus zero shapes via conformal deformations.
1 code implementation • 17 Sep 2018 • Nicholas Choma, Federico Monti, Lisa Gerhardt, Tomasz Palczewski, Zahra Ronaghi, Prabhat, Wahid Bhimji, Michael M. Bronstein, Spencer R. Klein, Joan Bruna
Tasks involving the analysis of geometric (graph- and manifold-structured) data have recently gained prominence in the machine learning community, giving birth to a rapidly developing field of geometric deep learning.
no code implementations • 3 Jun 2018 • Federico Monti, Oleksandr Shchur, Aleksandar Bojchevski, Or Litany, Stephan Günnemann, Michael M. Bronstein
In recent years, there has been a surge of interest in developing deep learning methods for non-Euclidean structured data such as graphs.
1 code implementation • ICLR 2019 • Jan Svoboda, Jonathan Masci, Federico Monti, Michael M. Bronstein, Leonidas Guibas
Deep learning systems have become ubiquitous in many aspects of our lives.
no code implementations • 4 Feb 2018 • Federico Monti, Karl Otness, Michael M. Bronstein
Deep learning on graphs and in particular, graph convolutional neural networks, have recently attracted significant attention in the machine learning community.
21 code implementations • 24 Jan 2018 • Yue Wang, Yongbin Sun, Ziwei Liu, Sanjay E. Sarma, Michael M. Bronstein, Justin M. Solomon
Point clouds provide a flexible geometric representation suitable for countless applications in computer graphics; they also comprise the raw output of most 3D data acquisition devices.
Ranked #6 on Point Cloud Segmentation on PointCloud-C
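One building block behind dynamic graph CNNs on point clouds is the k-nearest-neighbour graph, recomputed in feature space at every layer. A hedged, brute-force sketch of that construction (our illustration, not the paper's implementation):

```python
# Brute-force k-NN graph over a small point set.
def knn_graph(points, k):
    """Return, for each point, the indices of its k nearest neighbours."""
    def dist2(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))
    edges = {}
    for i, p in enumerate(points):
        order = sorted((dist2(p, q), j) for j, q in enumerate(points) if j != i)
        edges[i] = [j for _, j in order[:k]]
    return edges

pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (5.0, 5.0)]
g = knn_graph(pts, k=2)
# Even the outlier (5, 5) connects to its 2 nearest points.
```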
no code implementations • ICLR 2018 • Ron Levie, Federico Monti, Xavier Bresson, Michael M. Bronstein
The rise of graph-structured data such as social networks, regulatory networks, citation graphs, and functional brain networks, in combination with resounding success of deep learning in various applications, has brought the interest in generalizing deep learning models to non-Euclidean domains.
1 code implementation • 11 Sep 2017 • Amit Boyarski, Alex M. Bronstein, Michael M. Bronstein
Multidimensional Scaling (MDS) is one of the most popular methods for dimensionality reduction and visualization of high dimensional data.
Computational Geometry
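Classical MDS has a particularly transparent special case (our own toy sketch, not the paper's method): double-centre the squared-distance matrix to obtain the Gram matrix B = -1/2 J D^2 J; for exactly one-dimensional data B has rank one, so coordinates can be read off a single column, up to translation and sign.

```python
# Classical MDS specialised to exactly 1-D data (rank-one Gram matrix).
def classical_mds_1d(D):
    n = len(D)
    D2 = [[D[i][j] ** 2 for j in range(n)] for i in range(n)]
    row = [sum(r) / n for r in D2]          # row means of D^2
    total = sum(row) / n                    # grand mean of D^2
    # Double centering: B_ij = -0.5 * (D2_ij - row_i - row_j + total).
    B = [[-0.5 * (D2[i][j] - row[i] - row[j] + total) for j in range(n)]
         for i in range(n)]
    # Rank-one B = x x^T, so x_i = B_i0 / sqrt(B_00) (up to sign).
    scale = B[0][0] ** 0.5
    return [B[i][0] / scale for i in range(n)]

# Points 0, 1, 3 on a line, recovered from pairwise distances alone.
D = [[0, 1, 3], [1, 0, 2], [3, 2, 0]]
coords = classical_mds_1d(D)
```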
2 code implementations • 22 May 2017 • Ron Levie, Federico Monti, Xavier Bresson, Michael M. Bronstein
The rise of graph-structured data such as social networks, regulatory networks, citation graphs, and functional brain networks, in combination with resounding success of deep learning in various applications, has brought the interest in generalizing deep learning models to non-Euclidean domains.
no code implementations • 4 May 2017 • Jan Svoboda, Federico Monti, Michael M. Bronstein
Performance of fingerprint recognition depends heavily on the extraction of minutiae points.
3 code implementations • ICCV 2017 • Or Litany, Tal Remez, Emanuele Rodolà, Alex M. Bronstein, Michael M. Bronstein
We introduce a new framework for learning dense correspondence between deformable 3D shapes.
2 code implementations • NeurIPS 2017 • Federico Monti, Michael M. Bronstein, Xavier Bresson
Matrix completion models are among the most common formulations of recommender systems.
Ranked #5 on Recommendation Systems on YahooMusic Monti (using extra training data)
4 code implementations • CVPR 2017 • Federico Monti, Davide Boscaini, Jonathan Masci, Emanuele Rodolà, Jan Svoboda, Michael M. Bronstein
Recently, there has been an increasing interest in geometric deep learning, attempting to generalize deep learning methods to non-Euclidean structured data such as graphs and manifolds, with a variety of applications in domains such as network analysis, computational social science, and computer graphics.
Ranked #4 on Document Classification on Cora
no code implementations • 24 Nov 2016 • Michael M. Bronstein, Joan Bruna, Yann LeCun, Arthur Szlam, Pierre Vandergheynst
In many applications, such geometric data are large and complex (in the case of social networks, on the scale of billions), and are natural targets for machine learning techniques.
no code implementations • NeurIPS 2016 • Davide Boscaini, Jonathan Masci, Emanuele Rodolà, Michael M. Bronstein
Establishing correspondence between shapes is a fundamental problem in geometry processing, arising in a wide variety of applications.
no code implementations • CVPR 2016 • Zorah Lähner, Emanuele Rodolà, Frank R. Schmidt, Michael M. Bronstein, Daniel Cremers
We propose the first algorithm for non-rigid 2D-to-3D shape matching, where the input is a 2D shape represented as a planar curve and a 3D shape represented as a surface; the output is a continuous curve on the surface.
1 code implementation • 17 Jun 2015 • Emanuele Rodolà, Luca Cosmo, Michael M. Bronstein, Andrea Torsello, Daniel Cremers
In this paper, we propose a method for computing partial functional correspondence between non-rigid shapes.
no code implementations • 26 Jan 2015 • Jonathan Masci, Davide Boscaini, Michael M. Bronstein, Pierre Vandergheynst
Feature descriptors play a crucial role in a wide range of geometry analysis and processing applications, including shape correspondence, retrieval, and segmentation.
no code implementations • CVPR 2015 • Artiom Kovnatsky, Michael M. Bronstein, Xavier Bresson, Pierre Vandergheynst
In this paper, we consider the problem of finding dense intrinsic correspondence between manifolds using the recently introduced functional framework.
no code implementations • 7 Jun 2014 • Davide Boscaini, Davide Eynard, Michael M. Bronstein
Shape-from-X is an important class of problems in the fields of geometry processing, computer graphics, and vision, attempting to recover the structure of a shape from some observations.
no code implementations • 19 Dec 2013 • Jonathan Masci, Alex M. Bronstein, Michael M. Bronstein, Pablo Sprechmann, Guillermo Sapiro
In recent years, a lot of attention has been devoted to efficient nearest neighbor search by means of similarity-preserving hashing.
no code implementations • 11 Dec 2013 • Michael M. Bronstein, Klaus Glashoff
In this paper, we introduce heat kernel coupling (HKC) as a method of constructing multimodal spectral geometry on weighted graphs of different size without vertex-wise bijective correspondence.
no code implementations • 1 Nov 2013 • Davide Eynard, Artiom Kovnatsky, Michael M. Bronstein
Mappings between color spaces are ubiquitous in image processing problems such as gamut mapping, decolorization, and image optimization for color-blind people.
no code implementations • 19 Jul 2013 • Michael M. Bronstein, Klaus Glashoff, Terry A. Loring
In this paper, we construct multimodal spectral geometry by finding a pair of closest commuting operators (CCO) to a given pair of Laplacians.
no code implementations • CIS 2010 • Alexander M. Bronstein, Michael M. Bronstein, Leonidas J. Guibas, Maks Ovsjanikov
Similarity-sensitive hashing seeks compact representation of vector data as binary codes, so that the Hamming distance between code words approximates the original similarity.
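The Hamming distance that such binary codes are trained to align with is simple to compute: it counts the bit positions in which two equal-length codes differ. A toy sketch with hypothetical 8-bit codes (our illustration, not the paper's learned hashing):

```python
# Hamming distance between equal-length binary codes.
def hamming(a, b):
    """Number of bit positions where the two codes differ."""
    return sum(x != y for x, y in zip(a, b))

# Hypothetical codes: similar items should share more bits.
code_a = [1, 0, 1, 1, 0, 0, 1, 0]
code_b = [1, 0, 1, 0, 0, 0, 1, 0]  # close to code_a
code_c = [0, 1, 0, 0, 1, 1, 0, 1]  # far from code_a

assert hamming(code_a, code_b) < hamming(code_a, code_c)
```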