no code implementations • 7 Feb 2024 • Pedro Vianna, Muawiz Chaudhary, Paria Mehrbod, An Tang, Guy Cloutier, Guy Wolf, Michael Eickenberg, Eugene Belilovsky
However, in many practical applications this technique is vulnerable to label distribution shifts, sometimes producing catastrophic failure.
no code implementations • 6 Feb 2024 • Chenqing Hua, Connor Coley, Guy Wolf, Doina Precup, Shuangjia Zheng
Protein-protein interactions (PPIs) are crucial in regulating numerous cellular functions, including signal transduction, transportation, and immune defense.
no code implementations • 4 Dec 2023 • Danqi Liao, Chen Liu, Benjamin W. Christensen, Alexander Tong, Guillaume Huguet, Guy Wolf, Maximilian Nickel, Ian Adelstein, Smita Krishnaswamy
Entropy and mutual information in neural networks provide rich information on the learning process, but they have proven difficult to compute reliably in high dimensions.
no code implementations • 1 Dec 2023 • Sacha Morin, Somjit Nath, Samira Ebrahimi Kahou, Guy Wolf
This work is concerned with the temporal contrastive learning (TCL) setting, where the sequential structure of the data is instead used to define positive pairs; this setting is more common in RL and robotics contexts.
1 code implementation • 6 Oct 2023 • Dominique Beaini, Shenyang Huang, Joao Alex Cunha, Zhiyi Li, Gabriela Moisescu-Pareja, Oleksandr Dymov, Samuel Maddrell-Mander, Callum McLean, Frederik Wenkel, Luis Müller, Jama Hussein Mohamud, Ali Parviz, Michael Craig, Michał Koziarski, Jiarui Lu, Zhaocheng Zhu, Cristian Gabellini, Kerstin Klaser, Josef Dean, Cas Wognum, Maciej Sypetkowski, Guillaume Rabusseau, Reihaneh Rabbany, Jian Tang, Christopher Morris, Ioannis Koutis, Mirco Ravanelli, Guy Wolf, Prudencio Tossou, Hadrien Mary, Therence Bois, Andrew Fitzgibbon, Błażej Banaszewski, Chad Martin, Dominic Masters
Recently, pre-trained foundation models have enabled significant advancements in multiple fields.
no code implementations • 18 Sep 2023 • Dhananjay Bhaskar, Yanlei Zhang, Charles Xu, Xingzhi Sun, Oluwadamilola Fasina, Guy Wolf, Maximilian Nickel, Michael Perlmutter, Smita Krishnaswamy
In this paper, we propose Graph Differential Equation Network (GDeNet), an approach that harnesses the expressive power of solutions to PDEs on a graph to obtain continuous node- and graph-level representations for various downstream tasks.
1 code implementation • 14 Jul 2023 • Renming Liu, Semih Cantürk, Olivier Lapointe-Gagné, Vincent Létourneau, Guy Wolf, Dominique Beaini, Ladislav Rampášek
Positional and structural encodings (PSE) enable better identifiability of nodes within a graph, since graphs in general lack a canonical node ordering.
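As one concrete, simplified instance of a PSE (illustrative only, not necessarily the encoding studied in this paper), Laplacian eigenvector positional encodings can be sketched as follows; the `laplacian_pe` helper and the 4-cycle graph are hypothetical examples:

```python
import numpy as np

def laplacian_pe(adj, k):
    """Use the k smallest non-trivial eigenvectors of the graph
    Laplacian as positional encodings (a common PSE choice)."""
    deg = adj.sum(axis=1)
    lap = np.diag(deg) - adj                # combinatorial Laplacian L = D - A
    eigvals, eigvecs = np.linalg.eigh(lap)  # eigenvalues in ascending order
    return eigvecs[:, 1:k + 1]              # skip the constant eigenvector

# 4-cycle: every node receives a k-dimensional position
adj = np.array([[0, 1, 0, 1],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [1, 0, 1, 0]], dtype=float)
pe = laplacian_pe(adj, k=2)
print(pe.shape)  # (4, 2)
```

Such eigenvector-based encodings give each node coordinates that reflect its structural position, partially compensating for the missing canonical ordering.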
1 code implementation • 7 Jul 2023 • Alexander Tong, Nikolay Malkin, Kilian Fatras, Lazar Atanackovic, Yanlei Zhang, Guillaume Huguet, Guy Wolf, Yoshua Bengio
We present simulation-free score and flow matching ([SF]$^2$M), a simulation-free objective for inferring stochastic dynamics given unpaired samples drawn from arbitrary source and target distributions.
no code implementations • 13 Jun 2023 • Dhananjay Bhaskar, Sumner Magruder, Edward De Brouwer, Aarthi Venkat, Frederik Wenkel, Guy Wolf, Smita Krishnaswamy
Complex systems are characterized by intricate interactions between entities that evolve dynamically over time.
no code implementations • 5 Jun 2023 • Samuel Leone, Aarthi Venkat, Guillaume Huguet, Alexander Tong, Guy Wolf, Smita Krishnaswamy
GFMMD is defined via an optimal witness function that is both smooth on the graph and maximizes the difference in expectation between the pair of distributions on the graph.
1 code implementation • 1 Jun 2023 • Oluwadamilola Fasina, Guillaume Huguet, Alexander Tong, Yanlei Zhang, Guy Wolf, Maximilian Nickel, Ian Adelstein, Smita Krishnaswamy
Although data diffusion embeddings are ubiquitous in unsupervised learning and have proven to be a viable technique for uncovering the underlying intrinsic geometry of data, they are inherently limited by their discrete nature.
no code implementations • International Conference on Machine Learning Workshop on TAGML 2023 • Danqi Liao*, Chen Liu*, Alexander Tong, Guillaume Huguet, Guy Wolf, Maximilian Nickel, Ian Adelstein, Smita Krishnaswamy
We also see that there is an increase in DSMI with the class label over time.
1 code implementation • NeurIPS 2023 • Guillaume Huguet, Alexander Tong, Edward De Brouwer, Yanlei Zhang, Guy Wolf, Ian Adelstein, Smita Krishnaswamy
Finally, we show that parameters of our more general method can be configured to give results similar to PHATE (a state-of-the-art diffusion based manifold learning method) as well as SNE (an attraction/repulsion neighborhood based method that forms the basis of t-SNE).
2 code implementations • 1 Feb 2023 • Alexander Tong, Kilian Fatras, Nikolay Malkin, Guillaume Huguet, Yanlei Zhang, Jarrid Rector-Brooks, Guy Wolf, Yoshua Bengio
CFM features a stable regression objective like that used to train the stochastic flow in diffusion models but enjoys the efficient inference of deterministic flow models.
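The regression objective can be sketched with a straight-line conditional path (a simplified, hypothetical setup; the paper covers more general probability paths, and `cfm_pair` is an illustrative helper, not the authors' code):

```python
import numpy as np

rng = np.random.default_rng(0)

def cfm_pair(x0, x1, t):
    """For a straight-line conditional path x_t = (1 - t) x0 + t x1,
    the regression target for the learned vector field at (x_t, t)
    is simply x1 - x0."""
    x_t = (1 - t)[:, None] * x0 + t[:, None] * x1
    return x_t, x1 - x0

x0 = rng.normal(size=(8, 2))   # samples from the source distribution
x1 = rng.normal(size=(8, 2))   # samples from the target distribution
t = rng.uniform(size=8)        # random times in [0, 1]
x_t, u = cfm_pair(x0, x1, t)
# a network v(x_t, t) would be trained to minimize ||v(x_t, t) - u||^2
print(x_t.shape, u.shape)
```

Because the target `u` is a fixed function of the sampled pair, the loss is an ordinary regression, which is the source of the training stability noted above.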
no code implementations • 2 Nov 2022 • Guillaume Huguet, Alexander Tong, María Ramos Zapatero, Christopher J. Tape, Guy Wolf, Smita Krishnaswamy
Efficient computation of optimal transport distance between distributions is of growing importance in data science.
no code implementations • 28 Oct 2022 • MohammadReza Davari, Stefan Horoi, Amine Natik, Guillaume Lajoie, Guy Wolf, Eugene Belilovsky
Comparing learned neural representations in neural networks is a challenging but important problem, which has been approached in different ways.
no code implementations • 23 Oct 2022 • Andres F. Duque, Myriam Lizotte, Guy Wolf, Kevin R. Moon
With this in mind, we present a novel manifold alignment method called MALI (Manifold alignment with label information) that learns a correspondence between two distinct domains.
no code implementations • 15 Aug 2022 • Alexander Tong, Frederik Wenkel, Dhananjay Bhaskar, Kincaid MacDonald, Jackson Grady, Michael Perlmutter, Smita Krishnaswamy, Guy Wolf
We propose a new graph neural network (GNN) module, based on relaxations of recently proposed geometric scattering transforms, which consist of a cascade of graph wavelet filters.
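The underlying (non-learned) wavelet cascade can be sketched as follows, assuming lazy random-walk diffusion wavelets; this is a minimal illustration of geometric scattering, not the relaxed, learnable module proposed in the paper:

```python
import numpy as np

def scattering_features(adj, x, scales=(1, 2, 4)):
    """First-order geometric scattering: apply diffusion wavelets
    Psi = P^s - P^(2s) to a node signal, take absolute values, and
    aggregate over nodes."""
    deg = adj.sum(axis=1)
    P = 0.5 * (np.eye(len(adj)) + adj / deg[:, None])  # lazy random walk
    feats = [np.abs(x).mean()]                          # zeroth-order term
    for s in scales:
        psi = np.linalg.matrix_power(P, s) - np.linalg.matrix_power(P, 2 * s)
        feats.append(np.abs(psi @ x).mean())            # |Psi x| aggregated
    return np.array(feats)

# a delta signal on a path graph 0-1-2-3
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
feats = scattering_features(adj, np.array([1.0, 0.0, 0.0, 0.0]))
print(feats.shape)  # (4,)
```

Each wavelet captures signal variation at a different diffusion scale; the relaxations in the paper make the scales and aggregations learnable.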
no code implementations • 29 Jun 2022 • Guillaume Huguet, D. S. Magruder, Alexander Tong, Oluwadamilola Fasina, Manik Kuchroo, Guy Wolf, Smita Krishnaswamy
In GAE, the latent-space distance between points is regularized to match a novel multiscale geodesic distance on the data manifold that we define.
2 code implementations • 16 Jun 2022 • Vijay Prakash Dwivedi, Ladislav Rampášek, Mikhail Galkin, Ali Parviz, Guy Wolf, Anh Tuan Luu, Dominique Beaini
Graph Neural Networks (GNNs) that are based on the message passing (MP) paradigm generally exchange information between 1-hop neighbors to build node representations at each layer.
Ranked #3 on Link Prediction on PCQM-Contact
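The 1-hop exchange described above can be sketched as a single message-passing step (a simplified illustration without learned weights or nonlinearities; real MP-GNNs add both):

```python
import numpy as np

def mp_layer(adj, h):
    """One message-passing step: each node aggregates (mean) the
    features of its 1-hop neighbors and combines them with its own state."""
    deg = adj.sum(axis=1, keepdims=True)
    msg = adj @ h / np.maximum(deg, 1)   # mean over 1-hop neighbors
    return 0.5 * (h + msg)               # simple self/neighbor combine

# path graph 0-1-2-3 with one-hot node features
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
h1 = mp_layer(adj, np.eye(4))
print(h1[0, 3])  # 0.0: node 3's feature has not reached node 0 yet
```

After k such layers, information has traveled at most k hops, which is exactly the long-range limitation a benchmark of this kind is designed to probe.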
no code implementations • 15 Jun 2022 • Andres F. Duque, Guy Wolf, Kevin R. Moon
The integration of multimodal data presents a challenge when the study of a given phenomenon by different instruments or conditions generates distinct but related domains.
1 code implementation • 15 Jun 2022 • Renming Liu, Semih Cantürk, Frederik Wenkel, Sarah McGuire, Xinyi Wang, Anna Little, Leslie O'Bray, Michael Perlmutter, Bastian Rieck, Matthew Hirn, Guy Wolf, Ladislav Rampášek
Graph Neural Networks (GNNs) extend the success of neural networks to graph-structured data by accounting for their intrinsic geometry.
1 code implementation • 3 Jun 2022 • Yimeng Min, Frederik Wenkel, Michael Perlmutter, Guy Wolf
We propose a geometric scattering-based graph neural network (GNN) for approximating solutions of the NP-hard maximum clique (MC) problem.
3 code implementations • 25 May 2022 • Ladislav Rampášek, Mikhail Galkin, Vijay Prakash Dwivedi, Anh Tuan Luu, Guy Wolf, Dominique Beaini
We propose a recipe for building a general, powerful, scalable (GPS) graph Transformer with linear complexity and state-of-the-art results on a diverse set of benchmarks.
Ranked #1 on Graph Property Prediction on ogbg-ppa
no code implementations • 28 Mar 2022 • Guillaume Huguet, Alexander Tong, Bastian Rieck, Jessie Huang, Manik Kuchroo, Matthew Hirn, Guy Wolf, Smita Krishnaswamy
From a geometric perspective, we obtain convergence bounds based on the smallest transition probability and the radius of the data, whereas from a spectral perspective, our bounds are based on the eigenspectrum of the diffusion kernel.
no code implementations • 22 Jan 2022 • Frederik Wenkel, Yimeng Min, Matthew Hirn, Michael Perlmutter, Guy Wolf
We further introduce an attention framework that allows the model to locally attend over combined information from different filters at the node level.
no code implementations • 22 Dec 2021 • Jessie Huang, Erica L. Busch, Tom Wallenstein, Michal Gerasimiuk, Andrew Benz, Guillaume Lajoie, Guy Wolf, Nicholas B. Turk-Browne, Smita Krishnaswamy
In order to understand the connection between stimuli of interest and brain activity, and to analyze differences and commonalities between subjects, it becomes important to learn a meaningful embedding of the data that denoises it and reveals its intrinsic structure.
1 code implementation • 19 Nov 2021 • Michal Gerasimiuk, Dennis Shung, Alexander Tong, Adrian Stanley, Michael Schultz, Jeffrey Ngu, Loren Laine, Guy Wolf, Smita Krishnaswamy
In particular, in EHR data, some variables are missing not at random (MNAR), i.e., deliberately not collected, and thus are a source of information.
no code implementations • 7 Nov 2021 • Guy Wolf, Gil Shabat, Hanan Shteingart
Positivity is one of the three conditions for causal inference from observational data.
no code implementations • 27 Oct 2021 • Renming Liu, Semih Cantürk, Frederik Wenkel, Dylan Sandfelder, Devin Kreuzer, Anna Little, Sarah McGuire, Leslie O'Bray, Michael Perlmutter, Bastian Rieck, Matthew Hirn, Guy Wolf, Ladislav Rampášek
Graph neural networks (GNNs) have attracted much attention due to their ability to leverage the intrinsic geometries of the underlying data.
no code implementations • 29 Sep 2021 • Shanel Gauthier, Benjamin Thérien, Laurent Alsène-Racicot, Muawiz Sajjad Chaudhary, Irina Rish, Eugene Belilovsky, Michael Eickenberg, Guy Wolf
The wavelet filters used in the scattering transform are typically selected to create a tight frame via a parameterized mother wavelet.
no code implementations • 26 Jul 2021 • Alexander Tong, Guillaume Huguet, Dennis Shung, Amine Natik, Manik Kuchroo, Guillaume Lajoie, Guy Wolf, Smita Krishnaswamy
We propose to compare and organize such datasets of graph signals by using an earth mover's distance (EMD) with a geodesic cost over the underlying graph.
1 code implementation • CVPR 2022 • Shanel Gauthier, Benjamin Thérien, Laurent Alsène-Racicot, Muawiz Chaudhary, Irina Rish, Eugene Belilovsky, Michael Eickenberg, Guy Wolf
The wavelet scattering transform creates geometric invariants and deformation stability.
1 code implementation • 15 Jul 2021 • Ladislav Rampášek, Guy Wolf
Graph neural networks (GNNs) based on message passing between neighboring nodes are known to be insufficient for capturing long-range interactions in graphs.
1 code implementation • 25 Feb 2021 • Alexander Tong, Guillaume Huguet, Amine Natik, Kincaid MacDonald, Manik Kuchroo, Ronald Coifman, Guy Wolf, Smita Krishnaswamy
Here, Diffusion EMD can derive distances between patients on the manifold of cells at least two orders of magnitude faster than equally accurate methods.
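The speed comes from replacing an optimal transport solve with graph diffusion. A loose sketch of the multiscale idea (not the paper's exact dyadic construction; the weights and helper below are illustrative assumptions):

```python
import numpy as np

def diffusion_emd(adj, mu, nu, max_scale=4):
    """Approximate EMD between two distributions on a graph by summing
    weighted L1 differences of their diffusions at increasing scales."""
    deg = adj.sum(axis=1)
    P = 0.5 * (np.eye(len(adj)) + adj / deg[:, None])  # lazy random walk
    a, b = mu.astype(float), nu.astype(float)
    dist = 0.0
    for k in range(max_scale):
        a, b = a @ P, b @ P                 # one more diffusion step
        dist += 2.0 ** (-k) * np.abs(a - b).sum()
    return dist

# delta distributions on a path graph 0-1-2-3:
# nearby nodes should yield a smaller distance than distant ones
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
d = np.eye(4)
print(diffusion_emd(adj, d[0], d[1]) < diffusion_emd(adj, d[0], d[3]))  # True
```

Each distance evaluation costs only a few sparse matrix-vector products, which is why this style of approximation scales to comparing many distributions (e.g., many patients) at once.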
no code implementations • 12 Feb 2021 • Manik Kuchroo, Abhinav Godavarthi, Alexander Tong, Guy Wolf, Smita Krishnaswamy
We propose a method called integrated diffusion for combining multimodal datasets, or data gathered via several different measurements on the same system, to create a joint data diffusion operator.
no code implementations • 31 Jan 2021 • Stefan Horoi, Jessie Huang, Bastian Rieck, Guillaume Lajoie, Guy Wolf, Smita Krishnaswamy
This suggests that qualitative and quantitative examination of the loss landscape geometry could yield insights about neural network generalization performance during training.
no code implementations • 1 Jan 2021 • Yewen Wang, Jian Tang, Yizhou Sun, Guy Wolf
We empirically analyse our proposed DGL-GNN model, and demonstrate its effectiveness and superior efficiency through a range of experiments.
1 code implementation • 28 Oct 2020 • Yimeng Min, Frederik Wenkel, Guy Wolf
Geometric scattering has recently gained recognition in graph representation learning, and recent work has shown that integrating scattering features in graph convolution networks (GCNs) can alleviate the typical oversmoothing of features in node representation learning.
no code implementations • 6 Oct 2020 • Alexander Tong, Frederik Wenkel, Kincaid MacDonald, Smita Krishnaswamy, Guy Wolf
We propose a new graph neural network (GNN) module, based on relaxations of recently proposed geometric scattering transforms, which consist of a cascade of graph wavelet filters.
no code implementations • 14 Jul 2020 • Andrés F. Duque, Sacha Morin, Guy Wolf, Kevin R. Moon
Our regularization is based on the diffusion potential distances from the recently proposed PHATE visualization method. It encourages the learned latent representation to follow the intrinsic data geometry, similar to manifold learning algorithms, while still enabling faithful extension to new data and reconstruction of data in the original feature space from latent coordinates.
no code implementations • 22 Jun 2020 • Victor Geadah, Giancarlo Kerg, Stefan Horoi, Guy Wolf, Guillaume Lajoie
Dynamic adaptation in single-neuron response plays a fundamental role in neural coding in biological neural networks.
no code implementations • 15 Jun 2020 • Jake S. Rhodes, Adele Cutler, Guy Wolf, Kevin R. Moon
We show, both qualitatively and quantitatively, the advantages of our approach in retaining local and global structures in data, while emphasizing important variables in the low-dimensional embedding.
1 code implementation • NeurIPS 2020 • Bastian Rieck, Tristan Yates, Christian Bock, Karsten Borgwardt, Guy Wolf, Nicholas Turk-Browne, Smita Krishnaswamy
We observe significant differences in both brain state trajectories and overall topological activity between adults and children watching the same movie.
2 code implementations • 12 Jun 2020 • Egbert Castro, Andrew Benz, Alexander Tong, Guy Wolf, Smita Krishnaswamy
We propose a geometric scattering autoencoder (GSAE) network for learning such graph embeddings.
1 code implementation • NeurIPS 2020 • Yimeng Min, Frederik Wenkel, Guy Wolf
Graph convolutional networks (GCNs) have shown promising results in processing graph data by extracting structure-aware features.
1 code implementation • 17 Feb 2020 • Mostafa ElAraby, Guy Wolf, Margarida Carvalho
We introduce a mixed integer program (MIP) for assigning importance scores to each neuron in deep neural network architectures which is guided by the impact of their simultaneous pruning on the main learning task of the network.
2 code implementations • ICML 2020 • Alexander Tong, Jessie Huang, Guy Wolf, David van Dijk, Smita Krishnaswamy
To address this issue, we establish a link between continuous normalizing flows and dynamic optimal transport, which allows us to model the expected paths of points over time.
no code implementations • 9 Jan 2020 • Stefan Horoi, Guillaume Lajoie, Guy Wolf
The efficiency of recurrent neural networks (RNNs) in dealing with sequential data has long been established.
1 code implementation • 14 Nov 2019 • Michael Perlmutter, Alexander Tong, Feng Gao, Guy Wolf, Matthew Hirn
As a result, the proposed construction unifies and extends known theoretical results for many of the existing graph scattering architectures.
no code implementations • 25 Sep 2019 • Matthew Amodio, David van Dijk, Ruth Montgomery, Guy Wolf, Smita Krishnaswamy
While generative neural networks can learn to transform a specific input dataset into a specific target dataset, they require exactly such a paired set of input/output datasets.
1 code implementation • 10 Jul 2019 • Nathan Brugnone, Alex Gonopolskiy, Mark W. Moyle, Manik Kuchroo, David van Dijk, Kevin R. Moon, Daniel Colon-Ramos, Guy Wolf, Matthew J. Hirn, Smita Krishnaswamy
Here, we consider multiple levels of abstraction via a multiresolution geometry of data points at different granularities.
no code implementations • 25 Jun 2019 • Andrés F. Duque, Guy Wolf, Kevin R. Moon
Manifold learning techniques for dynamical systems and time series have shown their utility for a broad spectrum of applications in recent years.
no code implementations • 26 May 2019 • Alexander Tong, Guy Wolf, Smita Krishnaswamy
We show that this procedure successfully detects unseen anomalies with guarantees on those that have a certain Wasserstein distance from the data or corrupted training set.
no code implementations • 24 May 2019 • Michael Perlmutter, Feng Gao, Guy Wolf, Matthew Hirn
The Euclidean scattering transform was introduced nearly a decade ago to improve the mathematical understanding of convolutional neural networks.
no code implementations • 17 May 2019 • Aude Forcione-Lambert, Guy Wolf, Guillaume Lajoie
We investigate the learned dynamical landscape of a recurrent neural network solving a simple task requiring the interaction of two memory mechanisms: long- and short-term.
no code implementations • ICLR 2019 • Alexander Tong, David van Dijk, Jay Stanley, Guy Wolf, Smita Krishnaswamy
First, we show on a synthetic example that the graph-structured layer can reveal topological features of the data.
no code implementations • ICLR 2019 • Feng Gao, Guy Wolf, Matthew Hirn
Furthermore, ConvNets inspired recent advances in geometric deep learning, which aim to generalize these networks to graph data by applying notions from graph signal processing to learn deep graph filter cascades.
no code implementations • ICLR 2019 • Jay S. Stanley III, Guy Wolf, Smita Krishnaswamy
We leverage this assumption to estimate relations between intrinsic manifold dimensions, which are given by diffusion map coordinates over each of the datasets.
no code implementations • ICLR Workshop LLD 2019 • Daniel B. Burkhardt, Jay S. Stanley III, Ana Luisa Perdigoto, Scott A. Gigante, Kevan C. Herold, Guy Wolf, Antonio J. Giraldez, David van Dijk, Smita Krishnaswamy
Single-cell RNA-sequencing (scRNA-seq) is a powerful tool for analyzing biological systems.
no code implementations • 31 Jan 2019 • Scott Gigante, Jay S. Stanley III, Ngan Vu, David van Dijk, Kevin Moon, Guy Wolf, Smita Krishnaswamy
Diffusion maps are a commonly used kernel-based method for manifold learning, which can reveal intrinsic structures in data and embed them in low dimensions.
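A minimal diffusion maps sketch (Gaussian kernel, plain Markov normalization; real implementations add sparsity and anisotropic normalization, and `diffusion_map` is an illustrative helper):

```python
import numpy as np

def diffusion_map(X, n_components=2, eps=1.0, t=1):
    """Diffusion maps: Gaussian affinities, row (Markov) normalization,
    and leading non-trivial eigenvectors scaled by eigenvalues^t as
    embedding coordinates."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise sq. dists
    K = np.exp(-d2 / eps)
    P = K / K.sum(axis=1, keepdims=True)                  # diffusion operator
    eigvals, eigvecs = np.linalg.eig(P)
    order = np.argsort(-eigvals.real)                     # descending
    lam = eigvals.real[order]                             # real: P is similar
    phi = eigvecs.real[:, order]                          # to a symmetric matrix
    return (lam[1:n_components + 1] ** t) * phi[:, 1:n_components + 1]

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
emb = diffusion_map(X, n_components=2)
print(emb.shape)  # (20, 2)
```

The trivial top eigenvector (constant, eigenvalue 1) is skipped; the remaining coordinates approximate diffusion distances on the data manifold.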
1 code implementation • 25 Jan 2019 • David van Dijk, Daniel Burkhardt, Matthew Amodio, Alex Tong, Guy Wolf, Smita Krishnaswamy
Here, we propose a reformulation of the problem such that the goal is to learn a non-linear transformation of the data into a latent archetypal space.
no code implementations • 15 Dec 2018 • Michael Perlmutter, Guy Wolf, Matthew Hirn
The Euclidean scattering transform was introduced nearly a decade ago to improve the mathematical understanding of the success of convolutional neural networks (ConvNets) in image data analysis and other tasks.
no code implementations • NeurIPS 2018 • Ofir Lindenbaum, Jay Stanley, Guy Wolf, Smita Krishnaswamy
We propose a new type of generative model for high-dimensional data that learns a manifold geometry of the data, rather than density, and can generate points evenly along this manifold.
no code implementations • ICLR 2019 • Feng Gao, Guy Wolf, Matthew Hirn
We explore the generalization of scattering transforms from traditional (e. g., image or audio) signals to graph data, analogous to the generalization of ConvNets in geometric deep learning, and the utility of extracted graph features in graph data analysis.
1 code implementation • ICLR 2019 • Alexander Tong, David van Dijk, Jay S. Stanley III, Matthew Amodio, Kristina Yim, Rebecca Muhle, James Noonan, Guy Wolf, Smita Krishnaswamy
Taking inspiration from spatial organization and localization of neuron activations in biological networks, we use a graph Laplacian penalty to structure the activations within a layer.
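The penalty can be sketched as the Laplacian quadratic form on a layer's activations, which is small when activations vary smoothly over the chosen graph (a minimal sketch, not the authors' full training setup; the path graph is an illustrative choice):

```python
import numpy as np

def laplacian_penalty(adj, a):
    """Graph Laplacian quadratic form a^T L a, which equals the sum of
    (a_i - a_j)^2 over edges -- small when neighboring units on the
    graph have similar activations."""
    L = np.diag(adj.sum(axis=1)) - adj   # combinatorial Laplacian L = D - A
    return float(a @ L @ a)

# path graph over 4 units within a layer
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
print(laplacian_penalty(adj, np.array([1.0, 1.0, 1.0, 1.0])))    # 0.0  (smooth)
print(laplacian_penalty(adj, np.array([1.0, -1.0, 1.0, -1.0])))  # 12.0 (rough)
```

Adding this term to the training loss pushes activations toward spatially coherent patterns on the graph, mimicking localized activity in biological networks.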
no code implementations • 30 Sep 2018 • Jay S. Stanley III, Scott Gigante, Guy Wolf, Smita Krishnaswamy
We use this to relate the diffusion coordinates of each dataset through our assumption of partial feature correspondence.
no code implementations • 27 Sep 2018 • Scott Gigante, David van Dijk, Kevin R. Moon, Alexander Strzalkowski, Katie Ferguson, Guy Wolf, Smita Krishnaswamy
DyMoN is well-suited to the idiosyncrasies of biological data, including noise, sparsity, and the lack of longitudinal measurements in many types of systems.
1 code implementation • 14 Feb 2018 • Ofir Lindenbaum, Jay S. Stanley III, Guy Wolf, Smita Krishnaswamy
Then, it generates new points evenly along the manifold by pulling randomly generated points into its intrinsic structure using a diffusion kernel.
no code implementations • 10 Feb 2018 • Scott Gigante, David van Dijk, Kevin Moon, Alexander Strzalkowski, Guy Wolf, Smita Krishnaswamy
In order to model the dynamics of such systems given snapshot data, or local transitions, we present a deep neural network framework that we call Dynamics Modeling Network (DyMoN).
no code implementations • 19 Nov 2015 • Moshe Salhov, Amit Bermanis, Guy Wolf, Amir Averbuch
In this paper, we present a representation framework for data analysis of datasets that is based on a closed-form decomposition of the measure-based kernel.