no code implementations • 28 Mar 2022 • Guillaume Huguet, Alexander Tong, Bastian Rieck, Jessie Huang, Manik Kuchroo, Matthew Hirn, Guy Wolf, Smita Krishnaswamy

From a geometric perspective, we obtain convergence bounds based on the smallest transition probability and the radius of the data, whereas from a spectral perspective, our bounds are based on the eigenspectrum of the diffusion kernel.

no code implementations • 22 Jan 2022 • Frederik Wenkel, Yimeng Min, Matthew Hirn, Michael Perlmutter, Guy Wolf

However, current GNN models (and GCNs in particular) are known to be constrained by various phenomena that limit their expressive power and ability to generalize to more complex graph datasets.

no code implementations • 22 Dec 2021 • Jessie Huang, Erica L. Busch, Tom Wallenstein, Michal Gerasimiuk, Andrew Benz, Guillaume Lajoie, Guy Wolf, Nicholas B. Turk-Browne, Smita Krishnaswamy

In order to understand the connection between stimuli of interest and brain activity, and analyze differences and commonalities between subjects, it becomes important to learn a meaningful embedding that denoises the data and reveals its intrinsic structure.

1 code implementation • 19 Nov 2021 • Michal Gerasimiuk, Dennis Shung, Alexander Tong, Adrian Stanley, Michael Schultz, Jeffrey Ngu, Loren Laine, Guy Wolf, Smita Krishnaswamy

In particular, in EHR data, some variables are missing not at random (MNAR), i.e., deliberately not collected, and thus are a source of information.

no code implementations • 7 Nov 2021 • Guy Wolf, Gil Shabat, Hanan Shteingart

Positivity is one of the three conditions for causal inference from observational data.

no code implementations • 27 Oct 2021 • Renming Liu, Semih Cantürk, Frederik Wenkel, Dylan Sandfelder, Devin Kreuzer, Anna Little, Sarah McGuire, Leslie O'Bray, Michael Perlmutter, Bastian Rieck, Matthew Hirn, Guy Wolf, Ladislav Rampášek

Graph neural networks (GNNs) have attracted much attention due to their ability to leverage the intrinsic geometries of the underlying data.

no code implementations • 29 Sep 2021 • Shanel Gauthier, Benjamin Thérien, Laurent Alsène-Racicot, Muawiz Sajjad Chaudhary, Irina Rish, Eugene Belilovsky, Michael Eickenberg, Guy Wolf

The wavelet filters used in the scattering transform are typically selected to create a tight frame via a parameterized mother wavelet.

no code implementations • 26 Jul 2021 • Alexander Tong, Guillaume Huguet, Dennis Shung, Amine Natik, Manik Kuchroo, Guillaume Lajoie, Guy Wolf, Smita Krishnaswamy

We propose to compare and organize such datasets of graph signals by using an earth mover's distance (EMD) with a geodesic cost over the underlying graph.
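A rough illustration of the idea, not the paper's method or code: EMD with a geodesic ground cost is a transportation LP whose cost matrix holds shortest-path distances on the graph. The `graph_emd` helper and the toy path graph below are assumptions for this sketch.

```python
import numpy as np
from scipy.optimize import linprog
from scipy.sparse.csgraph import shortest_path

def graph_emd(A, mu, nu):
    """EMD between histograms mu, nu on a graph with adjacency A,
    using geodesic (shortest-path) distances as the ground cost."""
    C = shortest_path(A, directed=False)      # geodesic ground cost
    n = len(mu)
    # decision variable: flattened n x n transport plan T
    rows = []
    for i in range(n):                        # row marginals: T @ 1 = mu
        r = np.zeros((n, n)); r[i, :] = 1
        rows.append(r.ravel())
    for j in range(n):                        # column marginals: 1 @ T = nu
        c = np.zeros((n, n)); c[:, j] = 1
        rows.append(c.ravel())
    res = linprog(C.ravel(), A_eq=np.array(rows),
                  b_eq=np.concatenate([mu, nu]), bounds=(0, None))
    return res.fun
```

On a three-node path graph, moving unit mass from one end to the other costs the geodesic length of two hops.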

1 code implementation • 20 Jul 2021 • Shanel Gauthier, Benjamin Thérien, Laurent Alsène-Racicot, Muawiz Chaudhary, Irina Rish, Eugene Belilovsky, Michael Eickenberg, Guy Wolf

The wavelet scattering transform yields geometric invariants and provides stability to deformations.

1 code implementation • 15 Jul 2021 • Ladislav Rampášek, Guy Wolf

Graph neural networks (GNNs) based on message passing between neighboring nodes are known to be insufficient for capturing long-range interactions in graphs.

1 code implementation • 25 Feb 2021 • Alexander Tong, Guillaume Huguet, Amine Natik, Kincaid MacDonald, Manik Kuchroo, Ronald Coifman, Guy Wolf, Smita Krishnaswamy

Here, Diffusion EMD computes distances between patients on the manifold of cells at least two orders of magnitude faster than equally accurate methods.

no code implementations • 12 Feb 2021 • Manik Kuchroo, Abhinav Godavarthi, Alexander Tong, Guy Wolf, Smita Krishnaswamy

We propose a method called integrated diffusion for combining multimodal datasets, or data gathered via several different measurements on the same system, to create a joint data diffusion operator.
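The general shape of such a construction can be sketched as follows; the alternating-power combination rule below is an illustrative assumption, not the paper's exact construction. Each modality yields a Markov diffusion operator, and their powers are composed into a joint operator.

```python
import numpy as np

def diffusion_operator(X, eps=1.0):
    # Gaussian affinity kernel, row-normalized into a Markov matrix
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / eps)
    return K / K.sum(axis=1, keepdims=True)

def joint_operator(X1, X2, t=2, eps=1.0):
    # combine two modality-specific diffusion operators over the same
    # samples by alternating t-step powers (an illustrative choice)
    P1 = np.linalg.matrix_power(diffusion_operator(X1, eps), t)
    P2 = np.linalg.matrix_power(diffusion_operator(X2, eps), t)
    return P1 @ P2
```

The product of two row-stochastic matrices is row-stochastic, so the result is again a valid diffusion operator over the shared samples.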

no code implementations • 31 Jan 2021 • Stefan Horoi, Jessie Huang, Bastian Rieck, Guillaume Lajoie, Guy Wolf, Smita Krishnaswamy

This suggests that qualitative and quantitative examination of the loss landscape geometry could yield insights about neural network generalization performance during training.

no code implementations • 1 Jan 2021 • Yewen Wang, Jian Tang, Yizhou Sun, Guy Wolf

We empirically analyse our proposed DGL-GNN model, and demonstrate its effectiveness and superior efficiency through a range of experiments.

1 code implementation • 28 Oct 2020 • Yimeng Min, Frederik Wenkel, Guy Wolf

Geometric scattering has recently gained recognition in graph representation learning, and recent work has shown that integrating scattering features in graph convolution networks (GCNs) can alleviate the typical oversmoothing of features in node representation learning.

no code implementations • 6 Oct 2020 • Alexander Tong, Frederik Wenkel, Kincaid MacDonald, Smita Krishnaswamy, Guy Wolf

We propose a new graph neural network (GNN) module, based on relaxations of recently proposed geometric scattering transforms, which consist of a cascade of graph wavelet filters.
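A cascade of this kind can be sketched with lazy random walk diffusion wavelets, Psi_0 = I - P and Psi_j = P^(2^(j-1)) - P^(2^j); the row-normalized walk and the specific statistical moments below are simplifying assumptions rather than the module proposed in the paper.

```python
import numpy as np

def scattering_features(A, x, J=3, q=(1, 2)):
    """First-order geometric scattering moments of graph signal x."""
    n = len(A)
    # lazy random walk operator (row-normalized variant, an assumption)
    P = 0.5 * (np.eye(n) + A / A.sum(axis=1)[:, None])
    pows = [P]                                # dyadic powers P^(2^j)
    for _ in range(J):
        pows.append(pows[-1] @ pows[-1])
    wavelets = [np.eye(n) - P] + \
               [pows[j - 1] - pows[j] for j in range(1, J + 1)]
    feats = [np.sum(np.abs(x) ** qq) for qq in q]   # zeroth-order moments
    for W in wavelets:
        u = np.abs(W @ x)                     # wavelet coefficient magnitudes
        feats += [np.sum(u ** qq) for qq in q]
    return np.array(feats)
```

Each wavelet isolates a dyadic band of diffusion scales, and summing absolute moments over nodes makes the features invariant to node permutations.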

1 code implementation • 14 Jul 2020 • Andrés F. Duque, Sacha Morin, Guy Wolf, Kevin R. Moon

Our regularization, based on the diffusion potential distances from the recently-proposed PHATE visualization method, encourages the learned latent representation to follow intrinsic data geometry, similar to manifold learning algorithms, while still enabling faithful extension to new data and reconstruction of data in the original feature space from latent coordinates.

no code implementations • 22 Jun 2020 • Victor Geadah, Giancarlo Kerg, Stefan Horoi, Guy Wolf, Guillaume Lajoie

Dynamic adaptation in single-neuron response plays a fundamental role in neural coding in biological neural networks.

no code implementations • 15 Jun 2020 • Jake S. Rhodes, Adele Cutler, Guy Wolf, Kevin R. Moon

We show, both qualitatively and quantitatively, the advantages of our approach in retaining local and global structures in data, while emphasizing important variables in the low-dimensional embedding.

1 code implementation • NeurIPS 2020 • Bastian Rieck, Tristan Yates, Christian Bock, Karsten Borgwardt, Guy Wolf, Nicholas Turk-Browne, Smita Krishnaswamy

We observe significant differences in both brain state trajectories and overall topological activity between adults and children watching the same movie.

2 code implementations • 12 Jun 2020 • Egbert Castro, Andrew Benz, Alexander Tong, Guy Wolf, Smita Krishnaswamy

We propose a geometric scattering autoencoder (GSAE) network for learning such graph embeddings.

1 code implementation • NeurIPS 2020 • Yimeng Min, Frederik Wenkel, Guy Wolf

Graph convolutional networks (GCNs) have shown promising results in processing graph data by extracting structure-aware features.

1 code implementation • 17 Feb 2020 • Mostafa ElAraby, Guy Wolf, Margarida Carvalho

We introduce a mixed integer program (MIP) for assigning importance scores to each neuron in deep neural network architectures which is guided by the impact of their simultaneous pruning on the main learning task of the network.

2 code implementations • ICML 2020 • Alexander Tong, Jessie Huang, Guy Wolf, David van Dijk, Smita Krishnaswamy

To address this issue, we establish a link between continuous normalizing flows and dynamic optimal transport, that allows us to model the expected paths of points over time.

no code implementations • 9 Jan 2020 • Stefan Horoi, Guillaume Lajoie, Guy Wolf

The efficiency of recurrent neural networks (RNNs) in dealing with sequential data has long been established.

no code implementations • 14 Nov 2019 • Michael Perlmutter, Feng Gao, Guy Wolf, Matthew Hirn

As a result, the proposed construction unifies and extends known theoretical results for many of the existing graph scattering architectures.

no code implementations • 25 Sep 2019 • Matthew Amodio, David van Dijk, Ruth Montgomery, Guy Wolf, Smita Krishnaswamy

While generative neural networks can learn to transform a specific input dataset into a specific target dataset, they require having just such a paired set of input/output datasets.

1 code implementation • 10 Jul 2019 • Nathan Brugnone, Alex Gonopolskiy, Mark W. Moyle, Manik Kuchroo, David van Dijk, Kevin R. Moon, Daniel Colon-Ramos, Guy Wolf, Matthew J. Hirn, Smita Krishnaswamy

Here, we consider multiple levels of abstraction via a multiresolution geometry of data points at different granularities.

no code implementations • 25 Jun 2019 • Andrés F. Duque, Guy Wolf, Kevin R. Moon

Manifold learning techniques for dynamical systems and time series have shown their utility for a broad spectrum of applications in recent years.

no code implementations • 26 May 2019 • Alexander Tong, Guy Wolf, Smita Krishnaswamy

We show that this procedure successfully detects unseen anomalies with guarantees on those that have a certain Wasserstein distance from the data or corrupted training set.

no code implementations • 24 May 2019 • Michael Perlmutter, Feng Gao, Guy Wolf, Matthew Hirn

The Euclidean scattering transform was introduced nearly a decade ago to improve the mathematical understanding of convolutional neural networks.

no code implementations • 17 May 2019 • Aude Forcione-Lambert, Guy Wolf, Guillaume Lajoie

We investigate the learned dynamical landscape of a recurrent neural network solving a simple task requiring the interaction of two memory mechanisms: long- and short-term.

no code implementations • ICLR 2019 • Jay S. Stanley III, Guy Wolf, Smita Krishnaswamy

We leverage this assumption to estimate relations between intrinsic manifold dimensions, which are given by diffusion map coordinates over each of the datasets.

no code implementations • ICLR 2019 • Alexander Tong, David van Dijk, Jay Stanley, Guy Wolf, Smita Krishnaswamy

First, we show a synthetic example that the graph-structured layer can reveal topological features of the data.

no code implementations • ICLR 2019 • Feng Gao, Guy Wolf, Matthew Hirn

Furthermore, ConvNets inspired recent advances in geometric deep learning, which aim to generalize these networks to graph data by applying notions from graph signal processing to learn deep graph filter cascades.

no code implementations • ICLR Workshop LLD 2019 • Daniel B. Burkhardt, Jay S. Stanley III, Ana Luisa Perdigoto, Scott A. Gigante, Kevan C. Herold, Guy Wolf, Antonio J. Giraldez, David van Dijk, Smita Krishnaswamy

Single-cell RNA-sequencing (scRNA-seq) is a powerful tool for analyzing biological systems.

no code implementations • 31 Jan 2019 • Scott Gigante, Jay S. Stanley III, Ngan Vu, David van Dijk, Kevin Moon, Guy Wolf, Smita Krishnaswamy

Diffusion maps are a commonly used kernel-based method for manifold learning, which can reveal intrinsic structures in data and embed them in low dimensions.
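A minimal diffusion maps sketch, for orientation: the Gaussian kernel, the fixed bandwidth eps, and the symmetric-conjugate eigendecomposition below are standard textbook choices, not the construction specific to this paper.

```python
import numpy as np

def diffusion_maps(X, eps=1.0, n_components=2, t=1):
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / eps)                     # Gaussian affinity kernel
    d = K.sum(axis=1)
    # symmetric conjugate of the Markov matrix P = D^-1 K,
    # so we can use a stable symmetric eigendecomposition
    M = K / np.sqrt(np.outer(d, d))
    vals, vecs = np.linalg.eigh(M)
    idx = np.argsort(vals)[::-1]
    vals, vecs = vals[idx], vecs[:, idx]
    psi = vecs / np.sqrt(d)[:, None]          # right eigenvectors of P
    # drop the trivial constant coordinate; scale by eigenvalue powers
    return psi[:, 1:n_components + 1] * vals[1:n_components + 1] ** t
```

The returned coordinates approximate diffusion distances after t steps of the random walk on the data graph.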

1 code implementation • 25 Jan 2019 • David van Dijk, Daniel Burkhardt, Matthew Amodio, Alex Tong, Guy Wolf, Smita Krishnaswamy

Here, we propose a reformulation of the problem such that the goal is to learn a non-linear transformation of the data into a latent archetypal space.

no code implementations • 15 Dec 2018 • Michael Perlmutter, Guy Wolf, Matthew Hirn

The Euclidean scattering transform was introduced nearly a decade ago to improve the mathematical understanding of the success of convolutional neural networks (ConvNets) in image data analysis and other tasks.

no code implementations • NeurIPS 2018 • Ofir Lindenbaum, Jay Stanley, Guy Wolf, Smita Krishnaswamy

We propose a new type of generative model for high-dimensional data that learns a manifold geometry of the data, rather than density, and can generate points evenly along this manifold.

no code implementations • ICLR 2019 • Feng Gao, Guy Wolf, Matthew Hirn

We explore the generalization of scattering transforms from traditional (e.g., image or audio) signals to graph data, analogous to the generalization of ConvNets in geometric deep learning, and the utility of extracted graph features in graph data analysis.

1 code implementation • ICLR 2019 • Alexander Tong, David van Dijk, Jay S. Stanley III, Matthew Amodio, Kristina Yim, Rebecca Muhle, James Noonan, Guy Wolf, Smita Krishnaswamy

Taking inspiration from spatial organization and localization of neuron activations in biological networks, we use a graph Laplacian penalty to structure the activations within a layer.
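A graph Laplacian penalty on a layer's activations is typically the quadratic form tr(Hᵀ L H), which sums the squared activation differences across edges of the neuron graph. The function name and the units-graph setup below are assumptions for illustration, not the paper's code.

```python
import numpy as np

def laplacian_activation_penalty(H, A):
    # H: (batch, units) activations; A: (units, units) adjacency over units
    # penalty = sum over the batch of h^T L h, i.e. smoothness of each
    # sample's activations across the neuron graph
    L = np.diag(A.sum(axis=1)) - A            # combinatorial graph Laplacian
    return np.einsum('bi,ij,bj->', H, L, H)
```

Adding this term to the training loss encourages neighboring units on the graph to fire similarly, structuring activations within the layer.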

no code implementations • 30 Sep 2018 • Jay S. Stanley III, Scott Gigante, Guy Wolf, Smita Krishnaswamy

We use this to relate the diffusion coordinates of each dataset through our assumption of partial feature correspondence.

no code implementations • 27 Sep 2018 • Scott Gigante, David van Dijk, Kevin R. Moon, Alexander Strzalkowski, Katie Ferguson, Guy Wolf, Smita Krishnaswamy

DyMoN is well-suited to the idiosyncrasies of biological data, including noise, sparsity, and the lack of longitudinal measurements in many types of systems.

1 code implementation • 14 Feb 2018 • Ofir Lindenbaum, Jay S. Stanley III, Guy Wolf, Smita Krishnaswamy

Then, it generates new points evenly along the manifold by pulling randomly generated points into its intrinsic structure using a diffusion kernel.
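The pulling step can be sketched as one application of a row-normalized Gaussian kernel from the generated points to the data, so each generated point becomes a kernel-weighted average of nearby data points; this is a simplified stand-in for the paper's procedure, with all names below assumed.

```python
import numpy as np

def pull_to_manifold(X, Y, eps=0.5):
    """Pull noisy candidate points Y toward the data manifold sampled by X."""
    d2 = ((Y[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    P = np.exp(-d2 / eps)                     # affinities from Y to X
    P /= P.sum(axis=1, keepdims=True)         # one diffusion step
    return P @ X   # each point becomes a kernel-weighted average of data
```

Points generated off the manifold are averaged back onto it, while points already near the data move little.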

no code implementations • 10 Feb 2018 • Scott Gigante, David van Dijk, Kevin Moon, Alexander Strzalkowski, Guy Wolf, Smita Krishnaswamy

In order to model the dynamics of such systems given snapshot data, or local transitions, we present a deep neural network framework we call Dynamics Modeling Network (DyMoN).

no code implementations • 19 Nov 2015 • Moshe Salhov, Amit Bermanis, Guy Wolf, Amir Averbuch

In this paper, we present a representation framework for data analysis of datasets that is based on a closed-form decomposition of the measure-based kernel.

Papers With Code is a free resource with all data licensed under CC-BY-SA.