no code implementations • 2 Oct 2023 • Emanuele Zappala, Daniel Levine, Sizhuang He, Syed Rizvi, Sacha Levy, David van Dijk
Deep neural networks, despite their success in numerous applications, often function without established theoretical foundations.
1 code implementation • 31 Jan 2023 • Antonio H. de O. Fonseca, Emanuele Zappala, Josue Ortega Caro, David van Dijk
Modeling spatiotemporal dynamical systems is a fundamental challenge in machine learning.
no code implementations • 17 Oct 2022 • Syed Asad Rizvi, Nhi Nguyen, Haoran Lyu, Benjamin Christensen, Josue Ortega Caro, Antonio H. O. Fonseca, Emanuele Zappala, Maryam Bagherian, Christopher Averill, Chadi G. Abdallah, Amin Karbasi, Rex Ying, Maria Brbic, Rahul Madhav Dhodapkar, David van Dijk
Foundation models have revolutionized the landscape of Deep Learning (DL), serving as versatile platforms that can be adapted to a wide range of downstream tasks.
1 code implementation • 30 Sep 2022 • Emanuele Zappala, Antonio Henrique de Oliveira Fonseca, Josue Ortega Caro, David van Dijk
In this paper, we introduce Neural Integral Equations (NIE), a method that learns an unknown integral operator from data through an IE solver.
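The solver at the core of this idea can be sketched numerically. The snippet below (assumptions: a fixed, hand-chosen kernel and source term rather than the learned neural operator of NIE, and a simple grid discretization) solves a Fredholm integral equation of the second kind by fixed-point (Picard) iteration, which is the kind of iterative IE solve the learned operator would be trained through:

```python
import numpy as np

# Solve the Fredholm integral equation of the second kind
#     y(t) = f(t) + lam * \int_0^1 K(t, s) y(s) ds
# by discretizing on a grid and iterating to the fixed point.
# In NIE the kernel would be a neural network trained end-to-end
# through such a solver; here K is a fixed kernel for illustration.

n = 100
t = np.linspace(0.0, 1.0, n)
dt = t[1] - t[0]

K = np.exp(-np.abs(t[:, None] - t[None, :]))  # example smooth kernel
f = np.sin(2 * np.pi * t)                     # source term
lam = 0.3                                     # keeps the iteration a contraction

y = f.copy()
for _ in range(200):                          # Picard iteration
    y_new = f + lam * (K @ y) * dt
    if np.max(np.abs(y_new - y)) < 1e-10:
        break
    y = y_new

# At the fixed point, the residual of the discretized equation is ~0.
residual = np.max(np.abs(y - (f + lam * (K @ y) * dt)))
```

Because `lam` times the discretized kernel has spectral radius below one, the iteration converges geometrically; making the kernel a trainable network is what turns this classical solver into a learnable model.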
1 code implementation • 28 Jun 2022 • Emanuele Zappala, Antonio Henrique de Oliveira Fonseca, Andrew Henry Moberly, Michael James Higley, Chadi Abdallah, Jessica Cardin, David van Dijk
Further, we show that NIDE can decompose dynamics into their Markovian and non-Markovian constituents via the learned integral operator, which we test on fMRI brain activity recordings of people on ketamine.
1 code implementation • NeurIPS Workshop TDA_and_Beyond 2020 • Arijit Sehanobish, Neal Ravindra, David van Dijk
In this work, we use a permutation invariant network to map samples from probability measures into a low-dimensional space such that the Euclidean distance between the encoded samples reflects the Wasserstein distance between probability measures.
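The key architectural ingredient is that the encoder pools over samples, so its output does not depend on the order in which points from a measure are presented. A minimal sketch (assumptions: random, untrained DeepSets-style weights; in the paper the weights are trained so that Euclidean distances between embeddings match Wasserstein distances):

```python
import numpy as np

rng = np.random.default_rng(0)

# Untrained DeepSets-style encoder: a pointwise map phi, mean pooling,
# then a map rho. The pooling step makes the embedding invariant to
# the ordering of the input samples.
d_in, d_hid, d_out = 2, 32, 8
W1 = rng.normal(size=(d_in, d_hid))
W2 = rng.normal(size=(d_hid, d_out))

def embed(samples):
    """samples: (n_points, d_in) array drawn from one probability measure."""
    h = np.tanh(samples @ W1)    # phi: pointwise feature map
    pooled = h.mean(axis=0)      # permutation-invariant pooling
    return np.tanh(pooled @ W2)  # rho: pooled features -> embedding

X = rng.normal(size=(50, d_in))
z1 = embed(X)
z2 = embed(X[rng.permutation(50)])  # same measure, shuffled sample order
```

Shuffling the rows of `X` leaves the embedding unchanged, which is exactly the invariance needed to treat a set of samples as a representation of the underlying measure.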
1 code implementation • 23 Jun 2020 • Arijit Sehanobish, Hector H. Corzo, Onur Kara, David van Dijk
Attempts to apply Neural Networks (NNs) to a wide range of research problems have been ubiquitous in recent literature.
1 code implementation • 23 Jun 2020 • Arijit Sehanobish, Neal G. Ravindra, David van Dijk
A molecular and cellular understanding of how SARS-CoV-2 variably infects and causes severe COVID-19 remains a bottleneck in developing interventions to end the pandemic.
1 code implementation • 23 Jun 2020 • Arijit Sehanobish, Neal G. Ravindra, David van Dijk
In recent years, there has been substantial work on incorporating edge features alongside node features for prediction tasks.
no code implementations • 20 Jun 2020 • Antonio H. O. Fonseca, David van Dijk
Word translation is an integral part of language translation.
1 code implementation • 14 Feb 2020 • Neal G. Ravindra, Arijit Sehanobish, Jenna L. Pappalardo, David A. Hafler, David van Dijk
To the best of our knowledge, this is the first effort to use graph attention, and deep learning in general, to predict disease state from single-cell data.
2 code implementations • ICML 2020 • Alexander Tong, Jessie Huang, Guy Wolf, David van Dijk, Smita Krishnaswamy
To address this issue, we establish a link between continuous normalizing flows and dynamic optimal transport, which allows us to model the expected paths of points over time.
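The dynamic (Benamou-Brenier) formulation of optimal transport penalizes the kinetic energy of the flow, and among all trajectories connecting the same endpoints, the straight constant-speed path minimizes that energy. A tiny sketch of this principle (assumptions: a single discretized particle trajectory in 2D, not an actual normalizing flow):

```python
import numpy as np

def kinetic_energy(path, dt):
    """Discrete Benamou-Brenier energy of one trajectory: sum ||v||^2 * dt."""
    v = np.diff(path, axis=0) / dt
    return np.sum(v ** 2) * dt

T, dt = 11, 0.1
ts = np.linspace(0.0, 1.0, T)
x0, x1 = np.array([0.0, 0.0]), np.array([1.0, 1.0])

straight = x0 + ts[:, None] * (x1 - x0)          # constant-speed geodesic
detour = straight + np.sin(np.pi * ts)[:, None]  # same endpoints, curved path
```

Penalizing this energy during training is what biases a continuous normalizing flow toward the straight, OT-like interpolation paths rather than arbitrary curved ones.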
no code implementations • 25 Sep 2019 • Matthew Amodio, David van Dijk, Ruth Montgomery, Guy Wolf, Smita Krishnaswamy
While generative neural networks can learn to transform a specific input dataset into a specific target dataset, they require exactly such a paired set of input/output datasets.
1 code implementation • 10 Jul 2019 • Nathan Brugnone, Alex Gonopolskiy, Mark W. Moyle, Manik Kuchroo, David van Dijk, Kevin R. Moon, Daniel Colon-Ramos, Guy Wolf, Matthew J. Hirn, Smita Krishnaswamy
Here, we consider multiple levels of abstraction via a multiresolution geometry of data points at different granularities.
no code implementations • ICLR 2019 • Alexander Tong, David van Dijk, Jay Stanley, Guy Wolf, Smita Krishnaswamy
First, we show on a synthetic example that the graph-structured layer can reveal topological features of the data.
no code implementations • ICLR Workshop LLD 2019 • Daniel B. Burkhardt, Jay S. Stanley III, Ana Luisa Perdigoto, Scott A. Gigante, Kevan C. Herold, Guy Wolf, Antonio J. Giraldez, David van Dijk, Smita Krishnaswamy
Single-cell RNA-sequencing (scRNA-seq) is a powerful tool for analyzing biological systems.
no code implementations • 31 Jan 2019 • Scott Gigante, Jay S. Stanley III, Ngan Vu, David van Dijk, Kevin Moon, Guy Wolf, Smita Krishnaswamy
Diffusion maps are a commonly used kernel-based method for manifold learning, which can reveal intrinsic structures in data and embed them in low dimensions.
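The standard construction behind diffusion maps can be sketched in a few lines. The snippet below (assumptions: a plain Gaussian kernel with a fixed bandwidth, random test data; real pipelines tune the bandwidth and often use anisotropic normalizations) builds the diffusion operator via its symmetric conjugate so a stable symmetric eigensolver can be used:

```python
import numpy as np

def diffusion_map(X, sigma=1.0, n_components=2, t=1):
    """Minimal diffusion map: Gaussian kernel -> diffusion operator -> eigenmaps."""
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-d2 / (2 * sigma ** 2))   # affinity kernel
    d = K.sum(axis=1)
    # Symmetric conjugate of the Markov matrix P = D^{-1} K, so eigh applies.
    A = K / np.sqrt(d[:, None] * d[None, :])
    evals, evecs = np.linalg.eigh(A)
    evals, evecs = evals[::-1], evecs[:, ::-1]  # descending eigenvalue order
    psi = evecs / np.sqrt(d)[:, None]           # right eigenvectors of P
    # Drop the trivial top eigenvector (eigenvalue 1, constant coordinate).
    return psi[:, 1:n_components + 1] * evals[1:n_components + 1] ** t

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 3))
Y = diffusion_map(X)
```

The diffusion time `t` powers the eigenvalues, so larger `t` suppresses fine-scale structure and emphasizes the dominant geometry of the data.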
1 code implementation • 25 Jan 2019 • David van Dijk, Daniel Burkhardt, Matthew Amodio, Alex Tong, Guy Wolf, Smita Krishnaswamy
Here, we propose a reformulation of the problem such that the goal is to learn a non-linear transformation of the data into a latent archetypal space.
1 code implementation • ICLR 2019 • Alexander Tong, David van Dijk, Jay S. Stanley III, Matthew Amodio, Kristina Yim, Rebecca Muhle, James Noonan, Guy Wolf, Smita Krishnaswamy
Taking inspiration from spatial organization and localization of neuron activations in biological networks, we use a graph Laplacian penalty to structure the activations within a layer.
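A graph Laplacian penalty on activations has a simple closed form: for an activation matrix H (batch x units) and a combinatorial Laplacian L over the units, tr(H L H^T) sums the squared activation differences across connected units. A minimal sketch (assumptions: a hand-built path graph over 5 units and synthetic activations, not the paper's trained network):

```python
import numpy as np

# Graph over 5 units arranged in a path; the penalty tr(H L H^T) sums
# squared differences of activations across connected units, encouraging
# neighboring units within a layer to activate similarly.
n_units = 5
A = np.zeros((n_units, n_units))
for i in range(n_units - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0   # path-graph adjacency
L = np.diag(A.sum(axis=1)) - A        # combinatorial graph Laplacian

def laplacian_penalty(H):
    """H: (batch, n_units) layer activations; returns tr(H L H^T) >= 0."""
    return np.trace(H @ L @ H.T)

rng = np.random.default_rng(0)
# Activations that vary smoothly across neighboring units vs. unstructured ones.
smooth = np.cumsum(rng.normal(scale=0.1, size=(8, n_units)), axis=1)
rough = rng.normal(size=(8, n_units))
```

Adding this term to the training loss therefore favors spatially coherent activation patterns, mirroring the localized activations seen in biological networks.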
no code implementations • 27 Sep 2018 • Scott Gigante, David van Dijk, Kevin R. Moon, Alexander Strzalkowski, Katie Ferguson, Guy Wolf, Smita Krishnaswamy
DyMoN is well-suited to the idiosyncrasies of biological data, including noise, sparsity, and the lack of longitudinal measurements in many types of systems.
no code implementations • 10 Feb 2018 • Scott Gigante, David van Dijk, Kevin Moon, Alexander Strzalkowski, Guy Wolf, Smita Krishnaswamy
In order to model the dynamics of such systems given snapshot data, or local transitions, we present a deep neural network framework, which we call Dynamics Modeling Network (DyMoN).
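The training signal for this kind of model is pairs of states and their observed successors harvested from snapshots. The sketch below (assumptions: a toy linear system and a least-squares fit standing in for DyMoN's deep, possibly stochastic transition network) shows how a transition function can be recovered from such local transitions alone:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy dynamical system: x_{t+1} = M x_t + noise. A framework like DyMoN
# learns a deep transition function from observed (x_t, x_{t+1}) pairs;
# here a linear least-squares fit stands in for the network to show the
# training signal extracted from snapshot data.
M = np.array([[0.9, -0.2],
              [0.1,  0.8]])

X = rng.normal(size=(500, 2))                        # observed states
Y = X @ M.T + rng.normal(scale=0.01, size=X.shape)   # observed next states

B, *_ = np.linalg.lstsq(X, Y, rcond=None)            # fits X @ B ~ Y, B ~ M.T
error = np.max(np.abs(B.T - M))                      # recovery error
```

With enough transition pairs, the fitted map closely recovers the true dynamics; the deep-network version generalizes this to nonlinear and stochastic transitions.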