1 code implementation • 30 May 2023 • Peiman Mohseni, Nick Duffield, Bani Mallick, Arman Hasanzadeh
Neural processes are a family of probabilistic models that inherit the flexibility of neural networks to parameterize stochastic processes.
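A minimal sketch of the neural-process idea described above, in the conditional-neural-process style: an encoder embeds context pairs, a permutation-invariant aggregation summarizes them, and a decoder outputs a predictive distribution at new inputs. Layer sizes and the mean-aggregation choice are illustrative assumptions, not the paper's exact model.

```python
# Conditional neural process sketch (illustrative assumptions, not the paper's model).
import torch
import torch.nn as nn


class ConditionalNeuralProcess(nn.Module):
    def __init__(self, x_dim=1, y_dim=1, r_dim=64):
        super().__init__()
        # Encoder: each (x, y) context pair -> representation r_i.
        self.encoder = nn.Sequential(
            nn.Linear(x_dim + y_dim, r_dim), nn.ReLU(), nn.Linear(r_dim, r_dim)
        )
        # Decoder: (aggregated r, target x) -> predictive mean and log-variance.
        self.decoder = nn.Sequential(
            nn.Linear(r_dim + x_dim, r_dim), nn.ReLU(), nn.Linear(r_dim, 2 * y_dim)
        )

    def forward(self, x_context, y_context, x_target):
        r_i = self.encoder(torch.cat([x_context, y_context], dim=-1))
        r = r_i.mean(dim=0, keepdim=True)            # permutation-invariant aggregation
        r = r.expand(x_target.size(0), -1)
        out = self.decoder(torch.cat([r, x_target], dim=-1))
        mean, log_var = out.chunk(2, dim=-1)
        return mean, torch.exp(0.5 * log_var)         # predictive mean and std


# Toy usage: condition on 10 observed points, predict at 5 new inputs.
model = ConditionalNeuralProcess()
xc, yc, xt = torch.randn(10, 1), torch.randn(10, 1), torch.randn(5, 1)
mu, sigma = model(xc, yc, xt)
```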
no code implementations • ICLR 2022 • Arman Hasanzadeh, Ehsan Hajiramezanali, Nick Duffield, Xiaoning Qian
Multi-omics data analysis has the potential to uncover hidden molecular interactions, revealing candidate regulatory and/or signal transduction pathways for cellular processes of interest in the study of life and disease systems.
no code implementations • 15 Dec 2021 • Arman Hasanzadeh, Mohammadreza Armandpour, Ehsan Hajiramezanali, Mingyuan Zhou, Nick Duffield, Krishna Narayanan
By learning distributional representations, we provide uncertainty estimates in downstream graph analytics tasks and increase the expressive power of the predictive model.
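A rough sketch of what learning distributional node representations can look like: one neighborhood-aggregation step followed by separate mean and log-variance heads, with a reparameterized sample providing a stochastic embedding and the variance serving as an uncertainty estimate. The normalization and two-head design here are assumptions for illustration, not the paper's exact architecture.

```python
# Gaussian node-embedding sketch (illustrative; not the paper's exact model).
import torch
import torch.nn as nn


class GaussianNodeEmbedding(nn.Module):
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.mu_head = nn.Linear(in_dim, out_dim)
        self.logvar_head = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        # One round of mean-neighborhood aggregation (row-normalized adjacency).
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        h = (adj @ x) / deg
        mu, log_var = self.mu_head(h), self.logvar_head(h)
        # Reparameterized sample; the variance gives a per-node uncertainty estimate.
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * log_var)
        return z, mu, log_var


# Toy graph: 4 nodes with 8-dimensional features.
x = torch.randn(4, 8)
adj = torch.tensor([[0, 1, 1, 0], [1, 0, 0, 1], [1, 0, 0, 1], [0, 1, 1, 0]], dtype=torch.float)
z, mu, log_var = GaussianNodeEmbedding(8, 16)(x, adj)
```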
1 code implementation • NeurIPS 2020 • Ehsan Hajiramezanali, Arman Hasanzadeh, Nick Duffield, Krishna R Narayanan, Xiaoning Qian
High-throughput molecular profiling technologies have produced high-dimensional multi-omics data, enabling systematic understanding of living systems at the genome scale.
1 code implementation • ICML 2020 • Arman Hasanzadeh, Ehsan Hajiramezanali, Shahin Boluki, Mingyuan Zhou, Nick Duffield, Krishna Narayanan, Xiaoning Qian
We propose a unified framework for adaptive connection sampling in graph neural networks (GNNs) that generalizes existing stochastic regularization methods for training GNNs.
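To make the connection-sampling idea concrete, here is a simplified single GNN layer that drops individual edge/feature connections at training time: masking whole edges recovers DropEdge-style regularization, while masking whole feature channels recovers dropout. A fixed drop rate is used for simplicity; the paper's framework learns the sampling rate adaptively.

```python
# Stochastic connection sampling in one GNN layer (simplified, fixed drop rate).
import torch
import torch.nn as nn


class ConnectionSamplingGNNLayer(nn.Module):
    def __init__(self, in_dim, out_dim, drop_rate=0.5):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)
        self.drop_rate = drop_rate

    def forward(self, x, edge_index):
        src, dst = edge_index                      # edge list: messages flow src -> dst
        msgs = x[src]                              # (num_edges, in_dim)
        if self.training:
            # Independent Bernoulli mask per edge and per feature channel.
            mask = (torch.rand_like(msgs) > self.drop_rate).float()
            msgs = msgs * mask / (1.0 - self.drop_rate)
        # Sum-aggregate the surviving messages at the destination nodes.
        out = torch.zeros_like(x).index_add_(0, dst, msgs)
        return torch.relu(self.linear(out))


# Toy usage: 3 nodes, 4 directed edges.
x = torch.randn(3, 8)
edge_index = torch.tensor([[0, 1, 2, 2], [1, 2, 0, 1]])
h = ConnectionSamplingGNNLayer(8, 16)(x, edge_index)
```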
1 code implementation • 16 Apr 2020 • Mostafa Karimi, Arman Hasanzadeh, Yang Shen
We have developed the first deep generative model for drug combination design by jointly embedding graph-structured domain knowledge and iteratively training a reinforcement learning-based chemical graph-set designer.
no code implementations • 28 Oct 2019 • Ehsan Hajiramezanali, Arman Hasanzadeh, Nick Duffield, Krishna Narayanan, Mingyuan Zhou, Xiaoning Qian
Stochastic recurrent neural networks with latent random variables of complex dependency structures have been shown to be more successful at modeling sequential data than deterministic deep models.
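A minimal sketch of one step of such a stochastic recurrent network, in the VRNN style: a latent variable is sampled from a posterior conditioned on the hidden state and the current input, and then fed with the input into the recurrent update. The Gaussian prior/posterior and layer sizes are assumptions for illustration.

```python
# One step of a stochastic RNN cell with a latent variable (VRNN-style sketch).
import torch
import torch.nn as nn


class StochasticRNNCell(nn.Module):
    def __init__(self, x_dim, z_dim, h_dim):
        super().__init__()
        self.prior = nn.Linear(h_dim, 2 * z_dim)              # p(z_t | h_{t-1})
        self.posterior = nn.Linear(h_dim + x_dim, 2 * z_dim)  # q(z_t | h_{t-1}, x_t)
        self.rnn = nn.GRUCell(x_dim + z_dim, h_dim)

    def forward(self, x_t, h_prev):
        prior_mu, prior_logvar = self.prior(h_prev).chunk(2, dim=-1)
        post_mu, post_logvar = self.posterior(torch.cat([h_prev, x_t], -1)).chunk(2, dim=-1)
        # Reparameterized sample from the approximate posterior.
        z_t = post_mu + torch.randn_like(post_mu) * torch.exp(0.5 * post_logvar)
        h_t = self.rnn(torch.cat([x_t, z_t], -1), h_prev)
        return h_t, (prior_mu, prior_logvar), (post_mu, post_logvar)


# One step on a batch of 2 sequences.
cell = StochasticRNNCell(x_dim=5, z_dim=3, h_dim=16)
h, prior_stats, post_stats = cell(torch.randn(2, 5), torch.zeros(2, 16))
```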
2 code implementations • NeurIPS 2019 • Ehsan Hajiramezanali, Arman Hasanzadeh, Nick Duffield, Krishna R. Narayanan, Mingyuan Zhou, Xiaoning Qian
Representation learning over graph-structured data has been studied mostly in static graph settings, while efforts to model dynamic graphs remain scant.
Ranked #2 on Dynamic Link Prediction on Enron Emails
1 code implementation • NeurIPS 2019 • Arman Hasanzadeh, Ehsan Hajiramezanali, Nick Duffield, Krishna R. Narayanan, Mingyuan Zhou, Xiaoning Qian
Compared to VGAE, the graph latent representations derived by SIG-VAE are more interpretable, owing to a more expressive generative model and more faithful inference enabled by the flexible semi-implicit construction.
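To illustrate the semi-implicit construction mentioned above: the Gaussian posterior's parameters are themselves random, produced from injected noise, so the marginal posterior becomes an implicit, highly flexible mixture rather than a single Gaussian. Network sizes and the way noise is injected below are assumptions for illustration, not SIG-VAE's exact architecture.

```python
# Semi-implicit variational posterior sketch (illustrative assumptions).
import torch
import torch.nn as nn


class SemiImplicitPosterior(nn.Module):
    def __init__(self, in_dim, noise_dim, z_dim, hidden=64):
        super().__init__()
        # Stochastic layer: features + injected noise -> random mean of q(z | psi).
        self.mu_net = nn.Sequential(
            nn.Linear(in_dim + noise_dim, hidden), nn.ReLU(), nn.Linear(hidden, z_dim)
        )
        # Per-input (deterministic) log-variance.
        self.logvar_net = nn.Linear(in_dim, z_dim)
        self.noise_dim = noise_dim

    def forward(self, h):
        eps = torch.randn(h.size(0), self.noise_dim)           # mixing noise
        mu = self.mu_net(torch.cat([h, eps], dim=-1))           # random mean
        log_var = self.logvar_net(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * log_var)
        return z, mu, log_var


# Two draws for the same node features: even the posterior means differ.
post = SemiImplicitPosterior(in_dim=16, noise_dim=8, z_dim=4)
h = torch.randn(5, 16)
z1, mu1, _ = post(h)
z2, mu2, _ = post(h)
```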
no code implementations • 3 Jul 2019 • Arman Hasanzadeh, Nagaraj T. Janakiraman, Vamsi K. Amalladinne, Krishna R. Narayanan
In this work, we leverage advances in sparse coding techniques to reduce the number of trainable parameters in a fully connected neural network.
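One way to picture this parameter reduction is to replace a dense weight matrix with a small dictionary multiplied by a mostly-zero coefficient matrix, so far fewer values are trained. The factorization and masking details below are assumptions for illustration, not the paper's exact construction.

```python
# Sparse, dictionary-style factorization of a fully connected layer (sketch).
import torch
import torch.nn as nn


class SparseFactorizedLinear(nn.Module):
    def __init__(self, in_dim, out_dim, n_atoms=16, sparsity=0.9):
        super().__init__()
        # Small dense dictionary of atoms.
        self.dictionary = nn.Parameter(torch.randn(n_atoms, in_dim) * 0.1)
        # Sparse coefficients: most entries are held at zero by a static mask.
        self.coeff = nn.Parameter(torch.randn(out_dim, n_atoms) * 0.1)
        self.register_buffer("mask", (torch.rand(out_dim, n_atoms) > sparsity).float())
        self.bias = nn.Parameter(torch.zeros(out_dim))

    def forward(self, x):
        weight = (self.coeff * self.mask) @ self.dictionary    # (out_dim, in_dim)
        return x @ weight.t() + self.bias


# Replaces a 784x256 dense layer (~200k weights) with ~16k factorized parameters.
layer = SparseFactorizedLinear(784, 256)
y = layer(torch.randn(32, 784))
```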
no code implementations • 19 Nov 2017 • Arman Hasanzadeh, Xi Liu, Nick Duffield, Krishna R. Narayanan, Byron Chigoy
Building a prediction model for transportation networks is challenging because the spatio-temporal dependencies of traffic data across different roads are complex and the graph constructed from the road network is very large.