1 code implementation • 31 Oct 2022 • Santiago Miret, Kin Long Kelvin Lee, Carmelo Gonzales, Marcel Nassar, Matthew Spellings
We present the Open MatSci ML Toolkit: a flexible, self-contained, and scalable Python-based framework for applying deep learning models and methods to scientific data, with a specific focus on materials science and the OpenCatalyst Dataset.
1 code implementation • 12 Sep 2023 • Kin Long Kelvin Lee, Carmelo Gonzales, Marcel Nassar, Matthew Spellings, Mikhail Galkin, Santiago Miret
We propose MatSci ML, a novel benchmark for modeling MATerials SCIence using Machine Learning (MatSci ML) methods focused on solid-state materials with periodic crystal structures.
1 code implementation • NeurIPS 2021 • Sami Abu-El-Haija, Hesham Mostafa, Marcel Nassar, Valentino Crespi, Greg Ver Steeg, Aram Galstyan
Recent improvements in the performance of state-of-the-art (SOTA) methods for Graph Representational Learning (GRL) have come at the cost of significant computational resource requirements for training, e.g., for calculating gradients via backpropagation over many data epochs.
1 code implementation • 29 Mar 2022 • Kourosh Hakhamaneshi, Marcel Nassar, Mariano Phielipp, Pieter Abbeel, Vladimir Stojanović
We show that pretraining GNNs on prediction of output node voltages can encourage learning representations that can be adapted to new unseen topologies or prediction of new circuit level properties with up to 10x more sample efficiency compared to a randomly initialized model.
1 code implementation • 18 Dec 2021 • Shengyu Feng, Subarna Tripathi, Hesham Mostafa, Marcel Nassar, Somdeb Majumdar
Dynamic scene graph generation from a video is challenging due to the temporal dynamics of the scene and the inherent temporal fluctuations of predictions.
no code implementations • NeurIPS 2017 • Urs Köster, Tristan J. Webb, Xin Wang, Marcel Nassar, Arjun K. Bansal, William H. Constable, Oğuz H. Elibol, Scott Gray, Stewart Hall, Luke Hornof, Amir Khosrowshahi, Carey Kloss, Ruby J. Pai, Naveen Rao
Here we present the Flexpoint data format, aiming at a complete replacement of the 32-bit floating point format for training and inference, designed to support modern deep network topologies without modification.
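The core idea behind formats like Flexpoint is a tensor of integer mantissas that share a single exponent. The sketch below is a hypothetical illustration of shared-exponent quantization in that spirit (the function names and the exponent-selection rule are assumptions for illustration, not the paper's exact scheme):

```python
import numpy as np

def flex_quantize(x, mantissa_bits=16):
    """Sketch: quantize a tensor to integer mantissas with one shared
    exponent, chosen so the largest magnitude fits the mantissa range."""
    max_abs = np.max(np.abs(x))
    if max_abs == 0:
        return np.zeros_like(x, dtype=np.int32), 0
    # Pick the exponent so max_abs lands near the top of the integer range.
    exp = int(np.ceil(np.log2(max_abs))) - (mantissa_bits - 1)
    scale = 2.0 ** exp
    mant = np.clip(np.round(x / scale),
                   -(2 ** (mantissa_bits - 1)),
                   2 ** (mantissa_bits - 1) - 1).astype(np.int32)
    return mant, exp

def flex_dequantize(mant, exp):
    """Recover approximate floating-point values from mantissas + exponent."""
    return mant.astype(np.float64) * (2.0 ** exp)

x = np.array([0.5, -1.25, 3.0])
mant, exp = flex_quantize(x)
x_hat = flex_dequantize(mant, exp)
```

Because all elements share one exponent, per-tensor storage and arithmetic can use cheap integer hardware, at the cost of dynamic range within a tensor.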
no code implementations • 7 Jun 2013 • Marcel Nassar, Philip Schniter, Brian L. Evans
We propose a novel receiver for orthogonal frequency division multiplexing (OFDM) transmissions in impulsive noise environments.
no code implementations • 17 Nov 2018 • Marcel Nassar
Recently, graph neural networks have been adopted in a wide variety of applications ranging from relational representations to modeling irregular data domains such as point clouds and social graphs.
no code implementations • 13 Dec 2018 • Marcel Nassar, Xin Wang, Evren Tumer
Thus, we refer to our model as the Conditional Graph Neural Process (CGNP).
no code implementations • ICLR 2020 • Marcel Nassar, Xin Wang, Evren Tumer
Graph neural networks have been adopted in numerous applications ranging from learning relational representations to modeling data on irregular domains such as point clouds, social graphs, and molecular structures.
no code implementations • 2 Mar 2020 • Hesham Mostafa, Marcel Nassar
The attention coefficients depend on the Euclidean distance between learnable node embeddings, and we show that the resulting attention-based global aggregation scheme is analogous to high-dimensional Gaussian filtering.
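A minimal sketch of this kind of distance-based global aggregation, assuming a simple squared-Euclidean logit with a softmax normalization (the exact kernel and parameterization in the paper may differ):

```python
import numpy as np

def distance_attention(embeddings, features):
    """Sketch: attention weights from pairwise Euclidean distances between
    node embeddings; aggregation then acts like a Gaussian-style filter."""
    # Pairwise squared Euclidean distances, shape (n, n).
    d2 = np.sum((embeddings[:, None, :] - embeddings[None, :, :]) ** 2, axis=-1)
    logits = -d2                                   # closer embeddings -> larger weight
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    att = np.exp(logits)
    att /= att.sum(axis=1, keepdims=True)          # rows sum to 1
    return att @ features                          # globally aggregated features

rng = np.random.default_rng(0)
emb = rng.normal(size=(5, 3))    # learnable node embeddings (here random)
feat = rng.normal(size=(5, 4))   # node features to aggregate
out = distance_attention(emb, feat)
```

Since the weight decays with embedding distance, this is the discrete analogue of filtering the features with a Gaussian kernel in the embedding space.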
no code implementations • 6 Apr 2021 • Daniel Cummings, Marcel Nassar
Academic citation graphs represent citation relationships between publications across the full range of academic fields.
no code implementations • 6 Jun 2021 • Hesham Mostafa, Marcel Nassar, Somdeb Majumdar
We also show that homophily is a poor measure of the information in a node's local neighborhood and propose the Neighborhood Information Content (NIC) metric, which is a novel information-theoretic graph metric.