no code implementations • 14 Dec 2023 • Sékou-Oumar Kaba, Siamak Ravanbakhsh
Using symmetry as an inductive bias in deep learning has proven to be a principled approach to sample-efficient model design.
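The inductive-bias idea can be illustrated with a minimal permutation-invariant model in the Deep Sets style: sum-pooling a shared per-element map makes the output independent of input ordering by construction. This is a generic sketch, not the paper's method; the feature map `phi` and its weights are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def phi(x):
    # Shared per-element feature map (hypothetical fixed linear layer + ReLU).
    W = np.array([[1.0, -0.5], [0.3, 2.0]])
    return np.maximum(W @ x, 0.0)

def invariant_model(X):
    # Sum-pooling over elements: the output is unchanged under any
    # permutation of the rows of X, so permutation symmetry is built in.
    return np.sum([phi(x) for x in X], axis=0)

X = rng.normal(size=(5, 2))
out1 = invariant_model(X)
out2 = invariant_model(X[::-1])  # same set, reversed order
assert np.allclose(out1, out2)
```

Because the symmetry holds for every weight setting, the model never has to spend samples learning it.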
no code implementations • 6 Sep 2023 • Daniel Levy, Sékou-Oumar Kaba, Carmelo Gonzales, Santiago Miret, Siamak Ravanbakhsh
We present a natural extension to E(n)-equivariant graph neural networks that uses multiple equivariant vectors per node.
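The key property behind "multiple equivariant vectors per node" can be sketched as follows: if a node carries k vector channels, a learned linear mix of those channels commutes with any rotation applied to the individual 3-D vectors, so the layer remains rotation-equivariant (translations are handled via relative positions in a full E(n)-equivariant model). The channel-mixing weights below are hypothetical, not the paper's parameterization.

```python
import numpy as np

rng = np.random.default_rng(1)

def mix_channels(V, W):
    # Linearly mix the k vector channels of one node.
    # (k, k) @ (k, 3) -> (k, 3); this commutes with right-multiplication
    # by any rotation, so equivariance is preserved.
    return W @ V

k = 4
V = rng.normal(size=(k, 3))   # k equivariant vectors for one node
W = rng.normal(size=(k, k))   # learned channel-mixing weights (hypothetical)

# Random rotation via QR decomposition of a Gaussian matrix.
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))

# Equivariance check: rotate-then-mix equals mix-then-rotate.
assert np.allclose(mix_channels(V @ Q.T, W), mix_channels(V, W) @ Q.T)
```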
no code implementations • 15 Nov 2022 • Sékou-Oumar Kaba, Siamak Ravanbakhsh
Supervised learning with deep models has tremendous potential for applications in materials science.
no code implementations • 11 Nov 2022 • Sékou-Oumar Kaba, Arnab Kumar Mondal, Yan Zhang, Yoshua Bengio, Siamak Ravanbakhsh
Symmetry-based neural networks often constrain the architecture in order to achieve invariance or equivariance to a group of transformations.
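One alternative to constraining the architecture is canonicalization: map each input to a canonical representative of its symmetry orbit, then apply an arbitrary unconstrained function. A minimal sketch for permutation symmetry, using sorting as a hand-designed canonicalization (the function `f` is a hypothetical stand-in for an unconstrained network):

```python
import numpy as np

def canonicalize(x):
    # Sorting maps every permutation of x to the same representative,
    # so whatever is applied afterwards is permutation-invariant.
    return np.sort(x)

def f(x):
    # Arbitrary, unconstrained function (hypothetical stand-in for a network).
    return np.tanh(x) @ np.arange(1, x.size + 1)

x = np.array([3.0, -1.0, 2.0, 0.5])
perm = np.array([2, 0, 3, 1])

# Invariance holds even though f itself is not symmetric.
assert np.isclose(f(canonicalize(x)), f(canonicalize(x[perm])))
```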
no code implementations • 29 Nov 2021 • Sékou-Oumar Kaba, Benjamin Groleau-Paré, Marc-Antoine Gauthier, André-Marie Tremblay, Simon Verret, Chloé Gauvin-Ndiaye
Crystal graph convolutional neural networks (CGCNN), MatErials Graph Network (MEGNet) models, and random forests are trained on the Materials Project database, which contains the results of high-throughput DFT calculations.
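The random-forest baseline in such a comparison can be sketched in a few lines. The features and target below are synthetic stand-ins for composition/structure descriptors and a DFT-computed property (e.g., formation energy); real inputs would be queried from the Materials Project.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)

# Synthetic descriptors and target (hypothetical, for illustration only).
X = rng.normal(size=(500, 8))
y = X[:, 0] - 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=500)

# Train/test split and fit, as in a high-throughput screening baseline.
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X[:400], y[:400])
r2 = model.score(X[400:], y[400:])  # held-out R^2
assert r2 > 0.3
```

Tree ensembles on tabular descriptors are a common reference point against which graph networks like CGCNN and MEGNet are measured.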
2 code implementations • NeurIPS 2021 • Mohammad Pezeshki, Sékou-Oumar Kaba, Yoshua Bengio, Aaron Courville, Doina Precup, Guillaume Lajoie
We identify and formalize a fundamental gradient descent phenomenon resulting in a learning proclivity in over-parameterized neural networks.
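The phenomenon (gradient starvation) can be illustrated with logistic regression on two features that are each perfectly predictive but differ in margin: the easier feature captures most of the gradient, so its weight grows much faster and the other feature is under-learned. The data below is a hypothetical construction, not the paper's experiment.

```python
import numpy as np

rng = np.random.default_rng(2)

# Two fully predictive features; feature 0 has a larger margin ("easier").
n = 200
y = rng.choice([-1.0, 1.0], size=n)
X = np.stack([2.0 * y, 0.5 * y], axis=1)

# Plain gradient descent on the logistic loss.
w = np.zeros(2)
lr = 0.1
for _ in range(200):
    margins = y * (X @ w)
    sig = 1.0 / (1.0 + np.exp(margins))
    grad = -(y[:, None] * X * sig[:, None]).mean(axis=0)
    w -= lr * grad

# The easy feature dominates the learned weights, starving the other
# feature of gradient signal even though it is equally predictive.
assert w[0] > 2 * w[1] > 0
```

Over-parameterized networks show the same proclivity: directions of the loss surface that are learned early suppress the gradient available to the rest.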
Ranked #1 on Out-of-Distribution Generalization on ImageNet-W