no code implementations • 17 Jun 2025 • Giulia Luise, Chin-wei Huang, Thijs Vogels, Derk P. Kooi, Sebastian Ehlert, Stephanie Lanius, Klaas J. H. Giesbertz, Amir Karton, Deniz Gunceler, Megan Stanley, Wessel P. Bruinsma, Lin Huang, Xinran Wei, José Garrido Torres, Abylay Katbashev, Rodrigo Chavez Zavaleta, Bálint Máté, Sékou-Oumar Kaba, Roberto Sordillo, Yingrong Chen, David B. Williams-Young, Christopher M. Bishop, Jan Hermann, Rianne van den Berg, Paola Gori-Giorgi
Skala achieves chemical accuracy for atomization energies of small molecules while retaining the computational efficiency typical of semi-local DFT.
no code implementations • 27 Mar 2025 • Hannah Lawrence, Vasco Portilheiro, Yan Zhang, Sékou-Oumar Kaba
However, equivariant networks cannot break symmetries: the output of an equivariant network must, by definition, have at least the same self-symmetries as the input.
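This constraint is easy to see concretely: if g·x = x for some group element g and f is equivariant, then g·f(x) = f(g·x) = f(x), so every self-symmetry of the input is forced onto the output. A minimal NumPy sketch (the map `f` below is a hypothetical stand-in for any permutation-equivariant network):

```python
import numpy as np

def f(x):
    # A simple permutation-equivariant map: f(P x) = P f(x) for any permutation P.
    return x + x.mean()

x = np.array([1.0, 2.0, 2.0])   # invariant under swapping positions 1 and 2
perm = [0, 2, 1]                # the stabilizing permutation g, with g.x = x
y = f(x)

assert np.allclose(x[perm], x)  # g is a self-symmetry of the input...
assert np.allclose(y[perm], y)  # ...so it must also be a self-symmetry of the output
```

No equivariant choice of `f` can map this input to an output that distinguishes positions 1 and 2, which is exactly the symmetry-breaking limitation the paper addresses.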
no code implementations • 13 Mar 2025 • Xiusi Li, Sékou-Oumar Kaba, Siamak Ravanbakhsh
We introduce a theoretical framework that quantifies the degree to which a causal model can be identified, given a set of possible interventions, up to an abstraction that describes the system at a higher level of granularity.
1 code implementation • 5 Feb 2025 • Daniel Levy, Siba Smarak Panigrahi, Sékou-Oumar Kaba, Qiang Zhu, Kin Long Kelvin Lee, Mikhail Galkin, Santiago Miret, Siamak Ravanbakhsh
Generating novel crystalline materials has the potential to lead to advancements in fields such as electronics, energy storage, and catalysis.
no code implementations • 14 Jan 2025 • Kusha Sareen, Daniel Levy, Arnab Kumar Mondal, Sékou-Oumar Kaba, Tara Akhound-Sadegh, Siamak Ravanbakhsh
Generative modeling of symmetric densities has a range of applications in AI for science, from drug discovery to physics simulations.
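One standard way to obtain a symmetric density is to average an arbitrary base density over the group orbit: for a finite group G, p_sym(x) = (1/|G|) Σ_g p(g·x) is invariant by construction. A small sketch for the permutation group, using an asymmetric Gaussian base density (the specific base density and group are illustrative assumptions, not the paper's method):

```python
import numpy as np
from itertools import permutations

def base_density(x, mu):
    # Unnormalized isotropic Gaussian centred at mu: not permutation-invariant.
    return np.exp(-0.5 * np.sum((np.asarray(x) - mu) ** 2))

def symmetrized_density(x, mu):
    # Average the base density over all coordinate permutations of x,
    # producing a permutation-invariant density.
    vals = [base_density(p, mu) for p in permutations(x)]
    return np.mean(vals)

mu = np.array([0.0, 1.0, 2.0])
x = np.array([0.3, 1.1, 1.9])
g_x = x[[2, 0, 1]]  # a permuted copy of x

# Invariance check: the symmetrized density agrees on the whole orbit.
assert np.isclose(symmetrized_density(x, mu), symmetrized_density(g_x, mu))
```

Orbit averaging scales as |G|, which is why practical methods for large or continuous groups rely on equivariant architectures or sampling instead of exact enumeration.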
no code implementations • 14 Dec 2023 • Sékou-Oumar Kaba, Siamak Ravanbakhsh
Using symmetry as an inductive bias in deep learning has proven to be a principled approach to sample-efficient model design.
no code implementations • 6 Sep 2023 • Daniel Levy, Sékou-Oumar Kaba, Carmelo Gonzales, Santiago Miret, Siamak Ravanbakhsh
We present a natural extension to E(n)-equivariant graph neural networks that uses multiple equivariant vectors per node.
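The core idea, multiple equivariant vector channels per node, can be illustrated with a toy update rule (this sketch is not the paper's architecture): mix the k channels with weights computed from rotation-invariant quantities such as the channels' Gram matrix, so the result remains equivariant.

```python
import numpy as np

def update(V):
    # V: (k, n) array holding k equivariant vector channels in R^n.
    G = V @ V.T          # (k, k) Gram matrix: invariant under V -> V R^T
    A = np.tanh(G)       # any mixing weights built from invariants work
    return A @ V         # a linear mix of equivariant vectors is equivariant

rng = np.random.default_rng(0)
V = rng.normal(size=(4, 3))              # k = 4 vector channels in R^3
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))  # a random orthogonal matrix

# Equivariance check: rotating the inputs rotates the outputs.
assert np.allclose(update(V @ Q.T), update(V) @ Q.T)
```

Because the mixing matrix depends only on invariants, update(V Rᵀ) = tanh(V Rᵀ R Vᵀ)(V Rᵀ) = update(V) Rᵀ, which is the algebraic reason the check passes.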
no code implementations • 15 Nov 2022 • Sékou-Oumar Kaba, Siamak Ravanbakhsh
Supervised learning with deep models has tremendous potential for applications in materials science.
no code implementations • 11 Nov 2022 • Sékou-Oumar Kaba, Arnab Kumar Mondal, Yan Zhang, Yoshua Bengio, Siamak Ravanbakhsh
Symmetry-based neural networks often constrain the architecture in order to achieve invariance or equivariance to a group of transformations.
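An alternative to constraining the architecture is to map every input to a canonical representative of its group orbit and then apply an unconstrained network; the composition is invariant even though the network itself is not. A minimal sketch for the permutation group, where sorting serves as the canonicalization (the fixed random MLP is a hypothetical stand-in for any unconstrained network):

```python
import numpy as np

def mlp(x):
    # An arbitrary unconstrained network; here a fixed random two-layer MLP.
    rng = np.random.default_rng(42)
    W1 = rng.normal(size=(5, x.size))
    W2 = rng.normal(size=(1, 5))
    return float(W2 @ np.tanh(W1 @ x))

def invariant_model(x):
    # Canonicalize first (sorting picks one representative per permutation
    # orbit), then apply the free network: the composition is invariant.
    return mlp(np.sort(x))

x = np.array([3.0, 1.0, 2.0, 0.5])
g_x = x[[2, 0, 3, 1]]  # a permuted copy of x

# Invariance check: both orbit members map to the same canonical input.
assert np.isclose(invariant_model(x), invariant_model(g_x))
```

Here the canonicalization is hand-picked; learning it, and doing so for groups where no simple sorting rule exists, is where the interesting modeling questions arise.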
no code implementations • 29 Nov 2021 • Sékou-Oumar Kaba, Benjamin Groleau-Paré, Marc-Antoine Gauthier, André-Marie Tremblay, Simon Verret, Chloé Gauvin-Ndiaye
Crystal graph convolutional neural networks (CGCNN), materials graph networks (MEGNet), and random forests are trained on the Materials Project database, which contains the results of high-throughput DFT calculations.
2 code implementations • NeurIPS 2021 • Mohammad Pezeshki, Sékou-Oumar Kaba, Yoshua Bengio, Aaron Courville, Doina Precup, Guillaume Lajoie
We identify and formalize a fundamental gradient descent phenomenon resulting in a learning proclivity in over-parameterized neural networks.
Ranked #1 on Out-of-Distribution Generalization on ImageNet-W