1 code implementation • 29 Nov 2024 • Cristian Gabellini, Nikhil Shenoy, Stephan Thaler, Semih Canturk, Daniel McNeela, Dominique Beaini, Michael Bronstein, Prudencio Tossou
Machine Learning Interatomic Potentials (MLIPs) are a highly promising alternative to force fields for molecular dynamics (MD) simulations, offering precise and rapid energy and force calculations.
1 code implementation • 29 Oct 2024 • Majdi Hassan, Nikhil Shenoy, Jungyoon Lee, Hannes Stark, Stephan Thaler, Dominique Beaini
Predicting low-energy molecular conformations given a molecular graph is an important but challenging task in computational drug discovery.
no code implementations • 10 Sep 2024 • Philip Fradkin, Puria Azadi, Karush Suri, Frederik Wenkel, Ali Bashashati, Maciej Sypetkowski, Dominique Beaini
Predicting molecular impact on cellular function is a core challenge in therapeutic design.
no code implementations • 17 Apr 2024 • Maciej Sypetkowski, Frederik Wenkel, Farimah Poursafaei, Nia Dickson, Karush Suri, Philip Fradkin, Dominique Beaini
However, structure-based architectures such as Graph Neural Networks (GNNs) have yet to show the benefits of scale, mainly due to the lower efficiency of sparse operations, large data requirements, and a lack of clarity about the effectiveness of various architectures.
1 code implementation • CVPR 2024 • Oren Kraus, Kian Kenyon-Dean, Saber Saberian, Maryam Fallah, Peter McLean, Jess Leung, Vasudev Sharma, Ayla Khan, Jia Balakrishnan, Safiye Celik, Dominique Beaini, Maciej Sypetkowski, Chi Vicky Cheng, Kristen Morse, Maureen Makes, Ben Mabey, Berton Earnshaw
Featurizing microscopy images for use in biological research remains a significant challenge, especially for large-scale experiments spanning millions of images.
2 code implementations • NeurIPS 2023 • Alexander Mathiasen, Hatem Helal, Kerstin Klaser, Paul Balanca, Josef Dean, Carlo Luschi, Dominique Beaini, Andrew Fitzgibbon, Dominic Masters
Similar benefits are yet to be unlocked for quantum chemistry, where the potential of deep learning is constrained by comparatively small datasets with 100k to 20M training examples.
no code implementations • 30 Oct 2023 • Nikhil Shenoy, Prudencio Tossou, Emmanuel Noutahi, Hadrien Mary, Dominique Beaini, Jiarui Ding
In the field of Machine Learning Interatomic Potentials (MLIPs), understanding the intricate relationship between data biases, specifically conformational and structural diversity, and model generalization is critical in improving the quality of Quantum Mechanics (QM) data generation efforts.
1 code implementation • 6 Oct 2023 • Dominique Beaini, Shenyang Huang, Joao Alex Cunha, Zhiyi Li, Gabriela Moisescu-Pareja, Oleksandr Dymov, Samuel Maddrell-Mander, Callum McLean, Frederik Wenkel, Luis Müller, Jama Hussein Mohamud, Ali Parviz, Michael Craig, Michał Koziarski, Jiarui Lu, Zhaocheng Zhu, Cristian Gabellini, Kerstin Klaser, Josef Dean, Cas Wognum, Maciej Sypetkowski, Guillaume Rabusseau, Reihaneh Rabbany, Jian Tang, Christopher Morris, Ioannis Koutis, Mirco Ravanelli, Guy Wolf, Prudencio Tossou, Hadrien Mary, Therence Bois, Andrew Fitzgibbon, Błażej Banaszewski, Chad Martin, Dominic Masters
Recently, pre-trained foundation models have enabled significant advancements in multiple fields.
1 code implementation • 14 Jul 2023 • Semih Cantürk, Renming Liu, Olivier Lapointe-Gagné, Vincent Létourneau, Guy Wolf, Dominique Beaini, Ladislav Rampášek
Positional and structural encodings (PSE) enable better identifiability of nodes within a graph, rendering them essential tools for empowering modern GNNs, and in particular graph Transformers.
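A minimal illustration of one widely used structural encoding, the random-walk diagonal (RWSE), is sketched below; the function name `rwse` is hypothetical and this is only a toy numpy version of the idea, not the model from the paper.

```python
import numpy as np

def rwse(adj, k=4):
    """Random-walk structural encoding: for each node, the return
    probabilities diag(P), diag(P^2), ..., diag(P^k), where
    P = D^{-1} A is the random-walk transition matrix."""
    deg = adj.sum(axis=1)
    P = adj / deg[:, None]           # row-normalised transition matrix
    enc, Pk = [], np.eye(len(adj))
    for _ in range(k):
        Pk = Pk @ P
        enc.append(np.diag(Pk))      # probability of returning after t steps
    return np.stack(enc, axis=1)     # shape: (num_nodes, k)

# A 4-cycle: every node has the same local structure,
# so all rows of the encoding are identical.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
out = rwse(A, k=2)
```

On this symmetric graph the encoding cannot distinguish nodes, which is exactly the identifiability limitation that richer learned encodings aim to overcome.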
1 code implementation • 6 Feb 2023 • Dominic Masters, Josef Dean, Kerstin Klaser, Zhiyi Li, Sam Maddrell-Mander, Adam Sanders, Hatem Helal, Deniz Beker, Andrew Fitzgibbon, Shenyang Huang, Ladislav Rampášek, Dominique Beaini
We present GPS++, a hybrid Message Passing Neural Network / Graph Transformer model for molecular property prediction.
1 code implementation • 27 Jan 2023 • Xiangyu Zhao, Hannes Stärk, Dominique Beaini, Yiren Zhao, Pietro Liò
Existing GNN benchmarking methods for molecular representation learning focus on comparing the GNNs' performances on some node/graph classification/regression tasks on certain datasets.
1 code implementation • 18 Nov 2022 • Dominic Masters, Josef Dean, Kerstin Klaser, Zhiyi Li, Sam Maddrell-Mander, Adam Sanders, Hatem Helal, Deniz Beker, Ladislav Rampášek, Dominique Beaini
This technical report presents GPS++, the first-place solution to the Open Graph Benchmark Large-Scale Challenge (OGB-LSC 2022) for the PCQM4Mv2 molecular property prediction task.
2 code implementations • 16 Jun 2022 • Vijay Prakash Dwivedi, Ladislav Rampášek, Mikhail Galkin, Ali Parviz, Guy Wolf, Anh Tuan Luu, Dominique Beaini
Graph Neural Networks (GNNs) that are based on the message passing (MP) paradigm generally exchange information between 1-hop neighbors to build node representations at each layer.
Ranked #3 on Link Prediction on PCQM-Contact
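The 1-hop exchange described above can be sketched in a few lines; this is a generic mean-aggregation layer with hypothetical names (`mp_layer`, `W_self`, `W_neigh`), not the paper's architecture.

```python
import numpy as np

def mp_layer(adj, h, W_self, W_neigh):
    """One message-passing round: each node averages its 1-hop
    neighbours' features and combines them with its own state.
    After L such layers a node has only seen its L-hop neighbourhood."""
    deg = adj.sum(axis=1, keepdims=True).clip(min=1)
    neigh = (adj @ h) / deg                 # mean over 1-hop neighbours
    return np.tanh(h @ W_self + neigh @ W_neigh)

rng = np.random.default_rng(0)
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)     # path graph 0-1-2
h = rng.normal(size=(3, 4))
W1, W2 = rng.normal(size=(4, 4)), rng.normal(size=(4, 4))
h_new = mp_layer(A, h, W1, W2)             # after one layer, nodes 0 and 2 have not exchanged any information
```

The comment on the last line is the crux: information propagates one hop per layer, which is why long-range interactions are hard for pure MP models.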
4 code implementations • 25 May 2022 • Ladislav Rampášek, Mikhail Galkin, Vijay Prakash Dwivedi, Anh Tuan Luu, Guy Wolf, Dominique Beaini
We propose a recipe on how to build a general, powerful, scalable (GPS) graph Transformer with linear complexity and state-of-the-art results on a diverse set of benchmarks.
Ranked #1 on Graph Property Prediction on ogbg-ppa
1 code implementation • NeurIPS Workshop AI4Scien 2021 • Hannes Stärk, Dominique Beaini, Gabriele Corso, Prudencio Tossou, Christian Dallago, Stephan Günnemann, Pietro Liò
Molecular property prediction is one of the fastest-growing applications of deep learning with critical real-world impacts.
no code implementations • 29 Sep 2021 • Hannes Stärk, Dominique Beaini, Gabriele Corso, Prudencio Tossou, Christian Dallago, Stephan Günnemann, Pietro Lio
Molecular property prediction is one of the fastest-growing applications of deep learning with critical real-world impacts.
1 code implementation • NeurIPS 2021 • Devin Kreuzer, Dominique Beaini, William L. Hamilton, Vincent Létourneau, Prudencio Tossou
Here, we present the $\textit{Spectral Attention Network}$ (SAN), which uses a learned positional encoding (LPE) that can take advantage of the full Laplacian spectrum to learn the position of each node in a given graph.
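The raw ingredient of SAN's learned positional encoding, the full Laplacian spectrum, is cheap to compute for small graphs; the sketch below shows only that eigendecomposition step (the learned Transformer that SAN applies on top is omitted, and `laplacian_spectrum` is a hypothetical name).

```python
import numpy as np

def laplacian_spectrum(adj):
    """Eigenvalues/eigenvectors of the symmetric normalised Laplacian
    L = I - D^{-1/2} A D^{-1/2}; node i is described by the pairs
    (eigenvalue_k, eigenvector_k[i]) over the full spectrum."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(deg)
    L = np.eye(len(adj)) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    evals, evecs = np.linalg.eigh(L)        # full spectrum, ascending order
    return evals, evecs

# 4-cycle: the normalised Laplacian spectrum is {0, 1, 1, 2}
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
evals, evecs = laplacian_spectrum(A)
```

Because `np.linalg.eigh` is O(n^3), the full spectrum is only practical for the graph sizes typical of molecules, which is the regime SAN targets.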
2 code implementations • 6 Oct 2020 • Dominique Beaini, Saro Passaro, Vincent Létourneau, William L. Hamilton, Gabriele Corso, Pietro Liò
Then, we propose the use of the Laplacian eigenvectors as such a vector field.
Ranked #2 on Node Classification on PATTERN 100k
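A toy version of aggregating "along" an eigenvector-defined field is sketched below: edge weights are taken from the field's gradient over edges, here with a simple absolute-value normalisation. This is a simplified illustration of the directional idea, not the paper's exact aggregators, and `directional_avg` is a hypothetical name.

```python
import numpy as np

def directional_avg(adj, h, field):
    """Directional aggregation sketch: weight each edge (i, j) by
    |field[j] - field[i]|, so messages preferentially flow along
    edges where the scalar field changes the most."""
    diff = np.abs(field[None, :] - field[:, None]) * adj   # |gradient| on edges
    w = diff / diff.sum(axis=1, keepdims=True).clip(min=1e-12)
    return w @ h

# Path graph 0-1-2-3; the Fiedler vector (first non-trivial Laplacian
# eigenvector) is monotone along the path, recovering its direction.
A = np.zeros((4, 4))
for i, j in [(0, 1), (1, 2), (2, 3)]:
    A[i, j] = A[j, i] = 1.0
L = np.diag(A.sum(1)) - A
evals, evecs = np.linalg.eigh(L)
fiedler = evecs[:, 1]
h = np.arange(4, dtype=float)[:, None]
out = directional_avg(A, h, fiedler)
```

Note that the absolute value makes the weights invariant to the eigenvector's arbitrary sign, a practical detail any eigenvector-based scheme has to handle.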
8 code implementations • NeurIPS 2020 • Gabriele Corso, Luca Cavalleri, Dominique Beaini, Pietro Liò, Petar Veličković
Graph Neural Networks (GNNs) have been shown to be effective models for different predictive tasks on graph-structured data.
Ranked #1 on Graph Regression on KIT
no code implementations • 11 Mar 2020 • Dominique Beaini, Sofiane Achiche, Maxime Raison
Current research in convolutional neural networks (CNN) focuses mainly on changing the architecture of the networks, optimizing the hyper-parameters and improving the gradient descent.
no code implementations • 11 Feb 2020 • Dominique Beaini, Sofiane Achiche, Alexandre Duperre, Maxime Raison
In recent years, there has been rapid progress in solving binary problems in computer vision, such as edge detection, which finds the boundaries of an image, and salient object detection, which finds the important object in an image.
no code implementations • 22 Aug 2019 • Dominique Beaini, Sofiane Achiche, Alexandre Duperré, Maxime Raison
The objective of this paper is to show that saliency convolutional neural networks (CNN) can be improved by using a Green's function convolution (GFC) to extrapolate edge features into salient regions.
no code implementations • 28 May 2019 • Emmanuel Noutahi, Dominique Beaini, Julien Horwood, Sébastien Giguère, Prudencio Tossou
We benchmark LaPool on molecular graph prediction and understanding tasks and show that it outperforms recent GNNs.
1 code implementation • 1 Feb 2019 • Dominique Beaini, Sofiane Achiche, Fabrice Nonez, Olivier Brochu Dufour, Cédric Leblond-Ménard, Mahdis Asaadi, Maxime Raison
The objective of this paper is to present a novel fast and robust method of solving the image gradient or Laplacian with minimal error, which can be used for gradient domain editing.
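Recovering an image from its Laplacian is a discrete Poisson problem; the toy sketch below solves it with a dense linear solve on a small grid, as a baseline for what the paper's fast method computes. The function name `solve_poisson` is hypothetical, and a direct solve like this is far slower than the paper's approach.

```python
import numpy as np

def solve_poisson(lap, boundary):
    """Recover an image u from its discrete 5-point Laplacian, given
    Dirichlet boundary values, by solving the linear system directly
    (practical only for tiny grids)."""
    n, m = lap.shape
    interior = [(i, j) for i in range(1, n - 1) for j in range(1, m - 1)]
    idx = {p: k for k, p in enumerate(interior)}
    A = np.zeros((len(idx), len(idx)))
    b = lap[1:-1, 1:-1].astype(float).ravel()
    for (i, j), k in idx.items():
        A[k, k] = -4.0                       # centre of the 5-point stencil
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            p = (i + di, j + dj)
            if p in idx:
                A[k, idx[p]] = 1.0
            else:
                b[k] -= boundary[p]          # move known boundary pixel to RHS
    u = boundary.astype(float).copy()
    x = np.linalg.solve(A, b)
    for (i, j), k in idx.items():
        u[i, j] = x[k]
    return u

# Sanity check: u(i, j) = i^2 + j^2 has discrete Laplacian 4 everywhere.
ii, jj = np.meshgrid(np.arange(6), np.arange(6), indexing="ij")
u_true = (ii**2 + jj**2).astype(float)
lap = np.full((6, 6), 4.0)
u_rec = solve_poisson(lap, u_true)           # boundary values taken from u_true
```

The direct solve costs O(N^3) in the number of interior pixels, which is why fast approximate or transform-based solvers matter for gradient-domain editing at image scale.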
no code implementations • 20 Jun 2018 • Dominique Beaini, Sofiane Achiche, Yann-Seing Law-Kam Cio, Maxime Raison
The objective of this paper is to present novel convolution kernels, based on principles of electromagnetic potentials and fields, for general use in computer vision, and to demonstrate their usage for shape and stroke analysis.
no code implementations • 4 Jun 2018 • Dominique Beaini, Sofiane Achiche, Fabrice Nonez, Maxime Raison
Hence, it becomes possible to generate a continuous space of probability based only on the edge information, thus bridging the gap between the edge-based methods and the region-based methods.