1 code implementation • 6 Oct 2024 • Hyeonah Kim, Minsu Kim, Taeyoung Yun, Sanghyeok Choi, Emmanuel Bengio, Alex Hernández-García, Jinkyoo Park
Although these approaches have shown promise in generating diverse and novel sequences, the limited training data relative to the vast search space often leads to a misspecified proxy model on out-of-distribution inputs.
no code implementations • 20 Oct 2023 • Alexandra Volokhova, Michał Koziarski, Alex Hernández-García, Cheng-Hao Liu, Santiago Miret, Pablo Lemos, Luca Thiede, Zichao Yan, Alán Aspuru-Guzik, Yoshua Bengio
Sampling diverse, thermodynamically feasible molecular conformations plays a crucial role in predicting properties of a molecule.
1 code implementation • 30 Jan 2023 • Salem Lahlou, Tristan Deleu, Pablo Lemos, Dinghuai Zhang, Alexandra Volokhova, Alex Hernández-García, Léna Néhale Ezzine, Yoshua Bengio, Nikolay Malkin
Generative flow networks (GFlowNets) are amortized variational inference algorithms that are trained to sample from unnormalized target distributions over compositional objects.
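To make the "sample proportionally to an unnormalized reward" idea concrete, here is a minimal, hedged pure-Python sketch of the trajectory-balance objective commonly used to train GFlowNets — not the paper's implementation. The toy environment builds a 2-bit string one bit at a time (a tree, so the backward policy is deterministic), and the reward R(x) = (number of ones + 1) is made up for illustration:

```python
import math

# Toy environment: construct a 2-bit string one bit at a time. The state
# graph is a tree, so the backward policy is deterministic and sum log P_B = 0.
def reward(x):
    # Hypothetical unnormalized reward for illustration.
    return x.count("1") + 1

terminals = ["00", "01", "10", "11"]
Z = sum(reward(x) for x in terminals)  # true partition function = 8

def tb_loss(log_Z, forward_probs, x):
    """Trajectory-balance loss for one complete trajectory ending in x.

    forward_probs maps (prefix, next_bit) -> P_F(next_bit | prefix).
    On a tree, the loss reduces to (log Z + sum log P_F - log R(x))^2.
    """
    log_pf = sum(math.log(forward_probs[(x[:i], x[i])]) for i in range(len(x)))
    return (log_Z + log_pf - math.log(reward(x))) ** 2

# A perfectly trained forward policy samples each x with probability R(x)/Z,
# which drives the trajectory-balance loss to zero for every trajectory.
perfect = {
    ("", "0"): 3 / 8, ("", "1"): 5 / 8,
    ("0", "0"): 1 / 3, ("0", "1"): 2 / 3,
    ("1", "0"): 2 / 5, ("1", "1"): 3 / 5,
}
uniform = {k: 0.5 for k in perfect}

for x in terminals:
    assert tb_loss(math.log(Z), perfect, x) < 1e-12   # zero at the optimum
assert tb_loss(math.log(Z), uniform, "11") > 0        # positive otherwise
```

In a real GFlowNet, `forward_probs` and `log_Z` would be learned parameters (e.g. a neural policy) and the squared loss above would be minimized by gradient descent over sampled trajectories.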
2 code implementations • 22 Nov 2022 • Alexandre Duval, Victor Schmidt, Santiago Miret, Yoshua Bengio, Alex Hernández-García, David Rolnick
Catalyst materials play a crucial role in the electrochemical reactions involved in numerous industrial processes key to the energy transition, such as renewable energy storage and electrofuel synthesis.
no code implementations • NeurIPS 2021 • Alex Hernández-García
The renaissance of artificial neural networks was catalysed by the success of classification models, tagged by the community with the broader term supervised learning.
no code implementations • 26 Jun 2019 • Alex Hernández-García, Peter König
As a matter of fact, convolutional neural networks for image object classification are typically trained with both data augmentation and explicit regularization, assuming the benefits of all techniques are complementary.
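As an illustration of the label-preserving data augmentation this line refers to, here is a minimal stdlib-only sketch (horizontal flip and random crop, two standard transformations for image classification) on a toy "image" of nested lists; names and the padding scheme are assumptions, not the paper's code:

```python
import random

def hflip(img):
    """Horizontally flip an image given as a list of rows."""
    return [list(reversed(row)) for row in img]

def random_crop(img, size, pad=1, rng=random):
    """Zero-pad the image by `pad` pixels on each side, then
    crop a random size x size patch (a common augmentation)."""
    h, w = len(img), len(img[0])
    padded = [[0] * (w + 2 * pad) for _ in range(h + 2 * pad)]
    for i in range(h):
        for j in range(w):
            padded[i + pad][j + pad] = img[i][j]
    top = rng.randrange(h + 2 * pad - size + 1)
    left = rng.randrange(w + 2 * pad - size + 1)
    return [row[left:left + size] for row in padded[top:top + size]]

img = [[1, 2], [3, 4]]
assert hflip(img) == [[2, 1], [4, 3]]
crop = random_crop(img, size=2)
assert len(crop) == 2 and len(crop[0]) == 2
```

Explicit regularizers such as weight decay or dropout would be applied on top of such transformations during training, which is the combination whose complementarity the paper questions.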
1 code implementation • 11 Jun 2019 • Alex Hernández-García, Peter König, Tim C. Kietzmann
Deep convolutional neural networks trained for image object categorization have shown remarkable similarities with representations found across the primate ventral visual stream.
2 code implementations • ICLR 2018 • Alex Hernández-García, Peter König
Although some (explicit) regularization techniques, such as weight decay and dropout, require costly fine-tuning of sensitive hyperparameters, their interplay with other elements that provide implicit regularization is not yet well understood.
1 code implementation • 20 Feb 2018 • Alex Hernández-García, Peter König
The impressive success of modern deep neural networks on computer vision tasks has been achieved through models of very large capacity compared to the number of available training examples.