Search Results for author: Alex Hernández-García

Found 8 papers, 5 papers with code

Towards equilibrium molecular conformation generation with GFlowNets

no code implementations · 20 Oct 2023 · Alexandra Volokhova, Michał Koziarski, Alex Hernández-García, Cheng-Hao Liu, Santiago Miret, Pablo Lemos, Luca Thiede, Zichao Yan, Alán Aspuru-Guzik, Yoshua Bengio

Sampling diverse, thermodynamically feasible molecular conformations plays a crucial role in predicting the properties of a molecule.

A theory of continuous generative flow networks

1 code implementation · 30 Jan 2023 · Salem Lahlou, Tristan Deleu, Pablo Lemos, Dinghuai Zhang, Alexandra Volokhova, Alex Hernández-García, Léna Néhale Ezzine, Yoshua Bengio, Nikolay Malkin

Generative flow networks (GFlowNets) are amortized variational inference algorithms that are trained to sample from unnormalized target distributions over compositional objects.
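As a toy illustration of this sampling objective (not the paper's method, which trains a neural policy), the goal can be shown exactly on a tiny tree of compositional objects: binary strings are built one bit at a time, flows are computed by dynamic programming instead of a learned amortized policy, and the resulting forward policy generates each string with probability proportional to its unnormalized reward.

```python
# Toy illustration of the GFlowNet sampling objective: draw compositional
# objects x with probability proportional to an unnormalized reward R(x).
# A real GFlowNet *trains* a neural forward policy (amortization); here,
# on a tiny tree, exact flows are computed by dynamic programming.
from itertools import product

LENGTH = 3  # build binary strings of length 3, one bit at a time

def reward(x):
    # Hypothetical unnormalized reward: favour strings with more 1s.
    return 1.0 + 2.0 * x.count("1")

def flow(state):
    # Flow through a state = total reward of all terminal objects below it.
    if len(state) == LENGTH:
        return reward(state)
    return flow(state + "0") + flow(state + "1")

def forward_policy(state):
    # P(child | state) = F(child) / F(state): the flow-matching condition.
    f0, f1 = flow(state + "0"), flow(state + "1")
    return {"0": f0 / (f0 + f1), "1": f1 / (f0 + f1)}

def trajectory_prob(x):
    # Probability of generating x = product of forward-policy steps.
    p, state = 1.0, ""
    for bit in x:
        p *= forward_policy(state)[bit]
        state += bit
    return p

Z = flow("")  # partition function: total flow out of the root
for x in ("".join(bits) for bits in product("01", repeat=LENGTH)):
    # Each object is generated with probability R(x) / Z, as desired.
    print(x, round(trajectory_prob(x), 4), round(reward(x) / Z, 4))
```

Because the state graph here is a tree, each object has a single trajectory and the backward policy is trivial; on general DAGs the learned forward and backward policies must be balanced, which is what the training objectives enforce.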

Variational Inference

PhAST: Physics-Aware, Scalable, and Task-specific GNNs for Accelerated Catalyst Design

2 code implementations · 22 Nov 2022 · Alexandre Duval, Victor Schmidt, Santiago Miret, Yoshua Bengio, Alex Hernández-García, David Rolnick

Catalyst materials play a crucial role in the electrochemical reactions involved in numerous industrial processes key to the transition to low-carbon energy, such as renewable energy storage and electrofuel synthesis.

Computational Efficiency

Rethinking supervised learning: insights from biological learning and from calling it by its name

no code implementations · NeurIPS 2021 · Alex Hernández-García

The renaissance of artificial neural networks was catalysed by the success of classification models, tagged by the community with the broader term supervised learning.

Self-Supervised Learning

Further advantages of data augmentation on convolutional neural networks

no code implementations · 26 Jun 2019 · Alex Hernández-García, Peter König

In practice, convolutional neural networks for image object classification are typically trained with both data augmentation and explicit regularization, under the assumption that the benefits of all these techniques are complementary.
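As a minimal sketch of the kind of light data augmentation discussed here (parameter choices are illustrative, not the paper's setup), random horizontal flips and random crops can be applied on the fly, so each epoch sees a fresh perturbed view of every training image rather than relying on weight decay or dropout:

```python
# Minimal sketch of on-the-fly data augmentation for image classifiers:
# random horizontal flips and random crops, in place of explicit
# regularizers such as weight decay or dropout. Pure Python for clarity;
# the crop size and flip probability are illustrative only.
import random

def horizontal_flip(image):
    # Mirror each row left-to-right.
    return [row[::-1] for row in image]

def random_crop(image, size, rng):
    # Cut a size x size window at a random position inside the image.
    h, w = len(image), len(image[0])
    top = rng.randrange(h - size + 1)
    left = rng.randrange(w - size + 1)
    return [row[left:left + size] for row in image[top:top + size]]

def augment(image, crop_size=3, flip_prob=0.5, rng=random):
    # Randomly flip, then crop: a fresh perturbed view on each call.
    if rng.random() < flip_prob:
        image = horizontal_flip(image)
    return random_crop(image, crop_size, rng)

# A tiny 4x4 "image"; each training pass would see a different 3x3 view.
image = [[r * 4 + c for c in range(4)] for r in range(4)]
view = augment(image, crop_size=3, rng=random.Random(0))
print(view)  # one random 3x3 augmented view
```

Because every transformed image remains a plausible sample of its class, augmentation acts as implicit regularization through the data itself, which is the contrast the paper draws with explicit regularizers.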

Data Augmentation

Learning robust visual representations using data augmentation invariance

1 code implementation · 11 Jun 2019 · Alex Hernández-García, Peter König, Tim C. Kietzmann

Deep convolutional neural networks trained for image object categorization have shown remarkable similarities with representations found across the primate ventral visual stream.

Data Augmentation · Object Categorization

Data augmentation instead of explicit regularization

2 code implementations · ICLR 2018 · Alex Hernández-García, Peter König

Although some (explicit) regularization techniques, such as weight decay and dropout, require costly fine-tuning of sensitive hyperparameters, their interplay with other elements that provide implicit regularization is not yet well understood.

Data Augmentation · Object Categorization

Do deep nets really need weight decay and dropout?

1 code implementation · 20 Feb 2018 · Alex Hernández-García, Peter König

The impressive success of modern deep neural networks on computer vision tasks has been achieved through models of very large capacity compared to the number of available training examples.

Data Augmentation · Object Recognition
