Search Results for author: Maksym Korablyov

Found 9 papers, 5 papers with code

Thompson sampling for improved exploration in GFlowNets

no code implementations · 30 Jun 2023 · Jarrid Rector-Brooks, Kanika Madan, Moksh Jain, Maksym Korablyov, Cheng-Hao Liu, Sarath Chandar, Nikolay Malkin, Yoshua Bengio

Generative flow networks (GFlowNets) are amortized variational inference algorithms that treat sampling from a distribution over compositional objects as a sequential decision-making problem with a learnable action policy.

Active Learning, Decision Making +3
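
As background for the snippet above: one simple way to realize Thompson-sampling-style exploration in a sequential sampler is to keep an ensemble of policy heads and, per sampled trajectory, act with one head drawn at random. This is a minimal hypothetical sketch, not the paper's code; PolicyEnsemble, state_dim, and n_actions are illustrative names.

import torch
import torch.nn as nn

class PolicyEnsemble(nn.Module):
    def __init__(self, state_dim: int, n_actions: int, n_heads: int = 4):
        super().__init__()
        self.trunk = nn.Sequential(nn.Linear(state_dim, 64), nn.ReLU())
        self.heads = nn.ModuleList(
            nn.Linear(64, n_actions) for _ in range(n_heads)
        )

    def forward(self, state: torch.Tensor, head: int) -> torch.Tensor:
        # Logits over forward actions for the chosen ensemble member.
        return self.heads[head](self.trunk(state))

def sample_action(model: PolicyEnsemble, state: torch.Tensor, head: int) -> int:
    # The randomly drawn head plays the role of a posterior sample.
    probs = torch.softmax(model(state, head), dim=-1)
    return torch.multinomial(probs, 1).item()

model = PolicyEnsemble(state_dim=8, n_actions=5)
head = torch.randint(len(model.heads), (1,)).item()  # one draw per episode
action = sample_action(model, torch.randn(8), head)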

Learning GFlowNets from partial episodes for improved convergence and stability

3 code implementations · 26 Sep 2022 · Kanika Madan, Jarrid Rector-Brooks, Maksym Korablyov, Emmanuel Bengio, Moksh Jain, Andrei Nica, Tom Bosc, Yoshua Bengio, Nikolay Malkin

Generative flow networks (GFlowNets) are a family of algorithms for training a sequential sampler of discrete objects under an unnormalized target density and have been successfully used for various probabilistic modeling tasks.
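
Training from partial episodes can be expressed as a balance condition over a subtrajectory s_m .. s_n: the learned flow into the subtrajectory, pushed forward by the policy, must match the flow at its end pushed backward. The following is a minimal, hypothetical sketch of such a subtrajectory-balance-style loss, assuming per-step log-probabilities are precomputed; all tensor names are illustrative stand-ins, not the paper's code.

import torch

def subtb_loss(log_F_m: torch.Tensor,    # learned log state flow at s_m
               log_F_n: torch.Tensor,    # learned log state flow at s_n
               log_pf: torch.Tensor,     # forward policy log-probs, shape (n - m,)
               log_pb: torch.Tensor) -> torch.Tensor:  # backward log-probs, shape (n - m,)
    # Balance condition in log space:
    #   log F(s_m) + sum log P_F  ==  log F(s_n) + sum log P_B
    lhs = log_F_m + log_pf.sum()
    rhs = log_F_n + log_pb.sum()
    return (lhs - rhs) ** 2

# Toy usage with random stand-in values.
loss = subtb_loss(torch.tensor(0.3), torch.tensor(0.1),
                  torch.rand(4).log(), torch.rand(4).log())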

Properties of Minimizing Entropy

no code implementations · 6 Dec 2021 · Xu Ji, Lena Nehale-Ezzine, Maksym Korablyov

Compact data representations are one approach for improving generalization of learned functions.

Flow Network based Generative Models for Non-Iterative Diverse Candidate Generation

4 code implementations · NeurIPS 2021 · Emmanuel Bengio, Moksh Jain, Maksym Korablyov, Doina Precup, Yoshua Bengio

Using insights from Temporal Difference learning, we propose GFlowNet, based on a view of the generative process as a flow network, making it possible to handle the tricky case where different trajectories can yield the same final state, e.g., there are many ways to sequentially add atoms to generate some molecular graph.
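
The flow-network view in the snippet above amounts to a conservation constraint: at every state, total flow in must equal the reward plus total flow out, which is what makes trajectories that reach the same state (e.g. the same molecular graph built in different orders) consistent. Below is a minimal hypothetical sketch of a flow-matching-style loss in log space; variable names are illustrative assumptions, not the released code.

import torch

def flow_matching_loss(log_in_flows: torch.Tensor,   # log flows on edges into state s
                       log_out_flows: torch.Tensor,  # log flows on edges out of state s
                       reward: torch.Tensor) -> torch.Tensor:
    # Conservation: sum(in) == R(s) + sum(out), compared in log space.
    inflow = torch.logsumexp(log_in_flows, dim=0)
    outflow = torch.logaddexp(reward.log().clamp(min=-100.0),  # R = 0 at interior states
                              torch.logsumexp(log_out_flows, dim=0))
    return (inflow - outflow) ** 2

# Toy usage: a state with 3 parents, 2 children, and reward 1.5.
loss = flow_matching_loss(torch.randn(3), torch.randn(2), torch.tensor(1.5))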

RetroGNN: Approximating Retrosynthesis by Graph Neural Networks for De Novo Drug Design

no code implementations · 25 Nov 2020 · Cheng-Hao Liu, Maksym Korablyov, Stanisław Jastrzębski, Paweł Włodarczyk-Pruszyński, Yoshua Bengio, Marwin H. S. Segler

A natural idea to mitigate the problem of de novo design proposing molecules that are hard to synthesize is to bias the search process towards more easily synthesizable molecules using a proxy for synthetic accessibility.

Retrosynthesis
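
Concretely, such a proxy can be folded into the search objective so that hard-to-make candidates are downweighted. This is a hypothetical sketch of that biasing step only; proxy_sa and task_score are assumed stand-in callables, not RetroGNN itself.

def biased_reward(mol, task_score, proxy_sa, weight: float = 1.0) -> float:
    # Higher proxy_sa(mol) = predicted easier to synthesize (assumed in [0, 1]).
    # The weight exponent controls how strongly synthesizability biases search.
    return task_score(mol) * proxy_sa(mol) ** weight

# Toy usage with dummy scorers on a SMILES string.
print(biased_reward("CCO", task_score=lambda m: 0.8, proxy_sa=lambda m: 0.9))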

Capsule networks for low-data transfer learning

no code implementations · 26 Apr 2018 · Andrew Gritsevskiy, Maksym Korablyov

We propose a capsule network-based architecture for generalizing learning to new data with few examples.

Transfer Learning
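
The snippet only names the approach; as background, here is a minimal sketch of the standard capsule "squash" nonlinearity (Sabour et al., 2017) that capsule architectures build on. This is generic capsule-network background, not code from the paper.

import torch

def squash(s: torch.Tensor, dim: int = -1, eps: float = 1e-8) -> torch.Tensor:
    # Shrinks short vectors toward zero and long vectors toward unit norm,
    # so a capsule's length can be read as an existence probability.
    sq_norm = (s ** 2).sum(dim=dim, keepdim=True)
    return (sq_norm / (1.0 + sq_norm)) * s / torch.sqrt(sq_norm + eps)

u = squash(torch.randn(10, 16))  # 10 capsules with 16-dim pose vectors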
