Search Results for author: Sergey Bartunov

Found 13 papers, 7 papers with code

Equilibrium Aggregation: Encoding Sets via Optimization

no code implementations • 25 Feb 2022 • Sergey Bartunov, Fabian B. Fuchs, Timothy Lillicrap

Processing sets or other unordered, potentially variable-sized inputs in neural networks is usually handled by aggregating a number of input tensors into a single representation.

Molecular Property Prediction • Property Prediction
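
The aggregation mentioned above is usually a fixed pooling operation (sum, mean, max); Equilibrium Aggregation instead defines the aggregate as the minimizer of a learned energy summed over the set. Below is a minimal NumPy sketch of this optimization-based view, using a hand-picked quadratic energy (an illustrative assumption, not the paper's learned energy); with this choice the minimizer is exactly mean pooling, showing how standard aggregators arise as special cases.

    import numpy as np

    def aggregate(xs, steps=200, lr=0.05):
        """Aggregate a set by minimizing sum_i E(x_i, y) over y via gradient descent."""
        # Illustrative energy: E(x, y) = ||y - x||^2 (the paper learns E instead).
        y = np.zeros_like(xs[0])
        for _ in range(steps):
            grad = sum(2.0 * (y - x) for x in xs)   # d/dy of the summed energy
            y = y - lr * grad
        return y

    xs = [np.array([1.0, 2.0]), np.array([3.0, 4.0]), np.array([5.0, 0.0])]
    print(aggregate(xs))                            # approaches the set mean [3., 2.]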

Learning a Large Neighborhood Search Algorithm for Mixed Integer Programs

1 code implementation • 21 Jul 2021 • Nicolas Sonnerat, Pengming Wang, Ira Ktena, Sergey Bartunov, Vinod Nair

Large Neighborhood Search (LNS) is a combinatorial optimization heuristic that starts with an assignment of values for the variables to be optimized, and iteratively improves it by searching a large neighborhood around the current assignment.

Combinatorial Optimization • Imitation Learning
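
For concreteness, here is a hedged sketch of the generic destroy-and-repair loop that LNS performs, applied to a toy problem of our own choosing (all names below are illustrative). The variable-selection step here is random; the paper's contribution is to replace it with a neural policy trained by imitation learning.

    import itertools, random

    def lns(variables, init, cost, repair, iters=200, k=2):
        """Generic LNS: repeatedly free k variables and re-solve them optimally."""
        best = dict(init)
        for _ in range(iters):
            freed = random.sample(variables, k)     # "destroy": pick variables to free
            cand = repair(best, freed)              # "repair": re-optimize the freed part
            if cost(cand) < cost(best):             # accept only improving moves
                best = cand
        return best

    # Toy problem: binary variables, cost counts disagreements with a hidden target.
    target = {v: random.randint(0, 1) for v in range(10)}
    cost = lambda a: sum(a[v] != target[v] for v in a)

    def repair(assign, freed):
        # Brute-force the freed variables while all others stay fixed.
        best_cand = None
        for vals in itertools.product([0, 1], repeat=len(freed)):
            cand = dict(assign)
            cand.update(zip(freed, vals))
            if best_cand is None or cost(cand) < cost(best_cand):
                best_cand = cand
        return best_cand

    init = {v: 0 for v in range(10)}
    print(cost(lns(list(range(10)), init, cost, repair)))  # typically reaches 0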

Computer-Aided Design as Language

no code implementations • NeurIPS 2021 • Yaroslav Ganin, Sergey Bartunov, Yujia Li, Ethan Keller, Stefano Saliceti

Computer-Aided Design (CAD) applications are used in manufacturing to model everything from coffee mugs to sports cars.

Language Modelling • Translation

Continuous Latent Search for Combinatorial Optimization

no code implementations • NeurIPS Workshop LMCA 2020 • Sergey Bartunov, Vinod Nair, Peter Battaglia, Tim Lillicrap

Combinatorial optimization problems are notoriously hard because they often require enumeration of the exponentially large solution space.

Combinatorial Optimization

Meta-Learning Deep Energy-Based Memory Models

no code implementations • ICLR 2020 • Sergey Bartunov, Jack W. Rae, Simon Osindero, Timothy P. Lillicrap

We study the problem of learning associative memory -- a system which is able to retrieve a remembered pattern based on its distorted or incomplete version.

Meta-Learning • Retrieval
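
As a concrete (and much simpler) instance of retrieval-as-energy-minimization, the sketch below uses a classical Hopfield network with Hebbian storage; the paper instead meta-learns a deep energy function, but the retrieval mechanism, descending an energy surface from a corrupted query to a stored pattern, is analogous.

    import numpy as np

    # Two orthogonal +/-1 patterns to store.
    patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                         [1,  1, 1,  1, -1, -1, -1, -1]], dtype=float)
    W = patterns.T @ patterns / patterns.shape[1]   # Hebbian weight matrix
    np.fill_diagonal(W, 0.0)

    def retrieve(query, sweeps=5):
        """Descend the Hopfield energy by asynchronous sign updates."""
        x = query.astype(float).copy()
        for _ in range(sweeps):
            for i in range(len(x)):
                x[i] = 1.0 if W[i] @ x >= 0 else -1.0
        return x

    corrupted = patterns[0].copy()
    corrupted[:2] *= -1                             # distort the query: flip two bits
    print(retrieve(corrupted))                      # recovers the stored patterns[0]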

Meta-Learning Neural Bloom Filters

no code implementations • ICLR 2019 • Jack W. Rae, Sergey Bartunov, Timothy P. Lillicrap

There has been a recent trend in training neural networks to replace hand-crafted data structures, aiming for faster execution, better accuracy, or greater compression.

Meta-Learning
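
For reference, this is the kind of hand-crafted structure being replaced: a minimal classical Bloom filter (sizes and hashing below are illustrative choices, not taken from the paper). It answers approximate set membership with possible false positives but no false negatives; the paper's learned variant aims for better compression when the data distribution has exploitable structure.

    import hashlib

    class BloomFilter:
        """Approximate set membership: false positives possible, no false negatives."""
        def __init__(self, m=1024, k=3):
            self.m, self.k = m, k                   # m bits, k hash functions
            self.bits = bytearray(m)

        def _indexes(self, item):
            for i in range(self.k):                 # salt the hash to get k functions
                digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
                yield int.from_bytes(digest[:8], "big") % self.m

        def add(self, item):
            for idx in self._indexes(item):
                self.bits[idx] = 1

        def __contains__(self, item):
            return all(self.bits[idx] for idx in self._indexes(item))

    bf = BloomFilter()
    bf.add("alpha")
    print("alpha" in bf, "beta" in bf)              # True, almost certainly False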

Assessing the Scalability of Biologically-Motivated Deep Learning Algorithms and Architectures

1 code implementation • NeurIPS 2018 • Sergey Bartunov, Adam Santoro, Blake A. Richards, Luke Marris, Geoffrey E. Hinton, Timothy Lillicrap

Here we present results on scaling up biologically motivated models of deep learning to datasets that require deep networks with appropriate architectures to achieve good performance.

Adaptive Cardinality Estimation

1 code implementation • 22 Nov 2017 • Oleg Ivanov, Sergey Bartunov

The experimental evaluation shows that this approach significantly improves the quality of cardinality estimation, and can therefore improve DBMS performance on some queries by several times, or even by dozens of times.

Fast Adaptation in Generative Models with Generative Matching Networks

no code implementations • 7 Dec 2016 • Sergey Bartunov, Dmitry P. Vetrov

Despite recent advances, the remaining bottlenecks in deep generative models are the necessity of extensive training and difficulty in generalizing from a small number of training examples.

Diversity • One-Shot Learning

One-shot Learning with Memory-Augmented Neural Networks

11 code implementations • 19 May 2016 • Adam Santoro, Sergey Bartunov, Matthew Botvinick, Daan Wierstra, Timothy Lillicrap

Despite recent breakthroughs in the applications of deep neural networks, one setting that presents a persistent challenge is that of "one-shot learning."

One-Shot Learning

Breaking Sticks and Ambiguities with Adaptive Skip-gram

3 code implementations • 25 Feb 2015 • Sergey Bartunov, Dmitry Kondrashkin, Anton Osokin, Dmitry Vetrov

The recently proposed Skip-gram model is a powerful method for learning high-dimensional word representations that capture rich semantic relationships between words.

Word Sense Induction
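
For contrast with the adaptive variant, here is standard Skip-gram trained with gensim (assuming gensim >= 4 is installed; the tiny corpus is illustrative). Plain Skip-gram assigns one vector per word form, so the two senses of "bank" below collapse into a single representation; AdaGram's contribution is to induce a separate vector per sense.

    from gensim.models import Word2Vec

    # Tiny illustrative corpus with an ambiguous word ("bank").
    sentences = [["the", "bank", "approved", "the", "loan"],
                 ["we", "sat", "on", "the", "river", "bank"]]

    model = Word2Vec(sentences, sg=1,               # sg=1 selects Skip-gram (not CBOW)
                     vector_size=50, window=3, min_count=1, epochs=50)

    print(model.wv["bank"].shape)                   # (50,): one vector per word form
    print(model.wv.most_similar("bank", topn=3))    # neighbors mix both senses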
