Search Results for author: Gary Bécigneul

Found 15 papers, 8 papers with code

Optimal Transport Graph Neural Networks

2 code implementations · 8 Jun 2020 · Benson Chen, Gary Bécigneul, Octavian-Eugen Ganea, Regina Barzilay, Tommi Jaakkola

Current graph neural network (GNN) architectures naively average or sum node embeddings into an aggregated graph representation -- potentially losing structural or semantic information.

 Ranked #1 on Graph Regression on Lipophilicity (using extra training data)

Drug Discovery · Graph Regression · +2
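For reference, the naive readout the abstract criticizes is just a sum or mean over node embeddings. The toy sketch below (plain NumPy, illustrative names) shows how two structurally different graphs can collapse to the same aggregated vector:

```python
import numpy as np

def naive_graph_readout(node_embeddings: np.ndarray, mode: str = "mean") -> np.ndarray:
    """Collapse a (num_nodes, dim) matrix of node embeddings into one
    graph-level vector by summing or averaging, discarding which node
    contributed what -- the information loss the abstract points to."""
    if mode == "sum":
        return node_embeddings.sum(axis=0)
    return node_embeddings.mean(axis=0)

# Two structurally different graphs yield the same representation:
g1 = np.array([[1.0, 0.0], [0.0, 1.0]])  # two distinct nodes
g2 = np.array([[0.5, 0.5], [0.5, 0.5]])  # two identical nodes
print(naive_graph_readout(g1), naive_graph_readout(g2))  # both [0.5 0.5]
```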

Crackovid: Optimizing Group Testing

no code implementations · 13 May 2020 · Louis Abraham, Gary Bécigneul, Bernhard Schölkopf

We study the problem usually referred to as group testing in the context of COVID-19.
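The snippet does not spell out the paper's optimized testing scheme; as a point of reference, here is a minimal sketch of classic adaptive binary-splitting group testing, with `pool_is_positive` standing in for a hypothetical noiseless pooled test:

```python
from typing import Callable, List, Sequence

def find_positives(samples: Sequence[int],
                   pool_is_positive: Callable[[Sequence[int]], bool]) -> List[int]:
    """Classic adaptive binary splitting (not necessarily the paper's scheme):
    test a pooled group; if negative, clear all members with one test,
    otherwise split in half and recurse."""
    if not samples:
        return []
    if not pool_is_positive(samples):
        return []
    if len(samples) == 1:
        return list(samples)
    mid = len(samples) // 2
    return (find_positives(samples[:mid], pool_is_positive)
            + find_positives(samples[mid:], pool_is_positive))

# Toy oracle: individuals 3 and 11 are infected.
infected = {3, 11}
oracle = lambda pool: any(s in infected for s in pool)
print(find_positives(range(16), oracle))  # [3, 11]
```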

Computationally Tractable Riemannian Manifolds for Graph Embeddings

1 code implementation · 20 Feb 2020 · Calin Cruceru, Gary Bécigneul, Octavian-Eugen Ganea

Representing graphs as sets of node embeddings in certain curved Riemannian manifolds has recently gained momentum in machine learning due to their desirable geometric inductive biases, e.g., hierarchical structures benefit from hyperbolic geometry.

BIG-bench Machine Learning
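To make the hyperbolic inductive bias concrete, here is the standard geodesic distance in the Poincaré ball model, one manifold in the family the paper studies; distances diverge near the boundary, which is what lets trees embed with low distortion:

```python
import numpy as np

def poincare_distance(x: np.ndarray, y: np.ndarray) -> float:
    """Geodesic distance in the Poincare ball model of hyperbolic space
    (points must satisfy ||x|| < 1). Distances blow up near the boundary."""
    sq = np.sum((x - y) ** 2)
    denom = (1.0 - np.sum(x ** 2)) * (1.0 - np.sum(y ** 2))
    return float(np.arccosh(1.0 + 2.0 * sq / denom))

root = np.array([0.0, 0.0])
leaf = np.array([0.95, 0.0])
print(poincare_distance(root, leaf))  # ~3.66, far larger than the Euclidean 0.95
```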

Practical Accelerated Optimization on Riemannian Manifolds

no code implementations · 11 Feb 2020 · Foivos Alimisis, Antonio Orvieto, Gary Bécigneul, Aurelien Lucchi

We develop a new Riemannian descent algorithm with an accelerated rate of convergence.

Optimization and Control
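The accelerated algorithm itself is not quoted here; the sketch below shows only the unaccelerated baseline such methods speed up, Riemannian gradient descent on the unit sphere, assuming the standard projection-plus-retraction recipe:

```python
import numpy as np

def riemannian_gd_sphere(grad_f, x0: np.ndarray, lr: float = 0.1, steps: int = 100):
    """Plain (non-accelerated) Riemannian gradient descent on the unit sphere:
    project the Euclidean gradient onto the tangent space at x, step,
    then retract back to the manifold by renormalizing."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(steps):
        g = grad_f(x)
        riem_grad = g - np.dot(g, x) * x   # tangent-space projection
        x = x - lr * riem_grad
        x = x / np.linalg.norm(x)          # retraction onto the sphere
    return x

# Minimize f(x) = x^T A x on the sphere: converges to the smallest eigenvector.
A = np.diag([3.0, 2.0, 1.0])
x_star = riemannian_gd_sphere(lambda x: 2.0 * A @ x, np.array([1.0, 1.0, 1.0]))
print(x_star)  # approximately (0, 0, +/-1)
```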

Mixed-curvature Variational Autoencoders

1 code implementation · ICLR 2020 · Ondrej Skopek, Octavian-Eugen Ganea, Gary Bécigneul

Euclidean geometry has historically been the typical "workhorse" for machine learning applications due to its power and simplicity.

Constant Curvature Graph Convolutional Networks

no code implementations · ICML 2020 · Gregor Bachmann, Gary Bécigneul, Octavian-Eugen Ganea

Interest has been rising lately towards methods representing data in non-Euclidean spaces, e.g. hyperbolic or spherical, that provide specific inductive biases useful for certain real-world data properties, e.g. scale-free, hierarchical or cyclical.

Node Classification

A Continuous-time Perspective for Modeling Acceleration in Riemannian Optimization

1 code implementation · 23 Oct 2019 · Foivos Alimisis, Antonio Orvieto, Gary Bécigneul, Aurelien Lucchi

We propose a novel second-order ODE as the continuous-time limit of a Riemannian accelerated gradient-based method on a manifold with curvature bounded from below.

Optimization and Control
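For intuition, the Euclidean instance of this perspective is the Su–Boyd–Candès ODE ẍ + (3/t)ẋ + ∇f(x) = 0, the known continuous-time limit of Nesterov's accelerated method. Below is a crude Euler integration of that Euclidean ODE; the paper's Riemannian ODE is not reproduced here:

```python
import numpy as np

def accelerated_flow(grad_f, x0: np.ndarray, dt: float = 1e-3, T: float = 10.0):
    """Semi-implicit Euler integration of  x'' + (3/t) x' + grad f(x) = 0,
    the Euclidean continuous-time limit of Nesterov acceleration.
    Integration starts at t = dt to avoid the singular damping at t = 0."""
    x, v, t = x0.astype(float), np.zeros_like(x0, dtype=float), dt
    while t < T:
        a = -(3.0 / t) * v - grad_f(x)  # acceleration prescribed by the ODE
        v += dt * a
        x += dt * v
        t += dt
    return x

# Quadratic objective f(x) = 0.5 * ||x||^2, so grad f(x) = x; the flow decays toward 0.
print(accelerated_flow(lambda x: x, np.array([5.0, -3.0])))
```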

Noise Contrastive Variational Autoencoders

no code implementations · 23 Jul 2019 · Octavian-Eugen Ganea, Yashas Annadani, Gary Bécigneul

We take steps towards understanding the "posterior collapse (PC)" difficulty in variational autoencoders (VAEs).
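Posterior collapse refers to the approximate posterior q(z|x) degenerating to the prior, so the latent code carries no information about the input. A common diagnostic (not the paper's noise-contrastive remedy) is the per-dimension Gaussian KL term, sketched below:

```python
import numpy as np

def kl_per_dimension(mu: np.ndarray, log_var: np.ndarray) -> np.ndarray:
    """KL( N(mu, sigma^2) || N(0, I) ) per latent dimension, averaged over a
    batch, using 0.5 * (mu^2 + sigma^2 - log sigma^2 - 1). Dimensions whose
    KL is ~0 have collapsed to the prior -- the PC phenomenon."""
    kl = 0.5 * (mu ** 2 + np.exp(log_var) - log_var - 1.0)
    return kl.mean(axis=0)

# Toy encoder outputs for a batch: dimension 0 is used, dimension 1 collapsed.
mu      = np.array([[ 1.2, 0.0], [-0.8, 0.0], [ 0.5, 0.0]])
log_var = np.array([[-1.0, 0.0], [-1.0, 0.0], [-1.0, 0.0]])
print(kl_per_dimension(mu, log_var))  # first entry > 0, second entry ~ 0
```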

Breaking the Softmax Bottleneck via Learnable Monotonic Pointwise Non-linearities

no code implementations · 21 Feb 2019 · Octavian-Eugen Ganea, Sylvain Gelly, Gary Bécigneul, Aliaksei Severyn

The Softmax function on top of a final linear layer is the de facto method to output probability distributions in neural networks.

Language Modelling · Text Generation
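The "de facto" construction the snippet describes is softmax(W h + b). The sketch below inserts a monotonic pointwise map on the logits in the spirit of the title; the exact parameterization is an illustrative assumption, not the paper's:

```python
import numpy as np

def softmax(z: np.ndarray) -> np.ndarray:
    z = z - z.max()                    # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

def monotone_pointwise(z: np.ndarray, slopes: np.ndarray, knots: np.ndarray) -> np.ndarray:
    """Illustrative learnable monotonic pointwise map: identity plus ReLU
    ramps with non-negative slopes is strictly increasing by construction.
    A stand-in for the paper's non-linearity, which is not specified here."""
    return z + sum(s * np.maximum(z - k, 0.0) for s, k in zip(slopes, knots))

rng = np.random.default_rng(0)
h = rng.standard_normal(16)            # final hidden state
W, b = rng.standard_normal((10, 16)), np.zeros(10)
logits = W @ h + b
p_plain = softmax(logits)              # the de facto output layer
p_bent = softmax(monotone_pointwise(logits, np.array([0.5, 1.5]), np.array([-1.0, 1.0])))
print(p_plain.argmax() == p_bent.argmax())  # True: a monotone map preserves the argmax
```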

Riemannian Adaptive Optimization Methods

1 code implementation · ICLR 2019 · Gary Bécigneul, Octavian-Eugen Ganea

Several first order stochastic optimization methods commonly used in the Euclidean domain such as stochastic gradient descent (SGD), accelerated gradient descent or variance reduced methods have already been adapted to certain Riemannian settings.

Riemannian optimization · Stochastic Optimization
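As a concrete instance, here is the basic retraction-based Riemannian SGD step on the Poincaré ball, where the inverse metric rescales the Euclidean gradient by ((1 − ||x||²)/2)². The paper's adaptive variants build on steps of this form and are not reproduced here:

```python
import numpy as np

def rsgd_step_poincare(x: np.ndarray, euclidean_grad: np.ndarray,
                       lr: float = 0.01, eps: float = 1e-5) -> np.ndarray:
    """One retraction-based Riemannian SGD step in the Poincare ball.
    The inverse of the conformal metric rescales the Euclidean gradient by
    ((1 - ||x||^2) / 2)^2 before the update."""
    scale = ((1.0 - np.sum(x ** 2)) / 2.0) ** 2
    x_new = x - lr * scale * euclidean_grad
    # Retraction: project back inside the open unit ball if the step overshoots.
    norm = np.linalg.norm(x_new)
    if norm >= 1.0:
        x_new = x_new / norm * (1.0 - eps)
    return x_new

x = np.array([0.3, 0.4])
print(rsgd_step_poincare(x, np.array([1.0, -2.0])))
```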

Hyperbolic Neural Networks

3 code implementations · NeurIPS 2018 · Octavian-Eugen Ganea, Gary Bécigneul, Thomas Hofmann

However, the representational power of hyperbolic geometry is not yet on par with Euclidean geometry, mostly because of the absence of corresponding hyperbolic neural network layers.

Graph Representation Learning · Natural Language Inference · +2
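One ingredient of such hyperbolic layers is Möbius addition, the Poincaré-ball replacement for vector addition (written here for curvature −1):

```python
import numpy as np

def mobius_add(x: np.ndarray, y: np.ndarray) -> np.ndarray:
    """Mobius addition on the Poincare ball (curvature -1), the hyperbolic
    stand-in for '+': e.g. a bias b is applied to activations h as
    mobius_add(h, b) instead of h + b."""
    xy = np.dot(x, y)
    x2, y2 = np.sum(x ** 2), np.sum(y ** 2)
    num = (1.0 + 2.0 * xy + y2) * x + (1.0 - x2) * y
    den = 1.0 + 2.0 * xy + x2 * y2
    return num / den

h = np.array([0.2, 0.1])
b = np.array([0.4, -0.3])
print(mobius_add(h, b))            # stays inside the unit ball
print(mobius_add(h, np.zeros(2)))  # adding zero returns h (identity element)
```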

Hyperbolic Entailment Cones for Learning Hierarchical Embeddings

3 code implementations · ICML 2018 · Octavian-Eugen Ganea, Gary Bécigneul, Thomas Hofmann

Learning graph representations via low-dimensional embeddings that preserve relevant network properties is an important class of problems in machine learning.

Graph Embedding · Hypernym Discovery · +2
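The entailment cones of the title live in hyperbolic space, with a cone aperture that depends on the apex's norm. The sketch below is only a toy Euclidean analogue of the cone-membership test; the fixed half-aperture is an illustrative assumption:

```python
import numpy as np

def in_cone(x: np.ndarray, y: np.ndarray, half_aperture: float = 0.6) -> bool:
    """Toy Euclidean analogue of an entailment-cone membership test: y is
    'entailed by' x when y - x lies within a cone opening away from the
    origin along x. The paper's cones are hyperbolic with a norm-dependent
    aperture; the fixed angle here is only for illustration."""
    d = y - x
    cos_angle = np.dot(d, x) / (np.linalg.norm(d) * np.linalg.norm(x))
    return np.arccos(np.clip(cos_angle, -1.0, 1.0)) <= half_aperture

parent = np.array([0.3, 0.0])
child  = np.array([0.6, 0.1])  # further from the origin, roughly along x
print(in_cone(parent, child))  # True
print(in_cone(child, parent))  # False: entailment is asymmetric
```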

On the effect of pooling on the geometry of representations

no code implementations · 20 Mar 2017 · Gary Bécigneul

In machine learning and neuroscience, certain computational structures and algorithms are known to yield disentangled representations without us understanding why, the most striking examples being perhaps convolutional neural networks and the ventral stream of the visual cortex in humans and primates.
