2 code implementations • 8 Jun 2020 • Benson Chen, Gary Bécigneul, Octavian-Eugen Ganea, Regina Barzilay, Tommi Jaakkola
Current graph neural network (GNN) architectures naively average or sum node embeddings into an aggregated graph representation -- potentially losing structural or semantic information.
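The "naive" aggregation this abstract criticizes can be sketched in a few lines. The helper below is purely illustrative (not the paper's code): it shows how sum or mean pooling collapses a set of node embeddings into one vector, and how two structurally different graphs can thereby receive identical representations.

```python
import numpy as np

def naive_readout(node_embeddings, mode="mean"):
    """Aggregate node embeddings into a single graph vector.

    Illustrative sketch of the 'naive' pooling the abstract refers to:
    summing or averaging discards which node contributed what.
    """
    H = np.asarray(node_embeddings)  # shape (num_nodes, dim)
    if mode == "sum":
        return H.sum(axis=0)
    return H.mean(axis=0)

# Two different embedding sets collapse to the same representation:
g1 = [[1.0, 0.0], [0.0, 1.0]]
g2 = [[0.5, 0.5], [0.5, 0.5]]
print(naive_readout(g1))  # [0.5 0.5]
print(naive_readout(g2))  # [0.5 0.5] -- identical despite different nodes
```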
Ranked #1 on Graph Regression on Lipophilicity (using extra training data)
no code implementations • 13 May 2020 • Louis Abraham, Gary Bécigneul, Bernhard Schölkopf
We study the problem usually referred to as group testing in the context of COVID-19.
1 code implementation • 20 Feb 2020 • Calin Cruceru, Gary Bécigneul, Octavian-Eugen Ganea
Representing graphs as sets of node embeddings in certain curved Riemannian manifolds has recently gained momentum in machine learning due to their desirable geometric inductive biases, e.g., hierarchical structures benefit from hyperbolic geometry.
no code implementations • 11 Feb 2020 • Foivos Alimisis, Antonio Orvieto, Gary Bécigneul, Aurelien Lucchi
We develop a new Riemannian descent algorithm with an accelerated rate of convergence.
Optimization and Control
1 code implementation • ICLR 2020 • Ondrej Skopek, Octavian-Eugen Ganea, Gary Bécigneul
Euclidean geometry has historically been the typical "workhorse" for machine learning applications due to its power and simplicity.
no code implementations • ICML 2020 • Gregor Bachmann, Gary Bécigneul, Octavian-Eugen Ganea
Interest has recently been rising in methods that represent data in non-Euclidean spaces, e.g. hyperbolic or spherical, which provide inductive biases suited to certain real-world data properties, e.g. scale-free, hierarchical, or cyclical structure.
1 code implementation • 23 Oct 2019 • Foivos Alimisis, Antonio Orvieto, Gary Bécigneul, Aurelien Lucchi
We propose a novel second-order ODE as the continuous-time limit of a Riemannian accelerated gradient-based method on a manifold with curvature bounded from below.
Optimization and Control
no code implementations • 23 Jul 2019 • Octavian-Eugen Ganea, Yashas Annadani, Gary Bécigneul
We take steps towards understanding the "posterior collapse (PC)" difficulty in variational autoencoders (VAEs).
no code implementations • ICLR 2019 • Yannic Kilcher, Gary Bécigneul, Thomas Hofmann
We develop our method for fully-connected as well as convolutional layers.
no code implementations • 21 Feb 2019 • Octavian-Eugen Ganea, Sylvain Gelly, Gary Bécigneul, Aliaksei Severyn
The Softmax function on top of a final linear layer is the de facto method to output probability distributions in neural networks.
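The de facto setup the abstract mentions can be sketched as follows; this is a generic, numerically stable softmax (not the paper's proposed alternative), applied to the logits a final linear layer would produce.

```python
import numpy as np

def softmax(logits):
    """Numerically stable softmax over a final layer's logits."""
    z = logits - np.max(logits)  # shift for stability; output unchanged
    e = np.exp(z)
    return e / e.sum()

probs = softmax(np.array([2.0, 1.0, 0.1]))
print(probs)  # a valid distribution: non-negative entries summing to 1
```

Subtracting the maximum logit before exponentiating avoids overflow without changing the resulting distribution, since softmax is invariant to adding a constant to all logits.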
1 code implementation • 15 Oct 2018 • Alexandru Tifrea, Gary Bécigneul, Octavian-Eugen Ganea
Words are not created equal.
1 code implementation • ICLR 2019 • Gary Bécigneul, Octavian-Eugen Ganea
Several first-order stochastic optimization methods commonly used in the Euclidean domain, such as stochastic gradient descent (SGD), accelerated gradient descent, or variance-reduced methods, have already been adapted to certain Riemannian settings.
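The basic recipe behind such Riemannian adaptations can be illustrated on the unit sphere (a toy manifold, not one treated specially in the paper): project the Euclidean gradient onto the tangent space at the current point, take a gradient step there, then retract back onto the manifold, here simply by renormalizing.

```python
import numpy as np

def riemannian_gd_sphere(grad_f, x0, lr=0.1, steps=100):
    """Minimal Riemannian gradient descent on the unit sphere
    (illustrative sketch, not the paper's algorithm)."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(steps):
        g = grad_f(x)
        g_tan = g - np.dot(g, x) * x   # project onto tangent space at x
        x = x - lr * g_tan             # Euclidean step in tangent direction
        x = x / np.linalg.norm(x)      # retraction back onto the sphere
    return x

# Minimizing f(x) = x^T A x over the sphere converges to the eigenvector
# associated with the smallest eigenvalue of A.
A = np.diag([3.0, 2.0, 1.0])
x_star = riemannian_gd_sphere(lambda x: 2 * A @ x, np.array([1.0, 1.0, 1.0]))
print(x_star)  # approximately +/-[0, 0, 1]
```

Full Riemannian methods replace the normalization retraction with the manifold's exponential map (or a cheaper retraction) and transport momentum terms between tangent spaces, which is where adapting accelerated or adaptive methods gets subtle.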
3 code implementations • NeurIPS 2018 • Octavian-Eugen Ganea, Gary Bécigneul, Thomas Hofmann
However, the representational power of hyperbolic geometry is not yet on par with Euclidean geometry, mostly because of the absence of corresponding hyperbolic neural network layers.
3 code implementations • ICML 2018 • Octavian-Eugen Ganea, Gary Bécigneul, Thomas Hofmann
Learning graph representations via low-dimensional embeddings that preserve relevant network properties is an important class of problems in machine learning.
Ranked #1 on Link Prediction on WordNet
no code implementations • 20 Mar 2017 • Gary Bécigneul
In machine learning and neuroscience, certain computational structures and algorithms are known to yield disentangled representations without us understanding why, the most striking examples being perhaps convolutional neural networks and the ventral stream of the visual cortex in humans and primates.