no code implementations • 7 Jun 2024 • Virginia Aglietti, Ira Ktena, Jessica Schrouff, Eleni Sgouritsa, Francisco J. R. Ruiz, Alan Malek, Alexis Bellot, Silvia Chiappa
The sample efficiency of Bayesian optimization algorithms depends on carefully crafted acquisition functions (AFs) guiding the sequential collection of function evaluations.
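The role an acquisition function plays can be illustrated with a minimal expected-improvement sketch (this is a generic textbook AF, not code or notation from the paper; all names are illustrative):

```python
import math

def norm_pdf(z):
    """Standard normal density."""
    return math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2)))

def expected_improvement(mu, sigma, best_y):
    """Expected improvement (for minimization) at a candidate point,
    given the surrogate model's posterior mean `mu` and standard
    deviation `sigma`, and the best observed value `best_y`."""
    if sigma <= 0.0:
        return max(best_y - mu, 0.0)
    z = (best_y - mu) / sigma
    return sigma * (z * norm_cdf(z) + norm_pdf(z))
```

An optimizer would score every candidate with this function and evaluate the objective where the score is highest; learned AFs replace this closed form with a trained model.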
no code implementations • 22 Feb 2024 • Francisco J. R. Ruiz, Tuomas Laakkonen, Johannes Bausch, Matej Balog, Mohammadamin Barekatain, Francisco J. H. Heras, Alexander Novikov, Nathan Fitzpatrick, Bernardino Romera-Paredes, John van de Wetering, Alhussein Fawzi, Konstantinos Meichanetzidis, Pushmeet Kohli
A key challenge in realizing fault-tolerant quantum computers is circuit optimization.
2 code implementations • Nature 2022 • Alhussein Fawzi, Matej Balog, Aja Huang, Thomas Hubert, Bernardino Romera-Paredes, Mohammadamin Barekatain, Alexander Novikov, Francisco J. R. Ruiz, Julian Schrittwieser, Grzegorz Swirszcz, David Silver, Demis Hassabis, Pushmeet Kohli
Particularly relevant is the case of 4 × 4 matrices in a finite field, where AlphaTensor's algorithm improves on Strassen's two-level algorithm for the first time, to our knowledge, since its discovery 50 years ago.

1 code implementation • 11 Jun 2021 • Xiaohui Chen, Xu Han, Jiajing Hu, Francisco J. R. Ruiz, LiPing Liu
A graph generative model defines a distribution over graphs.
1 code implementation • NeurIPS 2020 • Lorenz Richter, Ayman Boustati, Nikolas Nüsken, Francisco J. R. Ruiz, Ömer Deniz Akyildiz
We analyse the properties of an unbiased gradient estimator of the ELBO for variational inference, based on the score function method with leave-one-out control variates.
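The estimator analysed here can be sketched for a one-dimensional Gaussian variational distribution (a minimal illustration of the score-function-plus-leave-one-out idea, not the paper's implementation; names are illustrative):

```python
import random

def score_loo_grad(f, mu, sigma, num_samples=5000, seed=0):
    """Score-function (REINFORCE) estimate of d/dmu E_q[f(x)] for
    q = N(mu, sigma^2), using a leave-one-out control variate: each
    sample's baseline is the mean of f over the *other* samples,
    which reduces variance while keeping the estimator unbiased."""
    rng = random.Random(seed)
    xs = [rng.gauss(mu, sigma) for _ in range(num_samples)]
    fs = [f(x) for x in xs]
    total = sum(fs)
    grad = 0.0
    for x, fx in zip(xs, fs):
        baseline = (total - fx) / (num_samples - 1)  # leave-one-out mean
        score = (x - mu) / sigma ** 2                # d/dmu log q(x)
        grad += score * (fx - baseline)
    return grad / num_samples
```

A useful sanity check: for a constant integrand the leave-one-out baseline cancels every term exactly, so the estimate is identically zero rather than merely zero in expectation.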
no code implementations • 5 Oct 2020 • Francisco J. R. Ruiz, Michalis K. Titsias, Taylan Cemgil, Arnaud Doucet
The variational auto-encoder (VAE) is a deep latent variable model that has two neural networks in an autoencoder-like architecture; one of them parameterizes the model's likelihood.
no code implementations • 7 Sep 2020 • Michalis K. Titsias, Francisco J. R. Ruiz, Sotirios Nikoloutsopoulos, Alexandre Galashov
We formulate meta learning using information theoretic concepts; namely, mutual information and the information bottleneck.
2 code implementations • 9 Oct 2019 • Adji B. Dieng, Francisco J. R. Ruiz, David M. Blei, Michalis K. Titsias
Generative adversarial networks (GANs) are a powerful approach to unsupervised learning.
Ranked #2 on Image Generation on Stacked MNIST
1 code implementation • 12 Jul 2019 • Adji B. Dieng, Francisco J. R. Ruiz, David M. Blei
Topic modeling analyzes documents to learn meaningful patterns of words.
11 code implementations • TACL 2020 • Adji B. Dieng, Francisco J. R. Ruiz, David M. Blei
To this end, we develop the Embedded Topic Model (ETM), a generative model of documents that marries traditional topic models with word embeddings.
Ranked #4 on Topic Models on AG News
2 code implementations • 10 May 2019 • Francisco J. R. Ruiz, Michalis K. Titsias
We develop a method to combine Markov chain Monte Carlo (MCMC) and variational inference (VI), leveraging the advantages of both inference approaches.
no code implementations • 7 Nov 2018 • Maryam Fatemi, Karl Granström, Lennart Svensson, Francisco J. R. Ruiz, Lars Hammarstrand
The proposed method can handle uncertainties in the data associations and the cardinality of the set of landmarks, and is parallelizable, making it suitable for large-scale problems.
no code implementations • 18 Oct 2018 • Francisco J. R. Ruiz, Isabel Valera, Lennart Svensson, Fernando Perez-Cruz
New communication standards need to deal with machine-to-machine communications, in which users may start or stop transmitting at any time in an asynchronous manner.
1 code implementation • 6 Aug 2018 • Michalis K. Titsias, Francisco J. R. Ruiz
We develop unbiased implicit variational inference (UIVI), a method that expands the applicability of variational inference by defining an expressive variational family.
1 code implementation • ICML 2018 • Francisco J. R. Ruiz, Michalis K. Titsias, Adji B. Dieng, David M. Blei
It maximizes a lower bound on the marginal likelihood of the data.
3 code implementations • 9 Nov 2017 • Francisco J. R. Ruiz, Susan Athey, David M. Blei
We develop SHOPPER, a sequential probabilistic model of shopping data.
2 code implementations • 18 Oct 2016 • Christian A. Naesseth, Francisco J. R. Ruiz, Scott W. Linderman, David M. Blei
Variational inference using the reparameterization trick has enabled large-scale approximate Bayesian inference in complex probabilistic models, leveraging stochastic optimization to sidestep intractable expectations.
no code implementations • NeurIPS 2016 • Francisco J. R. Ruiz, Michalis K. Titsias, David M. Blei
The reparameterization gradient has become a widely used method to obtain Monte Carlo gradients to optimize the variational objective.
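The standard reparameterization gradient the abstract refers to can be sketched for a Gaussian, where sampling z ~ N(mu, sigma^2) is rewritten as z = mu + sigma * eps with eps ~ N(0, 1), so the gradient passes through the sample (a generic illustration of the trick, not this paper's generalized estimator; names are illustrative):

```python
import random

def reparam_grad_mu(f_grad, mu, sigma, num_samples=100_000, seed=0):
    """Pathwise (reparameterization) estimate of d/dmu E[f(z)] for
    z = mu + sigma * eps, eps ~ N(0, 1). By the chain rule,
    d/dmu f(mu + sigma * eps) = f'(z), so we average f' over samples."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(num_samples):
        z = mu + sigma * rng.gauss(0.0, 1.0)
        total += f_grad(z)
    return total / num_samples
```

For f(z) = z^2 the true gradient is d/dmu E[(mu + sigma*eps)^2] = 2*mu, which the estimator recovers; the paper's contribution is extending this idea beyond distributions that admit such a differentiable sampling path.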
no code implementations • NeurIPS 2016 • Maja R. Rudolph, Francisco J. R. Ruiz, Stephan Mandt, David M. Blei
In this paper, we develop exponential family embeddings, a class of methods that extends the idea of word embeddings to other types of high-dimensional data.
no code implementations • 3 Mar 2016 • Francisco J. R. Ruiz, Michalis K. Titsias, David M. Blei
Instead of taking samples from the variational distribution, we use importance sampling to take samples from an overdispersed distribution in the same exponential family as the variational approximation.
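The overdispersed-sampling idea can be sketched for a Gaussian variational approximation, where the proposal is a member of the same family with its scale inflated and importance weights correct the mismatch (a minimal illustration under these assumptions, not the paper's estimator; names are illustrative):

```python
import math
import random

def overdispersed_expectation(f, mu, sigma, tau=2.0,
                              num_samples=100_000, seed=0):
    """Estimate E_q[f(z)] for q = N(mu, sigma^2) by importance
    sampling from an overdispersed proposal r = N(mu, (tau*sigma)^2)
    in the same (Gaussian) family. Weights w = q(z) / r(z) correct
    for sampling from r instead of q."""
    rng = random.Random(seed)

    def log_normal(z, m, s):
        return -0.5 * ((z - m) / s) ** 2 - math.log(s * math.sqrt(2 * math.pi))

    total_w = 0.0
    total_wf = 0.0
    for _ in range(num_samples):
        z = rng.gauss(mu, tau * sigma)
        w = math.exp(log_normal(z, mu, sigma) - log_normal(z, mu, tau * sigma))
        total_w += w
        total_wf += w * f(z)
    return total_wf / total_w  # self-normalized importance-sampling estimate
```

The wider proposal visits the tails of q more often, which is where score-function gradient terms tend to be large; the weights keep the estimate consistent for the original variational distribution.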
no code implementations • 29 Jan 2014 • Francisco J. R. Ruiz, Isabel Valera, Carlos Blanco, Fernando Perez-Cruz
To this end, we use the large amount of information collected in the National Epidemiologic Survey on Alcohol and Related Conditions (NESARC) database and propose to model these data using a nonparametric latent model based on the Indian Buffet Process (IBP).