Search Results for author: Bruno Mlodozeniec

Found 7 papers, 3 papers with code

Warm Start Marginal Likelihood Optimisation for Iterative Gaussian Processes

no code implementations · 28 May 2024 · Jihao Andreas Lin, Shreyas Padhy, Bruno Mlodozeniec, José Miguel Hernández-Lobato

Gaussian processes are a versatile probabilistic machine learning model whose effectiveness often depends on good hyperparameters, which are typically learned by maximising the marginal likelihood.
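As context for the abstract above: "maximising the marginal likelihood" means choosing kernel hyperparameters that maximise $\log p(\mathbf{y} \mid X) = -\tfrac{1}{2}\mathbf{y}^\top K^{-1}\mathbf{y} - \tfrac{1}{2}\log|K| - \tfrac{n}{2}\log 2\pi$. A minimal NumPy sketch (not the paper's iterative method — the paper avoids this direct Cholesky factorisation; the RBF kernel, toy data, and grid search here are illustrative assumptions):

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale, variance):
    # Squared-exponential kernel: k(x, x') = variance * exp(-||x - x'||^2 / (2 * lengthscale^2))
    sq_dists = (np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :]
                - 2.0 * X1 @ X2.T)
    return variance * np.exp(-0.5 * sq_dists / lengthscale**2)

def log_marginal_likelihood(X, y, lengthscale, variance, noise):
    # log p(y | X) = -1/2 y^T K^-1 y - 1/2 log|K| - n/2 log(2 pi),
    # with K = K_f + noise * I.
    n = len(y)
    K = rbf_kernel(X, X, lengthscale, variance) + noise * np.eye(n)
    L = np.linalg.cholesky(K)                # O(n^3) direct factorisation
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.sum(np.log(np.diag(L)))     # 1/2 log|K| via Cholesky diagonal
            - 0.5 * n * np.log(2.0 * np.pi))

rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(30, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(30)

# Grid search over lengthscales as a stand-in for gradient-based optimisation.
candidates = [0.1, 0.5, 1.0, 2.0]
best = max(candidates,
           key=lambda ls: log_marginal_likelihood(X, y, ls, 1.0, 0.01))
```

The paper's contribution concerns iterative (matrix-free) GPs, where each marginal-likelihood evaluation itself requires an inner solver run; warm-starting that solver across hyperparameter steps is what this direct-factorisation sketch sidesteps.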

Denoising Diffusion Probabilistic Models in Six Simple Steps

no code implementations · 6 Feb 2024 · Richard E. Turner, Cristiana-Diana Diaconu, Stratis Markou, Aliaksandra Shysheya, Andrew Y. K. Foong, Bruno Mlodozeniec

Denoising Diffusion Probabilistic Models (DDPMs) are a popular class of deep generative models that have been successfully applied to a diverse range of problems, including image and video generation, protein and material synthesis, weather forecasting, and neural surrogates of partial differential equations.
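For readers unfamiliar with DDPMs: the forward process gradually corrupts data with Gaussian noise, and the model learns to reverse it. The corruption has the closed form $q(\mathbf{x}_t \mid \mathbf{x}_0) = \mathcal{N}(\sqrt{\bar\alpha_t}\,\mathbf{x}_0, (1 - \bar\alpha_t)\mathbf{I})$. A minimal sketch of this forward step (the linear beta schedule and toy batch are illustrative assumptions, not taken from the paper):

```python
import numpy as np

# Linear beta schedule over T steps (a common choice; values are illustrative).
T = 1000
betas = np.linspace(1e-4, 0.02, T)
alphas = 1.0 - betas
alpha_bars = np.cumprod(alphas)      # abar_t = prod_{s<=t} (1 - beta_s)

def q_sample(x0, t, rng):
    # Closed-form forward process: x_t = sqrt(abar_t) x_0 + sqrt(1 - abar_t) eps,
    # i.e. q(x_t | x_0) = N(sqrt(abar_t) x_0, (1 - abar_t) I).
    eps = rng.standard_normal(x0.shape)
    return np.sqrt(alpha_bars[t]) * x0 + np.sqrt(1.0 - alpha_bars[t]) * eps, eps

rng = np.random.default_rng(0)
x0 = rng.standard_normal((4, 8))         # a toy "data" batch
x_noisy, eps = q_sample(x0, t=T - 1, rng=rng)
# At t = T-1, alpha_bar is near zero, so x_t is close to pure Gaussian noise.
```

Training then fits a network to predict `eps` from `x_noisy` and `t`; sampling runs the learned reversal from pure noise back to data.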

Tasks: Denoising, Video Generation (+1)

Implicit meta-learning may lead language models to trust more reliable sources

1 code implementation · 23 Oct 2023 · Dmitrii Krasheninnikov, Egor Krasheninnikov, Bruno Mlodozeniec, Tegan Maharaj, David Krueger

Fine-tuning on this dataset leads to implicit meta-learning (IML): in further fine-tuning, the model updates to make more use of text that is tagged as useful.

Tasks: In-Context Learning, Meta-Learning

Timewarp: Transferable Acceleration of Molecular Dynamics by Learning Time-Coarsened Dynamics

1 code implementation · NeurIPS 2023 · Leon Klein, Andrew Y. K. Foong, Tor Erlend Fjelde, Bruno Mlodozeniec, Marc Brockschmidt, Sebastian Nowozin, Frank Noé, Ryota Tomioka

Molecular dynamics (MD) simulation is a widely used technique to simulate molecular systems, most commonly at the all-atom resolution where equations of motion are integrated with timesteps on the order of femtoseconds ($1\textrm{fs}=10^{-15}\textrm{s}$).

Ensemble Distribution Distillation

1 code implementation · ICLR 2020 · Andrey Malinin, Bruno Mlodozeniec, Mark Gales

The properties of EnD$^2$ are investigated on an artificial dataset and on the CIFAR-10, CIFAR-100 and TinyImageNet datasets, where it is shown that EnD$^2$ can approach the classification performance of an ensemble and outperforms both standard DNNs and Ensemble Distillation on the tasks of misclassification and out-of-distribution input detection.
