Search Results for author: Stratis Markou

Found 9 papers, 4 papers with code

Denoising Diffusion Probabilistic Models in Six Simple Steps

no code implementations • 6 Feb 2024 Richard E. Turner, Cristiana-Diana Diaconu, Stratis Markou, Aliaksandra Shysheya, Andrew Y. K. Foong, Bruno Mlodozeniec

Denoising Diffusion Probabilistic Models (DDPMs) are a very popular class of deep generative model that have been successfully applied to a diverse range of problems including image and video generation, protein and material synthesis, weather forecasting, and neural surrogates of partial differential equations.

Denoising, Video Generation, +1
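The abstract above centres on the DDPM forward (noising) process. A minimal sketch of that process, with an illustrative linear variance schedule (names and values are assumptions for illustration, not taken from the paper):

```python
import math
import random

# Hedged sketch of the DDPM forward (noising) process q(x_t | x_0):
# Gaussian noise is added to the data according to a variance schedule
# beta_1..beta_T. Schedule values here are illustrative.

T = 1000
betas = [1e-4 + (0.02 - 1e-4) * t / (T - 1) for t in range(T)]  # linear schedule

# alpha_bar_t = prod_{s<=t} (1 - beta_s): the fraction of signal kept at step t
alpha_bars = []
prod = 1.0
for beta in betas:
    prod *= 1.0 - beta
    alpha_bars.append(prod)

def q_sample(x0, t, rng):
    """Draw x_t ~ N(sqrt(alpha_bar_t) * x0, (1 - alpha_bar_t) * I)."""
    scale = math.sqrt(alpha_bars[t])
    noise_std = math.sqrt(1.0 - alpha_bars[t])
    return [scale * x + noise_std * rng.gauss(0.0, 1.0) for x in x0]

rng = random.Random(0)
x_t = q_sample([1.0] * 8, t=500, rng=rng)  # a heavily noised version of x0
```

By the final step `alpha_bars[-1]` is close to zero, so `x_T` is nearly pure Gaussian noise; the reverse (denoising) model learns to invert this chain step by step.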

Autoregressive Conditional Neural Processes

1 code implementation • 25 Mar 2023 Wessel P. Bruinsma, Stratis Markou, James Requeima, Andrew Y. K. Foong, Tom R. Andersson, Anna Vaughan, Anthony Buonomo, J. Scott Hosking, Richard E. Turner

Our work provides an example of how ideas from neural distribution estimation can benefit neural processes, and motivates research into the AR deployment of other neural process models.

Meta-Learning
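The AR deployment mentioned in the abstract can be sketched as follows: target points are predicted one at a time, and each sample is fed back into the context before predicting the next point. The `predict` function below is a toy stand-in (an assumption for illustration) for a trained neural process's predictive distribution:

```python
import random

# Hedged sketch of autoregressive (AR) deployment of a conditional
# predictive model. Not the paper's code: `predict` is a toy stand-in
# for a trained (C)NP returning a Gaussian predictive at input x.

def predict(context, x):
    # Toy conditional: mean is the average of context y-values, fixed noise.
    ys = [y for _, y in context] or [0.0]
    return sum(ys) / len(ys), 0.1  # (mean, std)

def ar_sample(context, x_targets, rng):
    """Sample target outputs one at a time, conditioning on earlier samples."""
    context = list(context)
    samples = []
    for x in x_targets:
        mu, sigma = predict(context, x)
        y = rng.gauss(mu, sigma)
        context.append((x, y))  # later points are conditioned on this sample
        samples.append(y)
    return samples

rng = random.Random(0)
ys = ar_sample([(0.0, 1.0)], [0.5, 1.0, 1.5], rng)
```

The feedback loop is what induces dependencies between target outputs even when each individual predictive is a factorised Gaussian.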

Practical Conditional Neural Processes Via Tractable Dependent Predictions

no code implementations • 16 Mar 2022 Stratis Markou, James Requeima, Wessel P. Bruinsma, Anna Vaughan, Richard E. Turner

Existing approaches which model output dependencies, such as Neural Processes (NPs; Garnelo et al., 2018b) or the FullConvGNP (Bruinsma et al., 2021), are either complicated to train or prohibitively expensive.

Decision Making, Meta-Learning

Practical Conditional Neural Processes Via Tractable Dependent Predictions

no code implementations • ICLR 2022 Stratis Markou, James Requeima, Wessel Bruinsma, Anna Vaughan, Richard E. Turner

Existing approaches which model output dependencies, such as Neural Processes (NPs; Garnelo et al., 2018) or the FullConvGNP (Bruinsma et al., 2021), are either complicated to train or prohibitively expensive.

Decision Making, Meta-Learning

Efficient Gaussian Neural Processes for Regression

no code implementations • 22 Aug 2021 Stratis Markou, James Requeima, Wessel Bruinsma, Richard Turner

Conditional Neural Processes (CNP; Garnelo et al., 2018) are an attractive family of meta-learning models which produce well-calibrated predictions, enable fast inference at test time, and are trainable via a simple maximum likelihood procedure.

Decision Making, Meta-Learning, +1
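The "simple maximum likelihood procedure" mentioned in the abstract follows from the fact that a CNP's predictive distribution factorises over target points. A minimal sketch of the resulting loss (illustrative only; the function names are not from the paper):

```python
import math

# Hedged sketch of CNP training by maximum likelihood. Because the CNP
# predictive factorises across target points, the loss is just a sum of
# independent Gaussian negative log likelihoods.

def gaussian_nll(y, mu, sigma):
    """Negative log density of y under N(mu, sigma^2)."""
    return 0.5 * math.log(2 * math.pi * sigma ** 2) + (y - mu) ** 2 / (2 * sigma ** 2)

def cnp_loss(predictions, targets):
    # predictions: list of (mu, sigma) per target point; targets: observed y values
    return sum(gaussian_nll(y, mu, s) for (mu, s), y in zip(predictions, targets))

loss = cnp_loss([(0.0, 1.0), (1.0, 0.5)], [0.1, 0.9])
```

In practice the `(mu, sigma)` pairs come from a network conditioned on the context set, and this loss is minimised by gradient descent; the factorised form is what keeps training simple but also why plain CNPs cannot model dependencies between target outputs.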
