Search Results for author: Steve Kroon

Found 10 papers, 4 papers with code

Marginal Likelihoods from Monte Carlo Markov Chains

2 code implementations • 11 Apr 2017 • Alan Heavens, Yabebal Fantaye, Arrykrishna Mootoovaloo, Hans Eggers, Zafiirah Hosenie, Steve Kroon, Elena Sellentin

In this paper, we present a method for computing the marginal likelihood, also known as the model likelihood or Bayesian evidence, from Markov Chain Monte Carlo (MCMC), or other sampled posterior distributions.

Computation • Cosmology and Nongalactic Astrophysics
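
As a rough illustration of the idea behind this paper, the sketch below estimates the evidence from existing posterior samples via the identity log Z = log L(θ) + log π(θ) − log p(θ|d), approximating the posterior density at each sample with a k-nearest-neighbour estimate. This is a minimal sketch only; the published code implementations differ in detail and handle practicalities this omits.

```python
# Minimal sketch (not the authors' full implementation) of estimating the
# Bayesian evidence Z from posterior samples, using
#   log Z = log L(theta) + log pi(theta) - log p(theta | d),
# with the posterior density approximated by a k-nearest-neighbour estimate
# over the chain itself. Assumes thinned chains without duplicated points.
import numpy as np
from scipy.special import gammaln
from scipy.spatial import cKDTree

def log_evidence_knn(samples, log_like, log_prior, k=5):
    """samples: (n, d) MCMC draws; log_like, log_prior: (n,) arrays."""
    n, d = samples.shape
    tree = cKDTree(samples)
    # distance to the k-th nearest neighbour (excluding the point itself)
    r_k, _ = tree.query(samples, k=k + 1)
    r_k = r_k[:, -1]
    # volume of a d-ball of radius r_k
    log_vol = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1) + d * np.log(r_k)
    # k-NN estimate of the posterior density at each sample
    log_post = np.log(k) - np.log(n - 1) - log_vol
    # per-sample evidence estimates; take a robust central value
    log_z = log_like + log_prior - log_post
    return np.median(log_z)
```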

Critical initialisation for deep signal propagation in noisy rectifier neural networks

1 code implementation • NeurIPS 2018 • Arnu Pretorius, Elan van Biljon, Steve Kroon, Herman Kamper

Simulations and experiments on real-world data confirm that our proposed initialisation is able to stably propagate signals in deep networks, while using an initialisation disregarding noise fails to do so.
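
A minimal sketch of what such an initialisation can look like, assuming a criticality condition of the form σ_w² = 2 / (μ₂ · fan_in) for multiplicative noise with second moment μ₂ (so μ₂ = 1/p for dropout with keep probability p); consult the paper for the exact conditions and noise models covered.

```python
# Hedged sketch of a "noisy-ReLU" critical initialisation. Assumes the
# criticality condition sigma_w^2 = 2 / (mu2 * fan_in), where mu2 = E[eps^2]
# is the second moment of the multiplicative noise; this reduces to standard
# He initialisation when mu2 = 1 (no noise).
import numpy as np

def critical_init(fan_in, fan_out, keep_prob=0.8, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    mu2 = 1.0 / keep_prob                      # E[eps^2] for dropout noise
    sigma_w = np.sqrt(2.0 / (mu2 * fan_in))    # critical weight std under noise
    W = rng.normal(0.0, sigma_w, size=(fan_in, fan_out))
    b = np.zeros(fan_out)
    return W, b
```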

On the expected behaviour of noise regularised deep neural networks as Gaussian processes

no code implementations • 12 Oct 2019 • Arnu Pretorius, Herman Kamper, Steve Kroon

Recent work has established the equivalence between deep neural networks and Gaussian processes (GPs), resulting in so-called neural network Gaussian processes (NNGPs).

Gaussian Processes
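
For background on the NNGP correspondence mentioned above, the sketch below gives the standard ReLU NNGP kernel recursion; the noise-regularised variant analysed in the paper is not reproduced here.

```python
# Minimal sketch of the standard ReLU NNGP kernel recursion (arc-cosine form);
# the paper studies how noise regularisation such as dropout modifies this
# infinite-width limit, which is not reproduced here.
import numpy as np

def nngp_kernel(X, depth, sigma_w2=2.0, sigma_b2=0.0):
    """X: (n, d) inputs. Returns the depth-layer NNGP kernel matrix."""
    K = sigma_b2 + sigma_w2 * (X @ X.T) / X.shape[1]   # first-layer covariance
    for _ in range(depth - 1):
        diag = np.sqrt(np.clip(np.diag(K), 1e-12, None))
        cos_t = np.clip(K / np.outer(diag, diag), -1.0, 1.0)
        theta = np.arccos(cos_t)
        # expectation of ReLU(u) * ReLU(v) under a bivariate Gaussian
        K = sigma_b2 + (sigma_w2 / (2 * np.pi)) * np.outer(diag, diag) * (
            np.sin(theta) + (np.pi - theta) * cos_t)
    return K
```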

If dropout limits trainable depth, does critical initialisation still matter? A large-scale statistical analysis on ReLU networks

no code implementations • 13 Oct 2019 • Arnu Pretorius, Elan van Biljon, Benjamin van Niekerk, Ryan Eloff, Matthew Reynard, Steve James, Benjamin Rosman, Herman Kamper, Steve Kroon

Our results therefore suggest that, in the shallow-to-moderate depth setting, critical initialisation provides zero performance gains compared to off-critical initialisations, and that searching for off-critical initialisations that might improve training speed or generalisation is likely to be a fruitless endeavour.

Stabilising priors for robust Bayesian deep learning

no code implementations • 23 Oct 2019 • Felix McGregor, Arnu Pretorius, Johan du Preez, Steve Kroon

Bayesian neural networks (BNNs) have developed into useful tools for probabilistic modelling due to recent advances in variational inference that enable large-scale BNNs.

Variational Inference

Stochastic Gradient Annealed Importance Sampling for Efficient Online Marginal Likelihood Estimation

1 code implementation • 17 Nov 2019 • Scott A. Cameron, Hans C. Eggers, Steve Kroon

An important benefit of our approach is that the marginal likelihood is calculated in an online fashion as data becomes available, allowing the estimates to be used for applications such as online weighted model combination.
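
As background for the approach, the sketch below shows plain annealed importance sampling over a geometric path between prior and posterior; the paper's stochastic-gradient, online variant (annealing as data arrive) is not reproduced here.

```python
# Minimal sketch of plain annealed importance sampling (AIS) for the marginal
# likelihood, as background only; the paper's stochastic-gradient, online
# variant is not reproduced here.
import numpy as np
from scipy.special import logsumexp

def ais_log_evidence(log_like, log_prior, sample_prior, n_chains=100,
                     n_steps=200, step_size=0.1, rng=None):
    """log_like, log_prior: callables mapping (n, d) -> (n,);
    sample_prior: callable (n, rng) -> (n, d) draws from the prior."""
    rng = np.random.default_rng() if rng is None else rng
    betas = np.linspace(0.0, 1.0, n_steps + 1)
    theta = sample_prior(n_chains, rng)
    log_w = np.zeros(n_chains)
    for b_prev, b in zip(betas[:-1], betas[1:]):
        log_w += (b - b_prev) * log_like(theta)   # incremental AIS weights
        # one random-walk Metropolis step targeting prior * likelihood^b
        prop = theta + step_size * rng.standard_normal(theta.shape)
        log_acc = (log_prior(prop) + b * log_like(prop)
                   - log_prior(theta) - b * log_like(theta))
        accept = np.log(rng.random(n_chains)) < log_acc
        theta[accept] = prop[accept]
    return logsumexp(log_w) - np.log(n_chains)    # log of the mean weight
```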

Performance-Agnostic Fusion of Probabilistic Classifier Outputs

no code implementations • 1 Sep 2020 • Jordan F. Masakuna, Simukai W. Utete, Steve Kroon

The intuition behind this approach is that classifiers trained for the same task should typically exhibit regularities in their outputs on a new task; the predictions of classifiers which differ significantly from those of the others are thus given less credence by our approach.
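
The snippet does not spell out the fusion rule, so the sketch below is a purely illustrative rendering of that intuition (down-weighting classifiers whose outputs sit far from the consensus), not necessarily the authors' method.

```python
# Purely illustrative sketch of the stated intuition, NOT necessarily the
# authors' fusion rule: classifiers whose predicted distributions lie far from
# the group consensus receive lower weight in the fused output.
import numpy as np

def fuse(probs, temperature=1.0):
    """probs: (n_classifiers, n_classes) probability vectors for one input."""
    consensus = probs.mean(axis=0)
    # KL divergence of each classifier's output from the consensus
    kl = np.sum(probs * (np.log(probs + 1e-12) - np.log(consensus + 1e-12)),
                axis=1)
    weights = np.exp(-kl / temperature)
    weights /= weights.sum()
    fused = weights @ probs
    return fused / fused.sum()
```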

SIReN-VAE: Leveraging Flows and Amortized Inference for Bayesian Networks

no code implementations • 23 Apr 2022 • Jacobie Mouton, Steve Kroon

Subsequent work has explored incorporating more complex distributions and dependency structures: including normalizing flows in the encoder network allows latent variables to entangle non-linearly, creating a richer class of distributions for the approximate posterior, and stacking layers of latent variables allows more complex priors to be specified for the generative model.
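
To make the "flows in the encoder" idea concrete, the sketch below shows a single planar flow layer (Rezende & Mohamed, 2015) warping Gaussian encoder samples into a richer approximate posterior; it is a generic illustration, not the SIReN-VAE architecture itself.

```python
# Generic sketch of a planar normalizing flow layer applied to VAE encoder
# samples; this is not the SIReN-VAE architecture, only the basic mechanism
# the sentence refers to.
import numpy as np

def planar_flow(z, u, w, b):
    """z: (n, d) encoder samples; u, w: (d,); b: scalar.
    Returns transformed samples and log|det Jacobian| per sample."""
    # Note: invertibility additionally requires u @ w >= -1 (usually enforced
    # by reparameterising u), which this sketch does not do.
    a = z @ w + b                                   # (n,)
    z_new = z + np.outer(np.tanh(a), u)             # z + u * tanh(w.z + b)
    psi = 1.0 - np.tanh(a) ** 2                     # tanh'(a)
    log_det = np.log(np.abs(1.0 + psi * (u @ w)) + 1e-12)
    return z_new, log_det
```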

Graphical Residual Flows

no code implementations • 23 Apr 2022 • Jacobie Mouton, Steve Kroon

Furthermore, our model provides performance competitive with other graphical flows for both density estimation and inference tasks.

Density Estimation
