Search Results for author: David J. Nott

Found 10 papers, 0 papers with code

Dropout Regularization in Extended Generalized Linear Models based on Double Exponential Families

no code implementations • 11 May 2023 • Benedikt Lütke Schwienhorst, Lucas Kock, David J. Nott, Nadja Klein

A theoretical analysis shows that dropout regularization prefers rare but important features in both the mean and dispersion, generalizing an earlier result for conventional generalized linear models.
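The flavour of this result can be illustrated with the well-known fact that dropout on the design matrix acts, in expectation, like a feature-wise ridge penalty. The sketch below uses ordinary least squares rather than the paper's double exponential family setting, and the data, dropout rate, and step size are made-up values for illustration:

```python
import numpy as np

rng = np.random.default_rng(5)
n, d, p = 500, 3, 0.5                        # p is the dropout rate (illustrative)
X = rng.normal(size=(n, d))
beta_true = np.array([2.0, 0.0, -1.0])
y = X @ beta_true + rng.normal(0.0, 0.1, n)

# Inverted dropout on the features at each gradient step: in expectation this
# adds a ridge-like penalty (p / (1 - p)) * sum_i x_ij^2 on each coefficient,
# shrinking the fit relative to plain least squares.
beta = np.zeros(d)
lr = 0.01
for _ in range(3000):
    mask = rng.binomial(1, 1 - p, size=(n, d)) / (1 - p)
    Xd = X * mask
    beta -= lr * Xd.T @ (Xd @ beta - y) / n
```

With standardized features and p = 0.5 the implied ridge weight roughly equals the sample size, so each coefficient ends up shrunk to about half its least-squares value.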

Misspecification-robust Sequential Neural Likelihood for Simulation-based Inference

no code implementations • 31 Jan 2023 • Ryan P. Kelly, David J. Nott, David T. Frazier, David J. Warne, Chris Drovandi

Simulation-based inference techniques are indispensable for parameter estimation of mechanistic and simulable models with intractable likelihoods.

Uncertainty Quantification

On a Variational Approximation based Empirical Likelihood ABC Method

no code implementations • 12 Nov 2020 • Sanjay Chaudhuri, Subhroshekhar Ghosh, David J. Nott, Kim Cuc Pham

The expected log-likelihood is then estimated by an empirical likelihood, where the only inputs required are a choice of summary statistic, its observed value, and the ability to simulate the chosen summary statistic for any parameter value under the model.
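Those three ingredients are the same ones a plain rejection ABC sampler needs. The sketch below uses simple rejection rather than the paper's empirical likelihood weighting, with a made-up normal-mean model, observed summary, and tolerance:

```python
import numpy as np

rng = np.random.default_rng(4)

summary = np.mean                        # 1) choice of summary statistic
s_obs = 2.0                              # 2) its observed value (illustrative)

def simulate_summary(theta, n=100):      # 3) simulate the summary under the model
    return summary(rng.normal(theta, 1.0, size=n))

# Rejection ABC: keep prior draws whose simulated summary lands near s_obs.
prior_draws = rng.uniform(-5.0, 5.0, size=20000)
kept = [th for th in prior_draws if abs(simulate_summary(th) - s_obs) < 0.1]
post_mean = float(np.mean(kept))
```

The accepted draws concentrate around the parameter value whose simulated summaries match the observed one; the empirical likelihood approach replaces the hard accept/reject step with a smooth likelihood estimate built from the same simulations.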

Bayesian Inference

Deep Distributional Time Series Models and the Probabilistic Forecasting of Intraday Electricity Prices

no code implementations • 5 Oct 2020 • Nadja Klein, Michael Stanley Smith, David J. Nott

Using data from the Australian National Electricity Market, we show that our deep time series models provide accurate short term probabilistic price forecasts, with the copula model dominating.

Time Series, Time Series Analysis

Marginally-calibrated deep distributional regression

no code implementations • 26 Aug 2019 • Nadja Klein, David J. Nott, Michael Stanley Smith

The end result is a scalable distributional DNN regression method with marginally calibrated predictions, and our work complements existing methods for probability calibration.

regression, Time Series Analysis +1

An easy-to-use empirical likelihood ABC method

no code implementations • 3 Oct 2018 • Sanjay Chaudhuri, Subhro Ghosh, David J. Nott, Kim Cuc Pham

Many scientifically well-motivated statistical models in natural, engineering and environmental sciences are specified through a generative process, but in some cases it may not be possible to write down a likelihood for these models analytically.

Bayesian Inference

Gaussian variational approximation for high-dimensional state space models

no code implementations • 24 Jan 2018 • Matias Quiroz, David J. Nott, Robert Kohn

The variational parameters to be optimized are the mean vector and the covariance matrix of the approximation.
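Optimizing exactly those two parameters can be made concrete with a toy Gaussian target, for which the ELBO gradients are available in closed form. The target, step size, and iteration count below are illustrative assumptions, not the paper's state space setting (where the ELBO gradient would be estimated by Monte Carlo):

```python
import numpy as np

# Toy target posterior N(m, S); stands in for an intractable posterior.
m = np.array([1.0, -2.0])
S = np.array([[1.0, 0.3], [0.3, 0.5]])
S_inv = np.linalg.inv(S)

# Variational parameters: the mean vector mu and the covariance matrix Sigma.
mu = np.zeros(2)
Sigma = np.eye(2)

# For a Gaussian target the ELBO is
#   -0.5 * [(mu - m)' S^{-1} (mu - m) + tr(S^{-1} Sigma)] + 0.5 * log|Sigma| + const,
# so both gradients are closed form and plain gradient ascent recovers m and S.
lr = 0.1
for _ in range(500):
    mu = mu + lr * (-S_inv @ (mu - m))                         # dELBO/dmu
    Sigma = Sigma + lr * 0.5 * (np.linalg.inv(Sigma) - S_inv)  # dELBO/dSigma
```

Here the optimum reproduces the target exactly because the target is itself Gaussian; for a non-Gaussian posterior the same updates find the closest Gaussian in KL divergence.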

Gaussian variational approximation with sparse precision matrices

no code implementations • 18 May 2016 • Linda S. L. Tan, David J. Nott

We consider the problem of learning a Gaussian variational approximation to the posterior distribution for a high-dimensional parameter, where we impose sparsity in the precision matrix to reflect appropriate conditional independence structure in the model.
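The zero-pattern idea can be sketched by assuming first-order Markov dependence, which makes the precision tridiagonal. The dimension and entries below are made up for illustration, not taken from the paper:

```python
import numpy as np

# Tridiagonal precision: theta_i and theta_j are conditionally independent
# given the rest whenever Omega[i, j] == 0, even though the covariance is dense.
d = 5
Omega = np.diag(np.full(d, 2.0))
for i in range(d - 1):
    Omega[i, i + 1] = Omega[i + 1, i] = -0.8   # only neighbours interact

Sigma = np.linalg.inv(Omega)                   # implied covariance: fully dense

# Sampling from N(0, Omega^{-1}) without ever forming Sigma: one triangular
# solve with the Cholesky factor of the (sparse) precision.
rng = np.random.default_rng(1)
L = np.linalg.cholesky(Omega)                  # Omega = L @ L.T
theta = np.linalg.solve(L.T, rng.standard_normal(d))
```

Working with the precision rather than the covariance is what makes the high-dimensional case tractable: the sparse factor keeps both storage and the number of free variational parameters small.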

Time Series, Time Series Analysis

Efficient variational inference for generalized linear mixed models with large datasets

no code implementations • 30 Jul 2013 • David J. Nott, Minh-Ngoc Tran, Anthony Y. C. Kuk, Robert Kohn

We propose a divide and recombine strategy for the analysis of large datasets, which partitions a large dataset into smaller pieces and then combines the variational distributions that have been learnt in parallel on each separate piece using the hybrid Variational Bayes algorithm.
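The recombination step can be sketched for Gaussian variational distributions using the generic precision-weighted product rule (a common divide-and-recombine heuristic; this is not the paper's hybrid Variational Bayes combination, and the conjugate check below is illustrative):

```python
import numpy as np

def recombine(mus, Sigmas, prior_prec, K):
    """Combine K shard posteriors N(mu_k, Sigma_k) fitted under a shared prior.

    The product of the shard posteriors counts the prior K times, so K - 1
    copies of the prior precision are subtracted.
    """
    precs = [np.linalg.inv(Sk) for Sk in Sigmas]
    P = sum(precs) - (K - 1) * prior_prec
    m = np.linalg.solve(P, sum(Pk @ mk for Pk, mk in zip(precs, mus)))
    return m, np.linalg.inv(P)

# Check on a conjugate normal-mean model, where each shard posterior is an
# exact Gaussian and recombining should recover the full-data posterior.
rng = np.random.default_rng(2)
tau2, sig2 = 10.0, 1.0                           # prior and likelihood variances
y = rng.normal(1.5, np.sqrt(sig2), size=400)
P0 = np.array([[1.0 / tau2]])

mus, Sigmas = [], []
for s in np.split(y, 4):                         # four data shards
    Pk = P0 + np.array([[len(s) / sig2]])
    mus.append(np.linalg.solve(Pk, np.array([s.sum() / sig2])))
    Sigmas.append(np.linalg.inv(Pk))

m, S = recombine(mus, Sigmas, P0, K=4)
P_full = P0 + np.array([[len(y) / sig2]])
m_full = np.linalg.solve(P_full, np.array([y.sum() / sig2]))
```

In the conjugate case the combination is exact; with variational shard posteriors it is an approximation whose quality depends on how Gaussian each shard posterior is.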

Methodology

Variational inference for sparse spectrum Gaussian process regression

no code implementations • 9 Jun 2013 • Linda S. L. Tan, Victor M. H. Ong, David J. Nott, Ajay Jasra

We develop a fast variational approximation scheme for Gaussian process (GP) regression, where the spectrum of the covariance function is subjected to a sparse approximation.
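The spectral idea can be sketched with plain random Fourier features for a squared exponential kernel. The paper treats the spectral points variationally and optimizes them, which is omitted here; the data, lengthscale, and feature count are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
n, M, ell, sf2, noise = 200, 50, 0.2, 1.0, 0.01   # M spectral points (illustrative)

x = np.linspace(0.0, 1.0, n)
y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.1, n)

# The SE kernel's spectral density is Gaussian, so frequencies sampled from it
# give a finite trigonometric basis whose kernel approximates the SE kernel.
w = rng.normal(0.0, 1.0 / ell, M)
Phi = np.sqrt(sf2 / M) * np.hstack(
    [np.cos(np.outer(x, w)), np.sin(np.outer(x, w))]
)

# GP regression collapses to Bayesian linear regression in the feature space,
# costing O(n M^2) instead of the O(n^3) of exact GP regression.
A = Phi.T @ Phi + noise * np.eye(2 * M)
f_mean = Phi @ np.linalg.solve(A, Phi.T @ y)
rmse = np.sqrt(np.mean((f_mean - np.sin(2 * np.pi * x)) ** 2))
```

The variational scheme of the paper goes further by placing an approximate posterior over the frequencies themselves rather than fixing them at sampled values.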

Computation
