Search Results for author: Robert Kohn

Found 17 papers, 7 papers with code

Contextual Directed Acyclic Graphs

1 code implementation • 24 Oct 2023 • Ryan Thompson, Edwin V. Bonilla, Robert Kohn

Estimating the structure of directed acyclic graphs (DAGs) from observational data remains a significant challenge in machine learning.

DeepVol: A Pre-Trained Universal Asset Volatility Model

1 code implementation • 5 Sep 2023 • Chen Liu, Minh-Ngoc Tran, Chao Wang, Richard Gerlach, Robert Kohn

This paper introduces DeepVol, a pre-trained deep learning volatility model that is more general than traditional econometric models.

Econometrics • Transfer Learning

Particle Mean Field Variational Bayes

1 code implementation • 24 Mar 2023 • Minh-Ngoc Tran, Paco Tseng, Robert Kohn

The Mean Field Variational Bayes (MFVB) method is one of the most computationally efficient techniques for Bayesian inference.

Bayesian Inference • Regression

Deep Learning Enhanced Realized GARCH

1 code implementation • 16 Feb 2023 • Chen Liu, Chao Wang, Minh-Ngoc Tran, Robert Kohn

We propose a new approach to volatility modeling by combining deep learning (LSTM) and realized volatility measures.

Bayesian Inference • Econometrics

The Contextual Lasso: Sparse Linear Models via Deep Neural Networks

1 code implementation • NeurIPS 2023 • Ryan Thompson, Amir Dezfouli, Robert Kohn

With this capability gap in mind, we study a not-uncommon situation where the input features dichotomize into two groups: explanatory features, which are candidates for inclusion as variables in an interpretable model, and contextual features, which select from the candidate variables and determine their effects.

Decision Making • Interpretable Machine Learning
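The two-group structure described in the abstract can be sketched in a few lines. This is a hypothetical toy illustration, not the paper's method: the paper fits the coefficient map with a deep neural network under a lasso-type penalty, whereas here the map is a fixed linear function followed by soft-thresholding, just to show how contextual features yield a sparse, sample-specific linear model.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, k = 200, 5, 2            # n samples, p explanatory features, k contextual
X = rng.normal(size=(n, p))    # explanatory features: candidate model variables
C = rng.normal(size=(n, k))    # contextual features: determine which variables act

# Hypothetical coefficient map: linear in the context, then soft-thresholded,
# so every sample gets its own sparse coefficient vector.
W = rng.normal(size=(k, p))
def contextual_beta(context, thresh=0.5):
    raw = context @ W
    return np.sign(raw) * np.maximum(np.abs(raw) - thresh, 0.0)

B = contextual_beta(C)             # (n, p) sparse, context-dependent coefficients
y_hat = np.sum(X * B, axis=1)      # an interpretable linear model per sample
```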

Hidden Group Time Profiles: Heterogeneous Drawdown Behaviours in Retirement

no code implementations • 3 Sep 2020 • Igor Balnozan, Denzil G. Fiebig, Anthony Asher, Robert Kohn, Scott A. Sisson

This article investigates retirement decumulation behaviours using the Grouped Fixed-Effects (GFE) estimator applied to Australian panel data on drawdowns from phased withdrawal retirement income products.

Variance reduction properties of the reparameterization trick

no code implementations • 27 Sep 2018 • Ming Xu, Matias Quiroz, Robert Kohn, Scott A. Sisson

From this, we show that the marginal variances of the reparameterization gradient estimator are smaller than those of the score function gradient estimator.

Variational Inference
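The variance comparison in the abstract is easy to reproduce on a toy problem. A minimal sketch, assuming a Gaussian variational family q = N(mu, sigma^2) and the objective E_q[z^2], whose exact gradient with respect to mu is 2*mu:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, S = 1.0, 1.0, 100_000

eps = rng.normal(size=S)
z = mu + sigma * eps               # reparameterized draws from q = N(mu, sigma^2)

# Score-function (REINFORCE) estimator of d/dmu E_q[z^2]:
score = z ** 2 * (z - mu) / sigma ** 2

# Reparameterization estimator: differentiate f(mu + sigma * eps) directly.
reparam = 2 * z

# Both are unbiased for the true gradient 2*mu = 2.0, but the
# reparameterization estimator's variance is far smaller.
print(score.mean(), score.var())
print(reparam.mean(), reparam.var())
```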

Subsampling MCMC - An introduction for the survey statistician

no code implementations • 23 Jul 2018 • Matias Quiroz, Mattias Villani, Robert Kohn, Minh-Ngoc Tran, Khue-Dung Dang

The rapid development of computing power and of efficient Markov Chain Monte Carlo (MCMC) simulation algorithms has revolutionized Bayesian statistics, making it a highly practical inference method in applied work.

Survey Sampling

Efficient data augmentation for multivariate probit models with panel data: An application to general practitioner decision-making about contraceptives

1 code implementation • 19 Jun 2018 • Vincent Chin, David Gunawan, Denzil G. Fiebig, Robert Kohn, Scott A. Sisson

This article considers the problem of estimating a multivariate probit model in a panel data setting with emphasis on sampling a high-dimensional correlation matrix and improving the overall efficiency of the data augmentation approach.

Computation • Applications • Methodology
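For flavor, here is the classic Albert–Chib data augmentation scheme for a *univariate* probit model; the paper's multivariate panel setting, with a high-dimensional correlation matrix to sample, is considerably harder. All names and settings below are illustrative.

```python
import numpy as np
from scipy.stats import truncnorm

rng = np.random.default_rng(0)
n, p = 500, 2
X = rng.normal(size=(n, p))
beta_true = np.array([1.0, -0.5])
y = (X @ beta_true + rng.normal(size=n) > 0).astype(int)   # probit DGP

XtX_inv = np.linalg.inv(X.T @ X)
beta, draws = np.zeros(p), []
for it in range(800):
    mu = X @ beta
    # Augmentation: latent utilities z_i ~ N(mu_i, 1), truncated to agree with y_i.
    lo = np.where(y == 1, -mu, -np.inf)
    hi = np.where(y == 1, np.inf, -mu)
    z = mu + truncnorm.rvs(lo, hi, random_state=rng)
    # Conjugate Gibbs update for beta given z (flat prior).
    beta = rng.multivariate_normal(XtX_inv @ X.T @ z, XtX_inv)
    if it >= 200:
        draws.append(beta)
post_mean = np.mean(draws, axis=0)   # posterior mean, close to beta_true
```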

Bayesian Deep Net GLM and GLMM

2 code implementations • 25 May 2018 • Minh-Ngoc Tran, Nghia Nguyen, David Nott, Robert Kohn

Efficient computational methods for high-dimensional Bayesian inference are developed using Gaussian variational approximation, with a parsimonious but flexible factor parametrization of the covariance matrix.

Computation
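One common way to read "parsimonious but flexible factor parametrization" is Sigma = B Bᵀ + diag(d²) with a tall-skinny loading matrix B; the paper's exact parametrization may differ. A sketch of why this scales:

```python
import numpy as np

rng = np.random.default_rng(0)
p, k = 1000, 5                      # p parameters, k << p factors

B = 0.1 * rng.normal(size=(p, k))   # factor loadings
d = np.full(p, 0.2)                 # idiosyncratic standard deviations
m = np.zeros(p)                     # variational mean

# Sigma = B B^T + diag(d^2) would cost O(p^2) to store; the factor form
# needs only p * (k + 1) numbers, and sampling never builds Sigma:
eps1, eps2 = rng.normal(size=k), rng.normal(size=p)
theta = m + B @ eps1 + d * eps2     # one draw from the Gaussian approximation
```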

Subsampling Sequential Monte Carlo for Static Bayesian Models

no code implementations • 8 May 2018 • David Gunawan, Khue-Dung Dang, Matias Quiroz, Robert Kohn, Minh-Ngoc Tran

SMC sequentially updates a cloud of particles through a sequence of distributions, beginning with a distribution that is easy to sample from, such as the prior, and ending with the posterior distribution.

Bayesian Inference
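The sequence-of-distributions idea can be sketched with likelihood tempering on a toy conjugate Gaussian model (an illustrative assumption; the paper's subsampling machinery is not shown): reweight by an increment of the likelihood, resample, and move with one Metropolis–Hastings step.

```python
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(2.0, 1.0, size=50)               # data from N(theta = 2, 1)

def loglik(theta):                              # i.i.d. N(theta, 1) log-likelihood
    return -0.5 * ((y[None, :] - theta[:, None]) ** 2).sum(axis=1)

def log_target(theta, t):                       # prior N(0, 100) x likelihood^t
    return -0.5 * theta ** 2 / 100.0 + t * loglik(theta)

N = 2000
theta = rng.normal(0.0, 10.0, size=N)           # initial cloud: draws from the prior
temps = np.linspace(0.0, 1.0, 21)               # tempering schedule 0 -> 1
for t0, t1 in zip(temps[:-1], temps[1:]):
    logw = (t1 - t0) * loglik(theta)            # incremental importance weights
    w = np.exp(logw - logw.max()); w /= w.sum()
    theta = theta[rng.choice(N, size=N, p=w)]   # resample
    prop = theta + 0.3 * rng.normal(size=N)     # one random-walk MH move step
    acc = np.log(rng.uniform(size=N)) < log_target(prop, t1) - log_target(theta, t1)
    theta = np.where(acc, prop, theta)
# theta now approximates the posterior, concentrated near the sample mean of y
```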

Gaussian variational approximation for high-dimensional state space models

no code implementations • 24 Jan 2018 • Matias Quiroz, David J. Nott, Robert Kohn

The variational parameters to be optimized are the mean vector and the covariance matrix of the approximation.


Hamiltonian Monte Carlo with Energy Conserving Subsampling

no code implementations • 2 Aug 2017 • Khue-Dung Dang, Matias Quiroz, Robert Kohn, Minh-Ngoc Tran, Mattias Villani

The key insight in our article is that efficient subsampling HMC for the parameters is possible if both the dynamics and the acceptance probability are computed from the same data subsample in each complete HMC iteration.

The block-Poisson estimator for optimally tuned exact subsampling MCMC

no code implementations • 27 Mar 2016 • Matias Quiroz, Minh-Ngoc Tran, Mattias Villani, Robert Kohn, Khue-Dung Dang

A pseudo-marginal MCMC method is proposed that estimates the likelihood by data subsampling using a block-Poisson estimator.

Scalable MCMC for Large Data Problems using Data Subsampling and the Difference Estimator

no code implementations • 10 Jul 2015 • Matias Quiroz, Mattias Villani, Robert Kohn

We propose a generic Markov Chain Monte Carlo (MCMC) algorithm to speed up computations for datasets with many observations.

Survey Sampling

Speeding Up MCMC by Efficient Data Subsampling

no code implementations • 16 Apr 2014 • Matias Quiroz, Robert Kohn, Mattias Villani, Minh-Ngoc Tran

We propose Subsampling MCMC, a Markov Chain Monte Carlo (MCMC) framework where the likelihood function for $n$ observations is estimated from a random subset of $m$ observations.
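The basic estimator behind this line is simple: scale the log-likelihood sum over a random subsample of $m$ observations by $n/m$. A minimal sketch on a toy Gaussian likelihood (the papers add control variates and a pseudo-marginal correction, which are not shown here):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 100_000, 1_000
data = rng.normal(1.0, 1.0, size=n)

def ell(theta, idx):                        # per-observation log-likelihoods
    return -0.5 * (data[idx] - theta) ** 2

theta = 1.0
full = ell(theta, np.arange(n)).sum()       # exact sum: one pass over all n points

u = rng.choice(n, size=m, replace=True)     # random subsample of m observations
estimate = (n / m) * ell(theta, u).sum()    # unbiased estimator of the full sum
```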

Efficient variational inference for generalized linear mixed models with large datasets

no code implementations • 30 Jul 2013 • David J. Nott, Minh-Ngoc Tran, Anthony Y. C. Kuk, Robert Kohn

We propose a divide-and-recombine strategy for the analysis of large datasets: partition the dataset into smaller pieces, learn a variational distribution in parallel on each piece, and then combine those distributions using the hybrid Variational Bayes algorithm.

Methodology
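For Gaussian variational approximations, the "recombine" step has a closed form: multiplying the shard posteriors means precisions add and the combined mean is a precision-weighted average. A sketch under that assumption (the paper's hybrid Variational Bayes recombination may differ):

```python
import numpy as np

rng = np.random.default_rng(0)
K = 4                                    # number of data shards
# Hypothetical per-shard Gaussian variational posteriors (mean, covariance),
# each assumed fitted with a 1/K share of the prior so their product
# approximates the full posterior.
means = [rng.normal(size=3) for _ in range(K)]
covs = [np.diag(rng.uniform(0.5, 1.5, size=3)) for _ in range(K)]

# Recombine by multiplying Gaussian densities: precisions add,
# means are precision-weighted.
precisions = [np.linalg.inv(S) for S in covs]
P = sum(precisions)
S_comb = np.linalg.inv(P)
m_comb = S_comb @ sum(Pj @ mj for Pj, mj in zip(precisions, means))
```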
