1 code implementation • 24 May 2024 • Ryan Thompson, Edwin V. Bonilla, Robert Kohn
Directed acyclic graph (DAG) learning is a rapidly expanding field of research.
1 code implementation • 24 Oct 2023 • Ryan Thompson, Edwin V. Bonilla, Robert Kohn
Estimating the structure of directed acyclic graphs (DAGs) from observational data remains a significant challenge in machine learning.
1 code implementation • 5 Sep 2023 • Chen Liu, Minh-Ngoc Tran, Chao Wang, Richard Gerlach, Robert Kohn
For years, researchers have investigated applications of deep learning to forecasting financial time series.
1 code implementation • 24 Mar 2023 • Minh-Ngoc Tran, Paco Tseng, Robert Kohn
The Mean Field Variational Bayes (MFVB) method is one of the most computationally efficient techniques for Bayesian inference.
1 code implementation • 16 Feb 2023 • Chen Liu, Chao Wang, Minh-Ngoc Tran, Robert Kohn
We propose a new approach to volatility modeling by combining deep learning (LSTM) and realized volatility measures.
no code implementations • NeurIPS 2023 • Ryan Thompson, Amir Dezfouli, Robert Kohn
With this capability gap in mind, we study a fairly common situation in which the input features divide into two groups: explanatory features, which are candidates for inclusion as variables in an interpretable model, and contextual features, which select from the candidate variables and determine their effects.
no code implementations • 3 Sep 2020 • Igor Balnozan, Denzil G. Fiebig, Anthony Asher, Robert Kohn, Scott A. Sisson
This article investigates retirement decumulation behaviours using the Grouped Fixed-Effects (GFE) estimator applied to Australian panel data on drawdowns from phased withdrawal retirement income products.
no code implementations • 27 Sep 2018 • Ming Xu, Matias Quiroz, Robert Kohn, Scott A. Sisson
From this, we show that the marginal variances of the reparameterization gradient estimator are smaller than those of the score function gradient estimator.
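The variance comparison above can be illustrated on a toy problem. The sketch below (not the paper's derivation) estimates the gradient of E_{z~N(mu, sigma^2)}[z^2] with respect to mu, whose exact value is 2*mu, using both estimators; the variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n = 1.0, 1.0, 100_000

# Score-function estimator: f(z) * d log q(z) / d mu, with z ~ q.
z = rng.normal(mu, sigma, n)
score_grads = z**2 * (z - mu) / sigma**2

# Reparameterization estimator: differentiate f(mu + sigma*eps) directly.
eps = rng.standard_normal(n)
reparam_grads = 2.0 * (mu + sigma * eps)

# Both means are close to the true gradient 2*mu = 2.0, but the
# reparameterization estimator has a much smaller marginal variance.
print(score_grads.mean(), score_grads.var())
print(reparam_grads.mean(), reparam_grads.var())
```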
no code implementations • 23 Jul 2018 • Matias Quiroz, Mattias Villani, Robert Kohn, Minh-Ngoc Tran, Khue-Dung Dang
The rapid development of computing power and of efficient Markov Chain Monte Carlo (MCMC) simulation algorithms has revolutionized Bayesian statistics, making it a highly practical inference method in applied work.
1 code implementation • 19 Jun 2018 • Vincent Chin, David Gunawan, Denzil G. Fiebig, Robert Kohn, Scott A. Sisson
This article considers the problem of estimating a multivariate probit model in a panel data setting with emphasis on sampling a high-dimensional correlation matrix and improving the overall efficiency of the data augmentation approach.
2 code implementations • 25 May 2018 • Minh-Ngoc Tran, Nghia Nguyen, David Nott, Robert Kohn
Efficient computational methods for high-dimensional Bayesian inference are developed using Gaussian variational approximation, with a parsimonious but flexible factor parametrization of the covariance matrix.
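A minimal sketch of the factor parametrization described above: the covariance is written as Sigma = B Bᵀ + D², where B is a p×k factor-loading matrix with k ≪ p and D is diagonal, so only p(k+1) free parameters are needed and a draw costs only k + p standard normals. The dimensions and variable names below are illustrative, not taken from the paper's code.

```python
import numpy as np

rng = np.random.default_rng(1)
p, k = 50, 3                        # 50 model parameters, 3 factors
B = rng.standard_normal((p, k)) * 0.1   # factor loadings
d = np.abs(rng.standard_normal(p)) * 0.1  # diagonal of D

# Full covariance implied by the parsimonious parametrization.
Sigma = B @ B.T + np.diag(d**2)

# Sampling from N(mu, Sigma) without forming or factorizing Sigma:
mu = np.zeros(p)
eps, xi = rng.standard_normal(k), rng.standard_normal(p)
theta = mu + B @ eps + d * xi       # one draw from the variational approximation
```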
no code implementations • 8 May 2018 • David Gunawan, Khue-Dung Dang, Matias Quiroz, Robert Kohn, Minh-Ngoc Tran
SMC sequentially updates a cloud of particles through a sequence of distributions, beginning with a distribution that is easy to sample from, such as the prior, and ending with the posterior distribution.
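A toy sketch of that particle update for a normal model: particles start as prior draws and are reweighted and resampled through a tempering sequence p_t(θ) ∝ prior(θ) · likelihood(θ)^φ_t with φ_t rising from 0 to 1. This illustration omits the MCMC rejuvenation (move) step that a full SMC sampler would include, and all names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
data = rng.normal(2.0, 1.0, 100)

n_particles = 5_000
particles = rng.normal(0.0, 10.0, n_particles)   # draws from the prior

def loglik(theta):
    # Gaussian log-likelihood of the data for each particle (up to a constant).
    return -0.5 * ((data[:, None] - theta) ** 2).sum(axis=0)

phis = np.linspace(0.0, 1.0, 11)                 # tempering schedule
for phi_prev, phi in zip(phis[:-1], phis[1:]):
    logw = (phi - phi_prev) * loglik(particles)  # incremental importance weights
    w = np.exp(logw - logw.max())
    w /= w.sum()
    idx = rng.choice(n_particles, n_particles, p=w)  # multinomial resampling
    particles = particles[idx]

print(particles.mean())   # close to the posterior mean of theta
```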
no code implementations • 24 Jan 2018 • Matias Quiroz, David J. Nott, Robert Kohn
The variational parameters to be optimized are the mean vector and the covariance matrix of the approximation.
no code implementations • 2 Aug 2017 • Khue-Dung Dang, Matias Quiroz, Robert Kohn, Minh-Ngoc Tran, Mattias Villani
The key insight in our article is that efficient subsampling HMC for the parameters is possible if both the dynamics and the acceptance probability are computed from the same data subsample in each complete HMC iteration.
no code implementations • 27 Mar 2016 • Matias Quiroz, Minh-Ngoc Tran, Mattias Villani, Robert Kohn, Khue-Dung Dang
A pseudo-marginal MCMC method is proposed that estimates the likelihood by data subsampling using a block-Poisson estimator.
no code implementations • 10 Jul 2015 • Matias Quiroz, Mattias Villani, Robert Kohn
We propose a generic Markov Chain Monte Carlo (MCMC) algorithm to speed up computations for datasets with many observations.
no code implementations • 16 Apr 2014 • Matias Quiroz, Robert Kohn, Mattias Villani, Minh-Ngoc Tran
We propose Subsampling MCMC, a Markov Chain Monte Carlo (MCMC) framework where the likelihood function for $n$ observations is estimated from a random subset of $m$ observations.
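The core estimator in that framework can be sketched simply: the full-data log-likelihood, a sum over $n$ terms, is replaced by a sum over a random subsample of $m$ observations scaled by $n/m$, which is unbiased and costs $O(m)$ per evaluation. The model and variable names below are illustrative only (the paper also uses variance-reduction devices such as control variates, omitted here).

```python
import numpy as np

rng = np.random.default_rng(3)
n, m = 100_000, 1_000
x = rng.normal(1.5, 1.0, n)        # synthetic data

def logdens(theta, obs):
    # Log-density of each observation under a N(theta, 1) model.
    return -0.5 * (obs - theta) ** 2 - 0.5 * np.log(2 * np.pi)

theta = 1.5
full = logdens(theta, x).sum()                     # exact log-likelihood, O(n)
idx = rng.choice(n, m, replace=False)
estimate = (n / m) * logdens(theta, x[idx]).sum()  # unbiased subsample estimate, O(m)

print(full, estimate)   # close in relative terms
```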
no code implementations • 30 Jul 2013 • David J. Nott, Minh-Ngoc Tran, Anthony Y. C. Kuk, Robert Kohn
We propose a divide and recombine strategy for the analysis of large datasets, which partitions a large dataset into smaller pieces and then combines the variational distributions that have been learnt in parallel on each separate piece using the hybrid Variational Bayes algorithm.
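The recombination step can be illustrated with Gaussian shard posteriors: each piece of the data yields an approximate posterior N(mu_k, sigma_k²), and under a flat prior the shard posteriors multiply, so precisions add and the combined mean is precision-weighted. The toy below uses a conjugate normal model where this combination is exact; the paper's hybrid Variational Bayes algorithm handles the general case.

```python
import numpy as np

rng = np.random.default_rng(4)
data = rng.normal(3.0, 1.0, 9_000)
shards = np.split(data, 3)          # partition the large dataset into pieces

# "Fit" a Gaussian approximation on each shard (exact for this model,
# with unit observation noise and a flat prior).
precisions = np.array([len(s) for s in shards], dtype=float)  # 1 / sigma_k^2
means = np.array([s.mean() for s in shards])

# Recombine: precisions add, means combine precision-weighted.
combined_precision = precisions.sum()
combined_mean = (precisions * means).sum() / combined_precision

print(combined_mean, data.mean())   # the recombined mean matches the full-data estimate
```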