1 code implementation • 20 Jun 2022 • Matthew F. Dixon, Nicholas G. Polson, Kemen Goicoechea
This non-linear factor structure is extracted by using projected least squares to jointly project firm characteristics and asset returns onto a subspace of latent factors, and by using deep learning to learn the non-linear map from the factor loadings to the asset returns.
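As a rough illustration of the two-step idea (not the paper's exact algorithm), the sketch below jointly projects characteristics and returns onto a low-dimensional subspace and then fits a small network from the projected loadings to the returns; the use of scikit-learn's PLS and MLP, and all sizes, are assumptions for illustration.

```python
# Illustrative project-then-learn pipeline (a sketch, not the paper's method):
# 1) jointly project characteristics Z and returns r onto K latent factors,
# 2) fit a small feed-forward network from the projections to the returns.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
N, P, K = 500, 20, 3                      # firms, characteristics, latent factors
Z = rng.normal(size=(N, P))               # firm characteristics
r = np.tanh(Z[:, 0] * Z[:, 1]) + 0.1 * rng.normal(size=N)  # toy returns

pls = PLSRegression(n_components=K).fit(Z, r)
scores = pls.transform(Z)                 # projected factor loadings

net = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0)
net.fit(scores, r)                        # non-linear map: loadings -> returns
print("in-sample R^2:", net.score(scores, r))
```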
no code implementations • 29 May 2019 • Jingyu He, Nicholas G. Polson, Jianeng Xu
We use the theory of normal variance-mean mixtures to derive a data augmentation scheme for models that include gamma functions.
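For reference, a normal variance-mean mixture represents a variable $X$ through a latent mixing variable $V$:

$$X = \mu + \beta V + \sqrt{V}\, Z, \qquad Z \sim \mathcal{N}(0, 1), \quad V \sim \pi(v),$$

so that $X$ is Gaussian conditional on $V$; augmentation schemes of this kind alternate between drawing $V$ and the conditionally Gaussian parameters.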
no code implementations • 24 Apr 2019 • Anindya Bhadra, Jyotishka Datta, Yunfan Li, Nicholas G. Polson
We also outline recent computational developments in horseshoe shrinkage for complex models, along with a list of available software implementations that let one venture beyond the comfort zone of canonical linear regression problems.
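For orientation, the canonical horseshoe prior places heavy-tailed half-Cauchy distributions on both local and global scales:

$$\beta_j \mid \lambda_j, \tau \sim \mathcal{N}(0, \lambda_j^2 \tau^2), \qquad \lambda_j \sim \mathcal{C}^{+}(0, 1), \qquad \tau \sim \mathcal{C}^{+}(0, 1),$$

so the local scales $\lambda_j$ leave genuine signals essentially unshrunk while the global scale $\tau$ pulls noise coefficients toward zero.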
no code implementations • 22 Mar 2019 • Yuexi Wang, Nicholas G. Polson, Vadim O. Sokolov
Our methodology is compared to traditional stochastic gradient descent with back-propagation.
1 code implementation • 18 Mar 2019 • Matthew F. Dixon, Nicholas G. Polson
Deep fundamental factor models are developed to automatically capture non-linearity and interaction effects in factor modeling.
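A minimal sketch of why the non-linearity matters (sizes, framework, and data are illustrative assumptions, not the paper's architecture): an interaction between two exposures is invisible to a linear factor model but learnable by a small network.

```python
# Linear factor model vs. a small feed-forward network on the same
# exposures; the x1*x2 interaction defeats the linear fit. Illustrative only.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
X = rng.normal(size=(2000, 5))            # factor exposures
y = 0.5 * X[:, 0] + X[:, 1] * X[:, 2] + 0.1 * rng.normal(size=2000)

print("linear R^2:", LinearRegression().fit(X, y).score(X, y))
net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000, random_state=1)
print("deep   R^2:", net.fit(X, y).score(X, y))
```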
no code implementations • 17 Feb 2019 • Nicholas G. Polson, Vadim Sokolov
Bayesian regularization is a central tool in modern-day statistical and machine learning methods.
Methodology
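The term refers to the standard correspondence between penalized estimation and posterior modes:

$$\hat{\theta} = \arg\min_{\theta} \left\{ -\log p(y \mid \theta) + \phi(\theta) \right\}, \qquad \phi(\theta) = -\log \pi(\theta),$$

so that ridge regression corresponds to a Gaussian prior and the Lasso to a Laplace prior.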
no code implementations • 20 Jul 2018 • Nicholas G. Polson, Vadim O. Sokolov
Deep learning (DL) is a data-reduction technique for constructing high-dimensional predictors in input-output models.
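Concretely, such a deep learner composes layers of semi-affine transformations,

$$\hat{y}(x) = (f_L \circ \cdots \circ f_1)(x), \qquad f_\ell(z) = \sigma_\ell(W_\ell z + b_\ell),$$

where each $\sigma_\ell$ is a univariate activation applied element-wise, so a high-dimensional input is successively reduced to the features relevant for prediction.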
2 code implementations • 3 May 2018 • Guanhao Feng, Nicholas G. Polson, Jianeng Xu
This paper presents an augmented deep factor model that generates latent factors for cross-sectional asset pricing.
Methodology
1 code implementation • 25 Apr 2018 • Guanhao Feng, Jingyu He, Nicholas G. Polson
Deep learning searches for nonlinear factors that predict asset returns.
1 code implementation • 30 Jun 2017 • Anindya Bhadra, Jyotishka Datta, Nicholas G. Polson, Brandon T. Willard
The goal of our paper is to survey and contrast the major advances in two of the most commonly used high-dimensional techniques, namely, the Lasso and horseshoe regularization methodologies.
Methodology
MSC: Primary 62J07, 62J05; Secondary 62H15, 62F03
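The contrast is cleanest in the implied priors: the Lasso solution is the posterior mode under a Laplace prior,

$$\hat{\beta}_{\text{lasso}} = \arg\min_{\beta} \tfrac{1}{2}\|y - X\beta\|_2^2 + \lambda \|\beta\|_1 \iff \pi(\beta) \propto e^{-\lambda \|\beta\|_1},$$

whereas the horseshoe replaces the single fixed scale $\lambda$ with half-Cauchy local and global scales and is typically summarized by the posterior mean rather than the mode.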
no code implementations • 31 May 2017 • Nicholas G. Polson, Lei Sun
To illustrate our methodology, we provide simulation evidence and a real-data example comparing the statistical properties and computational efficiency of SBR with direct posterior sampling under spike-and-slab priors.
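For reference, the canonical spike-and-slab prior against which such methods are benchmarked is

$$\pi(\beta_j) = (1 - w)\, \delta_0(\beta_j) + w\, \mathcal{N}(\beta_j; 0, \tau^2),$$

a point mass at zero mixed with a Gaussian slab; direct posterior sampling must explore up to $2^p$ inclusion patterns, which is what makes it computationally demanding in high dimensions.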
no code implementations • 27 May 2017 • Matthew F. Dixon, Nicholas G. Polson, Vadim O. Sokolov
Deep learning applies hierarchical layers of hidden variables to construct nonlinear, high-dimensional predictors.
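A minimal numpy sketch of such a predictor (widths, activations, and random weights are purely illustrative; a real model would be trained):

```python
# Forward pass of a two-hidden-layer predictor: each layer applies an
# affine map followed by an element-wise nonlinearity.
import numpy as np

rng = np.random.default_rng(2)
W1, b1 = rng.normal(size=(10, 64)), np.zeros(64)
W2, b2 = rng.normal(size=(64, 32)), np.zeros(32)
w3, b3 = rng.normal(size=32), 0.0

def predict(x):
    h1 = np.tanh(x @ W1 + b1)   # first hidden layer
    h2 = np.tanh(h1 @ W2 + b2)  # second hidden layer
    return h2 @ w3 + b3         # linear output

print(predict(rng.normal(size=10)))
```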
no code implementations • 23 Feb 2017 • Anindya Bhadra, Jyotishka Datta, Nicholas G. Polson, Brandon Willard
Feature subset selection arises in many high-dimensional applications of statistics, such as compressed sensing and genomics.
no code implementations • 20 Sep 2015 • Nicholas G. Polson, Brandon T. Willard, Massoud Heidari
In this paper we develop a statistical theory and an implementation of deep learning models.
no code implementations • 11 Feb 2015 • Nicholas G. Polson, James G. Scott, Brandon T. Willard
We discuss the convergence of non-descent algorithms, both with acceleration and for non-convex functions.
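The building block of these algorithms is the proximal operator,

$$\operatorname{prox}_{\gamma \phi}(x) = \arg\min_{z} \left\{ \phi(z) + \tfrac{1}{2\gamma}\|z - x\|_2^2 \right\},$$

which for $\phi(z) = \lambda\|z\|_1$ has the closed-form soft-thresholding solution $\operatorname{sign}(x_j)\max(|x_j| - \gamma\lambda, 0)$; non-descent methods iterate such operators without requiring the objective to fall at every step.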
2 code implementations • 2 May 2014 • Jesse Windle, Nicholas G. Polson, James G. Scott
Efficiently sampling from the Pólya-Gamma distribution, PG(b, z), is an essential element of Pólya-Gamma data augmentation.
Computation
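As a rough illustration (not the efficient accept/reject samplers the paper develops), PG(b, z) admits the infinite sum-of-gammas representation $\omega = \frac{1}{2\pi^2} \sum_{k \ge 1} \frac{g_k}{(k - 1/2)^2 + z^2/(4\pi^2)}$ with $g_k \sim \text{Gamma}(b, 1)$, which a naive truncation can approximate:

```python
# Naive approximate PG(b, z) draw via a truncated sum of gammas;
# illustrative only -- far slower and cruder than dedicated samplers.
import numpy as np

def pg_draw_truncated(b, z, K=200, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    k = np.arange(1, K + 1)
    g = rng.gamma(shape=b, scale=1.0, size=K)   # g_k ~ Gamma(b, 1)
    denom = (k - 0.5) ** 2 + z ** 2 / (4.0 * np.pi ** 2)
    return (g / denom).sum() / (2.0 * np.pi ** 2)

print(pg_draw_truncated(1.0, 0.5))
```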
2 code implementations • 2 May 2012 • Nicholas G. Polson, James G. Scott, Jesse Windle
We propose a new data-augmentation strategy for fully Bayesian inference in models with binomial likelihoods.
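The augmentation rests on the Pólya-Gamma integral identity

$$\frac{(e^{\psi})^{a}}{(1 + e^{\psi})^{b}} = 2^{-b} e^{\kappa \psi} \int_0^{\infty} e^{-\omega \psi^2 / 2}\, p(\omega)\, d\omega, \qquad \kappa = a - \tfrac{b}{2},$$

where $p(\omega)$ is the PG(b, 0) density: conditional on $\omega$ the binomial likelihood becomes Gaussian in the linear predictor $\psi$, and $\omega \mid \psi \sim \text{PG}(b, \psi)$, so the two full conditionals give a simple Gibbs sampler.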