Search Results for author: Nicholas G. Polson

Found 17 papers, 7 papers with code

Deep Partial Least Squares for Empirical Asset Pricing

1 code implementation • 20 Jun 2022 • Matthew F. Dixon, Nicholas G. Polson, Kemen Goicoechea

This non-linear factor structure is extracted by using projected least squares to jointly project firm characteristics and asset returns on to a subspace of latent factors and using deep learning to learn the non-linear map from the factor loadings to the asset returns.
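The two-stage idea described above can be sketched in a few lines of numpy. Everything here is an illustrative stand-in: the toy data, the one-component PLS direction, and the cubic polynomial that substitutes for the paper's deep network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy panel: n firms with p characteristics; returns are a nonlinear
# function of a single latent factor (purely illustrative data).
n, p = 500, 10
X = rng.normal(size=(n, p))                    # firm characteristics
b = rng.normal(size=p) / np.sqrt(p)            # latent direction
y = np.tanh(X @ b) + 0.1 * rng.normal(size=n)  # asset returns

# Step 1: one-component PLS -- project characteristics and returns
# jointly onto a latent factor score.
w = X.T @ y
w /= np.linalg.norm(w)
t = X @ w                                      # latent factor score

# Step 2: learn a nonlinear map from the score to returns; a cubic
# polynomial stands in here for the paper's deep network.
coefs = np.polyfit(t, y, deg=3)
y_hat = np.polyval(coefs, t)

r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"in-sample R^2: {r2:.3f}")
```

The point of the projection step is that the latent score is chosen jointly from X and y, so the nonlinear map in step 2 operates on a low-dimensional input.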

Data Augmentation with Polya Inverse Gamma

no code implementations • 29 May 2019 • Jingyu He, Nicholas G. Polson, Jianeng Xu

We use the theory of normal variance-mean mixtures to derive a data augmentation scheme for models that include gamma functions.

Bayesian Inference • Data Augmentation • +1
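The normal variance-mean mixture construction the abstract refers to can be sketched with a simple inverse-gamma mixing density; this is only the generic device, not the paper's Pólya inverse-gamma scheme, and the parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

# Normal variance-mean mixture: draw a mixing variable V, then
#   X | V ~ N(mu + beta * V, V).
# With V ~ InverseGamma(a, s) the marginal of X is heavy-tailed;
# the conditional normality is what makes data augmentation work.
mu, beta, a, s = 0.0, 0.5, 3.0, 2.0
n = 100_000

V = s / rng.gamma(a, 1.0, size=n)                 # InverseGamma(a, s) draws
X = mu + beta * V + np.sqrt(V) * rng.normal(size=n)

# Sanity check against E[X] = mu + beta * E[V], with E[V] = s / (a - 1).
ev = s / (a - 1)
print(X.mean(), mu + beta * ev)
```

Conditioning on V restores Gaussian likelihoods, which is exactly the property a Gibbs-style augmentation exploits.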

Horseshoe Regularization for Machine Learning in Complex and Deep Models

no code implementations • 24 Apr 2019 • Anindya Bhadra, Jyotishka Datta, Yunfan Li, Nicholas G. Polson

We also outline the recent computational developments in horseshoe shrinkage for complex models along with a list of available software implementations that allows one to venture out beyond the comfort zone of the canonical linear regression problems.

BIG-bench Machine Learning • regression

Data Augmentation for Bayesian Deep Learning

no code implementations • 22 Mar 2019 • Yuexi Wang, Nicholas G. Polson, Vadim O. Sokolov

Our methodology is compared to traditional stochastic gradient descent with back-propagation.

Data Augmentation • Uncertainty Quantification

Deep Fundamental Factor Models

1 code implementation • 18 Mar 2019 • Matthew F. Dixon, Nicholas G. Polson

Deep fundamental factor models are developed to automatically capture non-linearity and interaction effects in factor modeling.

Uncertainty Quantification

Bayesian Regularization: From Tikhonov to Horseshoe

no code implementations • 17 Feb 2019 • Nicholas G. Polson, Vadim Sokolov

Bayesian regularization is a central tool in modern-day statistical and machine learning methods.

Methodology

Deep Learning

no code implementations • 20 Jul 2018 • Nicholas G. Polson, Vadim O. Sokolov

Deep learning (DL) is a high-dimensional data reduction technique for constructing high-dimensional predictors in input-output models.

BIG-bench Machine Learning
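The "predictor built from composed layers" view in the abstract can be sketched as a bare-bones forward pass; the layer sizes and random weights are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(2)

def layer(W, b, z):
    """One hidden layer: affine map followed by a ReLU activation."""
    return np.maximum(0.0, z @ W + b)

# A deep predictor is a composition of such layers: the input is
# successively re-expressed in lower dimension before a linear readout.
x = rng.normal(size=(4, 32))                 # 4 inputs, 32 features
W1, b1 = rng.normal(size=(32, 16)) * 0.1, np.zeros(16)
W2, b2 = rng.normal(size=(16, 8)) * 0.1, np.zeros(8)
w_out, b_out = rng.normal(size=8), 0.0

h = layer(W2, b2, layer(W1, b1, x))          # hierarchical hidden layers
y_hat = h @ w_out + b_out
print(y_hat.shape)
```

The dimension sequence 32 → 16 → 8 → 1 is the "data reduction" reading of a deep net: each layer is a learned projection followed by a nonlinearity.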

Deep Learning in Characteristics-Sorted Factor Models

2 code implementations • 3 May 2018 • Guanhao Feng, Nicholas G. Polson, Jianeng Xu

This paper presents an augmented deep factor model that generates latent factors for cross-sectional asset pricing.

Methodology

Lasso Meets Horseshoe

1 code implementation • 30 Jun 2017 • Anindya Bhadra, Jyotishka Datta, Nicholas G. Polson, Brandon T. Willard

The goal of our paper is to survey and contrast the major advances in two of the most commonly used high-dimensional techniques, namely, the Lasso and horseshoe regularization methodologies.

Methodology • Primary 62J07, 62J05; Secondary 62H15, 62F03
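The contrast the survey draws can be seen directly in the implied priors: the lasso corresponds to a Laplace prior, while the horseshoe puts a half-Cauchy prior on a local scale. A minimal sampling sketch (prior draws only, no posterior computation):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

# Lasso's implicit prior is Laplace (exponential tails); the horseshoe
# places a half-Cauchy prior on a local scale lambda, giving both a
# pole at zero and Cauchy-like tails.
laplace = rng.laplace(size=n)
lam = np.abs(rng.standard_cauchy(size=n))    # half-Cauchy local scales
horseshoe = lam * rng.normal(size=n)         # beta | lam ~ N(0, lam^2)

# The horseshoe's extreme quantiles dwarf the Laplace's: large signals
# are left nearly unshrunk, which is the survey's central contrast.
q_lap = np.quantile(np.abs(laplace), 0.999)
q_hs = np.quantile(np.abs(horseshoe), 0.999)
print(q_lap, q_hs)
```

The same draws also show more horseshoe mass near zero than Laplace mass, the other half of the "shrink noise hard, leave signals alone" story.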

Bayesian $l_0$-regularized Least Squares

no code implementations • 31 May 2017 • Nicholas G. Polson, Lei Sun

To illustrate our methodology, we provide simulation evidence and a real data example on the statistical properties and computational efficiency of SBR versus direct posterior sampling using spike-and-slab priors.

Computational Efficiency • Variable Selection
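For intuition about the $l_0$ objective itself (not the paper's SBR algorithm), note that under an identity design the problem separates coordinate-wise and is solved by hard thresholding:

```python
import numpy as np

# l0-regularized least squares, argmin ||y - beta||^2 + lam * ||beta||_0,
# separates coordinate-wise when X = I: keep y_i when y_i^2 > lam,
# zero it otherwise (hard thresholding).
def hard_threshold(y, lam):
    return np.where(y ** 2 > lam, y, 0.0)

y = np.array([3.0, -0.5, 0.2, -2.0, 0.9])
beta = hard_threshold(y, lam=1.0)
print(beta)   # keeps 3.0 and -2.0, zeros the rest
```

Unlike the lasso's soft threshold, the surviving coordinates are not shrunk at all, which is why $l_0$ penalties are attractive for exact variable selection.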

Deep Learning for Spatio-Temporal Modeling: Dynamic Traffic Flows and High Frequency Trading

no code implementations • 27 May 2017 • Matthew F. Dixon, Nicholas G. Polson, Vadim O. Sokolov

Deep learning applies hierarchical layers of hidden variables to construct nonlinear, high-dimensional predictors.

General Classification

Horseshoe Regularization for Feature Subset Selection

no code implementations • 23 Feb 2017 • Anindya Bhadra, Jyotishka Datta, Nicholas G. Polson, Brandon Willard

Feature subset selection arises in many high-dimensional applications of statistics, such as compressed sensing and genomics.

Uncertainty Quantification

A Statistical Theory of Deep Learning via Proximal Splitting

no code implementations • 20 Sep 2015 • Nicholas G. Polson, Brandon T. Willard, Massoud Heidari

In this paper we develop a statistical theory and an implementation of deep learning models.

Model Selection

Proximal Algorithms in Statistics and Machine Learning

no code implementations • 11 Feb 2015 • Nicholas G. Polson, James G. Scott, Brandon T. Willard

We provide a discussion of convergence of non-descent algorithms with acceleration and for non-convex functions.

BIG-bench Machine Learning • regression
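The canonical worked example in the proximal-algorithms literature is proximal gradient (ISTA) for the lasso, where the proximal map of the $l_1$ norm is soft thresholding. A minimal sketch with illustrative data and step size:

```python
import numpy as np

rng = np.random.default_rng(4)

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1 (soft thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

# Lasso: minimize 0.5 * ||y - X beta||^2 + lam * ||beta||_1 by
# alternating a gradient step on the smooth part with the l1 prox.
n, p = 100, 20
X = rng.normal(size=(n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + 0.1 * rng.normal(size=n)

lam = 0.5
step = 1.0 / np.linalg.norm(X, 2) ** 2       # 1 / Lipschitz constant
beta = np.zeros(p)
for _ in range(500):
    grad = X.T @ (X @ beta - y)
    beta = soft_threshold(beta - step * grad, step * lam)

print(np.round(beta, 2))
```

Each iteration is "descend on the smooth term, then apply the prox of the non-smooth term" — the splitting pattern the survey generalizes well beyond the lasso.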

Sampling Polya-Gamma random variates: alternate and approximate techniques

2 code implementations • 2 May 2014 • Jesse Windle, Nicholas G. Polson, James G. Scott

Efficiently sampling from the Pólya-Gamma distribution, ${PG}(b, z)$, is an essential element of Pólya-Gamma data augmentation.

Computation
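One of the approximate routes is truncating the Pólya-Gamma distribution's infinite sum-of-gammas representation; the sketch below uses that representation directly (the truncation level is an illustrative choice, and the paper's exact samplers are more sophisticated).

```python
import numpy as np

rng = np.random.default_rng(5)

def pg_approx(b, z, size, K=200):
    """Approximate PG(b, z) draws via the truncated sum-of-gammas form.

    PG(b, z) = (1 / (2 pi^2)) * sum_k g_k / ((k - 1/2)^2 + z^2 / (4 pi^2)),
    with g_k ~ Gamma(b, 1); truncating the sum at K terms introduces a
    small downward bias.
    """
    k = np.arange(1, K + 1)
    denom = (k - 0.5) ** 2 + z ** 2 / (4.0 * np.pi ** 2)
    g = rng.gamma(b, 1.0, size=(size, K))
    return (g / denom).sum(axis=1) / (2.0 * np.pi ** 2)

b, z = 1.0, 1.0
x = pg_approx(b, z, size=20_000)
# The exact PG mean is (b / (2 z)) * tanh(z / 2), about 0.2311 here.
print(x.mean())
```

Matching the sample mean against the closed-form mean is the standard sanity check for a PG sampler; the truncation bias at K = 200 is far smaller than the Monte Carlo error.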
