no code implementations • 24 Jun 2024 • Jyotishka Datta, Nicholas G. Polson
Our methodology also applies to flow-based methods for nonlinear feature extraction and deep learning.
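The entry above mentions flow-based methods for nonlinear feature extraction. A minimal sketch of the change-of-variables identity that underlies flow methods, using a toy one-dimensional affine map (the map and base density are illustrative choices, not the authors' construction):

```python
import numpy as np

def affine_flow(x, a, b):
    """Invertible affine map z = a*x + b (a != 0): a toy 1-D 'flow'."""
    return a * x + b

def log_density_x(x, a, b):
    """If z = a*x + b follows N(0, 1), the change-of-variables formula gives
    log p_x(x) = log p_z(a*x + b) + log |dz/dx| = log p_z(a*x + b) + log|a|."""
    z = affine_flow(x, a, b)
    log_pz = -0.5 * z**2 - 0.5 * np.log(2 * np.pi)  # standard normal log-density
    return log_pz + np.log(abs(a))

# Under this flow, x is distributed as N(-b/a, 1/a^2); check against that closed form.
x = np.linspace(-3, 3, 7)
a, b = 2.0, 1.0
direct = (-0.5 * ((x + b / a) / (1 / a))**2
          - 0.5 * np.log(2 * np.pi) - np.log(1 / a))
print(np.allclose(log_density_x(x, a, b), direct))  # True
```

Deep flows stack many such invertible maps, accumulating the log-Jacobian terms in the same way.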
no code implementations • 22 Oct 2021 • Anindya Bhadra, Jyotishka Datta, Nick Polson, Vadim Sokolov, Jianeng Xu
We show that prediction, interpolation and uncertainty quantification can be achieved using probabilistic methods at the output layer of the model.
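The probabilistic-output-layer idea above can be sketched with a conjugate Bayesian linear layer on top of fixed nonlinear features; the feature map, prior variance, and noise variance here are illustrative assumptions, not the authors' settings:

```python
import numpy as np

rng = np.random.default_rng(0)

def features(x):
    """Fixed nonlinear features, standing in for a learned hidden layer."""
    return np.column_stack([np.ones_like(x), x, np.sin(x)])

# Synthetic data from a function that lies in the span of the features.
x = rng.uniform(-3, 3, 50)
y = np.sin(x) + 0.3 * x + 0.1 * rng.standard_normal(50)

Phi = features(x)
sigma2, tau2 = 0.1**2, 10.0  # assumed-known noise and prior variances

# Conjugate Gaussian posterior over the output-layer weights.
S = np.linalg.inv(Phi.T @ Phi / sigma2 + np.eye(Phi.shape[1]) / tau2)
m = S @ Phi.T @ y / sigma2

# Predictive mean and variance at new inputs: uncertainty quantification
# falls out of the probabilistic output layer.
x_new = np.linspace(-3, 3, 5)
P = features(x_new)
pred_mean = P @ m
pred_var = sigma2 + np.einsum('ij,jk,ik->i', P, S, P)
print(pred_mean.round(2), pred_var.round(4))
```

The predictive variance is always at least the noise variance, and grows where the features are poorly constrained by the data.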
no code implementations • 25 Feb 2021 • Nilabja Guha, Jyotishka Datta
We consider a hierarchical Bayesian linear model where the active set of covariates that affects the observations through a mean model can vary between different time segments.
Model Selection, Variable Selection, Methodology, Statistics Theory
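The setting described above can be simulated directly: a linear mean model whose active covariate set switches between time segments. The segment boundaries and coefficient values below are made up for the sketch, and per-segment least squares stands in for the paper's hierarchical Bayesian approach:

```python
import numpy as np

rng = np.random.default_rng(1)

T, p = 300, 5
X = rng.standard_normal((T, p))

# Time-varying coefficients: the active set changes across three segments.
beta = np.zeros((T, p))
beta[:100, [0, 1]] = [2.0, -1.5]   # segment 1: covariates 0 and 1 active
beta[100:200, [2]] = 1.0           # segment 2: only covariate 2 active
beta[200:, [1, 4]] = [1.0, 2.5]    # segment 3: covariates 1 and 4 active

y = np.sum(X * beta, axis=1) + 0.5 * rng.standard_normal(T)

# With known segment boundaries, per-segment least squares already
# recovers the changing active set (threshold 0.5 is ad hoc).
supports = []
for lo, hi in [(0, 100), (100, 200), (200, 300)]:
    b_hat, *_ = np.linalg.lstsq(X[lo:hi], y[lo:hi], rcond=None)
    supports.append((np.abs(b_hat) > 0.5).astype(int).tolist())
print(supports)
```

The harder problem the paper addresses is doing this when the segment boundaries themselves are unknown and must be inferred jointly with the active sets.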
no code implementations • 7 Oct 2020 • Souradip Chakraborty, Ekansh Verma, Saswata Sahoo, Jyotishka Datta
Representation learning in a heterogeneous space with mixed numerical and categorical variables poses interesting challenges due to its complex feature manifold.
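A minimal illustration of the mixed-variable setting: embed numerical and categorical columns into one numeric feature space by standardizing the former and one-hot encoding the latter. This is a generic baseline for heterogeneous data, not the representation-learning method of the paper, and the toy columns are invented:

```python
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "age":    [23, 45, 31, 52],          # numerical
    "income": [40.0, 85.0, 60.0, 95.0],  # numerical
    "city":   ["NY", "SF", "NY", "LA"],  # categorical
})

# z-score the numerical columns so they share a common scale.
num = df[["age", "income"]].astype(float)
num = (num - num.mean()) / num.std(ddof=0)

# One-hot encode the categorical column into indicator features.
cat = pd.get_dummies(df[["city"]], dtype=float)

Z = pd.concat([num, cat], axis=1).to_numpy()
print(Z.shape)  # 2 standardized numerics + 3 one-hot columns
```

The challenge the abstract points to is that this flat encoding ignores the geometry coupling the numerical and categorical parts, which is what a learned representation tries to capture.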
no code implementations • 24 Apr 2019 • Anindya Bhadra, Jyotishka Datta, Yunfan Li, Nicholas G. Polson
We also outline recent computational developments in horseshoe shrinkage for complex models, along with a list of available software implementations that allow one to venture beyond the comfort zone of canonical linear regression problems.
1 code implementation • 30 Jun 2017 • Anindya Bhadra, Jyotishka Datta, Nicholas G. Polson, Brandon T. Willard
The goal of our paper is to survey and contrast the major advances in two of the most commonly used high-dimensional techniques, namely, the Lasso and horseshoe regularization methodologies.
Methodology (MSC: Primary 62J07, 62J05; Secondary 62H15, 62F03)
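The contrast the survey draws shows up already in the normal-means model y ~ N(theta, 1): the Lasso shrinks every observation by a fixed amount, while the horseshoe leaves large signals nearly untouched. A numerical sketch under assumptions of my own (global scale tau = 1, quadrature over the local scale on a truncated grid):

```python
import numpy as np

def lasso_normal_means(y, lam):
    """Lasso estimate in the normal-means model: soft thresholding,
    which subtracts the same amount lam from every large observation."""
    return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

def horseshoe_normal_means(y, n_grid=20000):
    """Horseshoe posterior mean E[theta | y] with tau = 1, by quadrature
    over the local scale lambda ~ C+(0, 1). Given lambda, y ~ N(0, 1 + lambda^2)
    and E[theta | y, lambda] = y * lambda^2 / (1 + lambda^2)."""
    grid = np.linspace(1e-4, 50.0, n_grid)   # truncated grid; an approximation
    lam2 = grid**2
    prior = 2.0 / (np.pi * (1.0 + lam2))     # half-Cauchy density
    y = np.asarray(y, dtype=float)
    like = np.exp(-0.5 * y[..., None]**2 / (1.0 + lam2)) / np.sqrt(1.0 + lam2)
    w = prior * like                          # posterior weight over lambda
    # Ratio of integrals on a uniform grid: the grid spacing cancels.
    shrink = (w * lam2 / (1.0 + lam2)).sum(axis=-1) / w.sum(axis=-1)
    return y * shrink

y = np.array([0.5, 2.0, 6.0])
lasso_est = lasso_normal_means(y, 1.5)   # large signals shrunk by a constant 1.5
hs_est = horseshoe_normal_means(y)       # large signals left nearly unshrunk
print(lasso_est, hs_est)
```

This tail-robustness of the horseshoe (no flat shrinkage penalty on strong signals) versus the constant bias of the Lasso is one of the central comparisons the paper develops.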
no code implementations • 23 Feb 2017 • Anindya Bhadra, Jyotishka Datta, Nicholas G. Polson, Brandon Willard
Feature subset selection arises in many high-dimensional applications of statistics, such as compressed sensing and genomics.