Variable Selection
116 papers with code • 0 benchmarks • 0 datasets
Libraries
Use these libraries to find Variable Selection models and implementations.
Most implemented papers
Deep Knockoffs
This paper introduces a machine for sampling approximate model-X knockoffs for arbitrary and unspecified data distributions using deep generative models.
Exact Combinatorial Optimization with Graph Convolutional Neural Networks
Combinatorial optimization problems are typically tackled by the branch-and-bound paradigm.
BART: Bayesian additive regression trees
We develop a Bayesian "sum-of-trees" model where each tree is constrained by a regularization prior to be a weak learner, and fitting and inference are accomplished via an iterative Bayesian backfitting MCMC algorithm that generates samples from a posterior.
Bolasso: model consistent Lasso estimation through the bootstrap
For various decays of the regularization parameter, we compute asymptotic equivalents of the probability of correct model selection (i.e., variable selection).
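As a rough illustration of the Bolasso idea, the sketch below runs the Lasso on bootstrap resamples and intersects the selected supports. The synthetic data, the plain coordinate-descent solver, and the fixed regularization level are all illustrative assumptions, not the paper's setup:

```python
import numpy as np

def lasso_cd(X, y, lam, n_sweeps=200):
    """Minimal coordinate-descent Lasso (no intercept, fixed penalty lam)."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    r = y.copy()                      # residual y - X @ beta
    for _ in range(n_sweeps):
        for j in range(p):
            r += X[:, j] * beta[j]    # remove j's current contribution
            rho = X[:, j] @ r
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
            r -= X[:, j] * beta[j]
    return beta

rng = np.random.default_rng(1)
n, p, n_boot = 200, 10, 32
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = 1.5                   # variables 0-2 form the true model
y = X @ beta_true + rng.standard_normal(n)

# Bolasso: intersect the Lasso supports across bootstrap resamples
support = set(range(p))
for _ in range(n_boot):
    idx = rng.integers(0, n, n)       # bootstrap resample of the rows
    b = lasso_cd(X[idx], y[idx], lam=2 * np.sqrt(n))
    support &= set(np.flatnonzero(np.abs(b) > 1e-6))
```

Individual Lasso runs may include spurious variables, but a noise variable rarely survives every resample, so the intersection tends to recover exactly the true support.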
Variable selection for Gaussian processes via sensitivity analysis of the posterior predictive distribution
Variable selection for Gaussian process models is often done using automatic relevance determination, which uses the inverse length-scale parameter of each input variable as a proxy for variable relevance.
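The inverse-length-scale proxy mentioned above can be shown in a minimal numpy sketch, assuming a squared-exponential ARD kernel, a known noise level, and a coarse grid search in place of gradient-based marginal-likelihood optimization (all simplifications for brevity):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 80
X = rng.standard_normal((n, 2))
y = np.sin(2 * X[:, 0]) + 0.05 * rng.standard_normal(n)  # only input 0 matters

def log_marginal_likelihood(X, y, length_scales, noise=0.05):
    """GP log marginal likelihood with a squared-exponential ARD kernel."""
    d = (X[:, None, :] - X[None, :, :]) / length_scales
    K = np.exp(-0.5 * (d ** 2).sum(axis=-1)) + noise ** 2 * np.eye(len(y))
    _, logdet = np.linalg.slogdet(K)
    return -0.5 * (y @ np.linalg.solve(K, y) + logdet + len(y) * np.log(2 * np.pi))

# Coarse grid search over one length-scale per input dimension
grid = [0.3, 1.0, 3.0, 10.0]
best = max(((l0, l1) for l0 in grid for l1 in grid),
           key=lambda ls: log_marginal_likelihood(X, y, np.array(ls)))
relevance = 1.0 / np.array(best)      # ARD proxy: inverse length-scale
```

The irrelevant input is fitted with a long length-scale (the kernel effectively ignores it), so its inverse length-scale, the ARD relevance proxy, comes out small.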
Iteratively Reweighted $\ell_1$-Penalized Robust Regression
This paper investigates tradeoffs among optimization errors, statistical rates of convergence and the effect of heavy-tailed errors for high-dimensional robust regression with nonconvex regularization.
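A toy sketch of the iterative reweighting idea follows, using ISTA as the inner weighted-l1 solver on synthetic data with arbitrary penalty and epsilon values (all assumptions, not the paper's algorithm): coefficients that stay small receive increasingly heavy penalties on the next round.

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 150, 8
X = rng.standard_normal((n, p))
X /= np.linalg.norm(X, axis=0)        # unit-norm columns
beta_true = np.zeros(p)
beta_true[[0, 3]] = [3.0, -2.0]       # variables 0 and 3 are the true model
y = X @ beta_true + 0.01 * rng.standard_normal(n)

L = np.linalg.norm(X, 2) ** 2         # Lipschitz constant of the gradient
lam, eps = 0.02, 1e-2
w = np.ones(p)                        # round 1 is a plain Lasso
beta = np.zeros(p)
for _ in range(5):                    # outer reweighting rounds
    for _ in range(500):              # inner proximal-gradient (ISTA) steps
        z = beta - X.T @ (X @ beta - y) / L
        beta = np.sign(z) * np.maximum(np.abs(z) - lam * w / L, 0.0)
    w = 1.0 / (np.abs(beta) + eps)    # small coefficients -> heavier penalty
```

The reweighting makes the overall penalty nonconvex (it approximates an l0-like penalty), which reduces the shrinkage bias on large coefficients while suppressing small spurious ones.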
Reversible Jump PDMP Samplers for Variable Selection
A new class of Markov chain Monte Carlo (MCMC) algorithms, based on simulating piecewise deterministic Markov processes (PDMPs), has recently shown great promise: these samplers are non-reversible, can mix better than standard MCMC algorithms, and can use subsampling ideas to speed up computation in big-data scenarios.
Post-selection inference with HSIC-Lasso
Detecting influential features in non-linear and/or high-dimensional data is a challenging and increasingly important task in machine learning.
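HSIC-Lasso builds on the Hilbert-Schmidt independence criterion (HSIC) as a nonlinear dependence measure. The sketch below scores each feature against the response using the biased HSIC estimator with Gaussian kernels and an arbitrary bandwidth (simplifying assumptions; the full method combines such terms inside a Lasso-style objective):

```python
import numpy as np

def hsic(x, y, sigma=1.0):
    """Biased HSIC estimate between two 1-D samples, Gaussian kernels."""
    n = len(x)
    K = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * sigma ** 2))
    L = np.exp(-(y[:, None] - y[None, :]) ** 2 / (2 * sigma ** 2))
    H = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

rng = np.random.default_rng(4)
n = 300
x1 = rng.standard_normal(n)
x2 = rng.standard_normal(n)                    # independent of y
y = np.sin(3 * x1) + 0.1 * rng.standard_normal(n)

scores = [hsic(x1, y), hsic(x2, y)]            # x1 should score higher
```

Unlike a plain correlation, HSIC detects the purely nonlinear dependence of y on x1, while the score for the independent feature x2 shrinks toward zero as n grows.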
Derandomizing Knockoffs
Model-X knockoffs is a general procedure that can leverage any feature importance measure to produce a variable selection algorithm, which discovers true effects while rigorously controlling the number or fraction of false positives.
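In the simplest setting, the knockoff filter can be sketched end to end. Below, the features are independent standard normals so that independent fresh draws are valid knockoffs, the importance measure is a marginal-correlation gap, and the FDR target q is arbitrary — all simplifying assumptions; real model-X knockoffs handle general feature distributions:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, q = 500, 20, 0.2
X = rng.standard_normal((n, p))       # independent N(0, 1) features
beta = np.zeros(p)
beta[:5] = 2.0                        # variables 0-4 carry true effects
y = X @ beta + rng.standard_normal(n)

# With independent features, independent fresh draws are exact knockoffs
X_knock = rng.standard_normal((n, p))

# Feature importance: marginal-correlation gap between X_j and its knockoff
W = np.abs(X.T @ y) - np.abs(X_knock.T @ y)

# Knockoff+ threshold: smallest t whose estimated FDP is at most q
tau = np.inf
for t in np.sort(np.abs(W[W != 0])):
    if (1 + np.sum(W <= -t)) / max(np.sum(W >= t), 1) <= q:
        tau = t
        break
selected = np.flatnonzero(W >= tau)
```

Null features are exchangeable with their knockoffs, so their W statistics are symmetric around zero; counting large negative W values therefore estimates the number of false positives among the large positive ones, which is what the threshold rule exploits.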
abess: A Fast Best Subset Selection Library in Python and R
A user-friendly R library is also available at the Comprehensive R Archive Network.
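For small p, the best-subset problem that abess targets can be brute-forced. The sketch below enumerates all subsets and scores them with BIC on synthetic data (illustrative assumptions; abess itself uses a much faster splicing algorithm rather than exhaustive search):

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(3)
n, p = 100, 6
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[[1, 4]] = [2.0, -1.5]            # variables 1 and 4 are the true model
y = X @ beta + 0.5 * rng.standard_normal(n)

# Exhaustive best-subset search scored by BIC (only feasible for small p)
best_bic, best_subset = np.inf, ()
for k in range(p + 1):
    for subset in combinations(range(p), k):
        if subset:
            Xs = X[:, list(subset)]
            coef, *_ = np.linalg.lstsq(Xs, y, rcond=None)
            rss = np.sum((y - Xs @ coef) ** 2)
        else:
            rss = np.sum(y ** 2)
        bic = n * np.log(rss / n) + k * np.log(n)
        if bic < best_bic:
            best_bic, best_subset = bic, subset
```

The BIC penalty k * log(n) grows faster than the typical fit improvement from a noise variable, so the search settles on (approximately) the true support; the 2^p cost is why practical libraries replace enumeration with specialized algorithms.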