# Variable Selection

132 papers with code • 0 benchmarks • 0 datasets

## Benchmarks

These leaderboards are used to track progress in Variable Selection.

## Libraries

Use these libraries to find Variable Selection models and implementations.

## Most implemented papers

### Exact Combinatorial Optimization with Graph Convolutional Neural Networks

Combinatorial optimization problems are typically tackled by the branch-and-bound paradigm.

### Deep Knockoffs

This paper introduces a machine for sampling approximate model-X knockoffs for arbitrary and unspecified data distributions using deep generative models.

### BART: Bayesian additive regression trees

We develop a Bayesian "sum-of-trees" model where each tree is constrained by a regularization prior to be a weak learner, and fitting and inference are accomplished via an iterative Bayesian backfitting MCMC algorithm that generates samples from a posterior.

### Trading with the Momentum Transformer: An Intelligent and Interpretable Architecture

We introduce the Momentum Transformer, an attention-based deep-learning architecture, which outperforms benchmark time-series momentum and mean-reversion trading strategies.

### Neural interval-censored survival regression with feature selection

Survival analysis is a fundamental area of focus in biomedical research, particularly in the context of personalized medicine.

### Bolasso: model consistent Lasso estimation through the bootstrap

For various decay rates of the regularization parameter, we compute asymptotic equivalents of the probability of correct model selection (i.e., variable selection).
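The core Bolasso recipe — run the Lasso on bootstrap resamples and keep only the variables selected every time — can be sketched as follows. This is a minimal illustration on synthetic data, not the paper's implementation; the data, `alpha`, and the number of resamples are arbitrary choices for the example.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Synthetic regression: only the first 2 of 10 features matter.
n, p = 200, 10
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:2] = [3.0, -2.0]
y = X @ beta + 0.5 * rng.standard_normal(n)

def bolasso_support(X, y, alpha=0.05, n_boot=50):
    """Bolasso idea: intersect Lasso supports across bootstrap resamples."""
    n = X.shape[0]
    support = np.ones(X.shape[1], dtype=bool)
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)            # bootstrap resample (with replacement)
        coef = Lasso(alpha=alpha).fit(X[idx], y[idx]).coef_
        support &= coef != 0                        # keep variables selected in every run
    return np.flatnonzero(support)

sel = bolasso_support(X, y)   # should concentrate on the true support {0, 1}
```

The intersection step is what restores model consistency: a noise variable may slip into any single Lasso fit, but rarely into all of them.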

### Variable selection for Gaussian processes via sensitivity analysis of the posterior predictive distribution

Variable selection for Gaussian process models is often done using automatic relevance determination, which uses the inverse length-scale parameter of each input variable as a proxy for variable relevance.

### Iteratively Reweighted $\ell_1$-Penalized Robust Regression

This paper investigates tradeoffs among optimization errors, statistical rates of convergence and the effect of heavy-tailed errors for high-dimensional robust regression with nonconvex regularization.
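The generic iteratively reweighted $\ell_1$ scheme (in the spirit of Candès–Wakin–Boyd reweighting, not this paper's exact algorithm) can be sketched with a plain Lasso solver: each weighted subproblem is reduced to an unweighted one by rescaling the columns of $X$. Data and tuning constants below are assumptions for the example.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(2)

n, p = 150, 20
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[[0, 3, 7]] = [4.0, -3.0, 2.0]
y = X @ beta_true + 0.5 * rng.standard_normal(n)

def reweighted_l1(X, y, alpha=0.1, n_iter=5, eps=1e-2):
    """Iteratively reweighted l1: solve a weighted Lasso, then shrink the
    penalty on coordinates with large coefficients via w_j = 1/(|b_j| + eps).
    The weighted problem min ||y - Xb||^2/(2n) + alpha * sum_j w_j |b_j|
    is solved by scaling column j of X by 1/w_j and fitting a plain Lasso."""
    w = np.ones(X.shape[1])
    for _ in range(n_iter):
        Xs = X / w                                   # column scaling realizes the weights
        b = Lasso(alpha=alpha).fit(Xs, y).coef_ / w  # map back to original scale
        w = 1.0 / (np.abs(b) + eps)
    return b

b = reweighted_l1(X, y)
sel = np.flatnonzero(np.abs(b) > 1e-8)
```

Reweighting approximates a nonconvex (concave) penalty with a sequence of convex $\ell_1$ problems, reducing the bias that a single Lasso pass puts on large coefficients.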

### Post-selection inference with HSIC-Lasso

Detecting influential features in non-linear and/or high-dimensional data is a challenging and increasingly important task in machine learning.
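The ingredient behind HSIC-based feature screening is the empirical Hilbert–Schmidt Independence Criterion, which scores non-linear dependence between a feature and the response. The sketch below computes marginal HSIC scores with Gaussian kernels; it is a toy illustration of the statistic, not the full HSIC-Lasso or its post-selection inference, and the bandwidth and data are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(3)

def gauss_kernel(v, sigma=1.0):
    """Gaussian (RBF) kernel matrix for a 1-D sample."""
    d2 = (v[:, None] - v[None, :]) ** 2
    return np.exp(-d2 / (2 * sigma**2))

def hsic(x, y):
    """Biased empirical HSIC: trace(K H L H) / (n - 1)^2."""
    n = len(x)
    H = np.eye(n) - np.ones((n, n)) / n     # centering matrix
    K, L = gauss_kernel(x), gauss_kernel(y)
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

n = 200
x_rel = rng.standard_normal(n)
x_noise = rng.standard_normal(n)
y = x_rel**2 + 0.1 * rng.standard_normal(n)  # purely non-linear dependence on x_rel

score_rel, score_noise = hsic(x_rel, y), hsic(x_noise, y)
```

Note that the dependence here is quadratic, so correlation-based screening would miss it; HSIC still ranks `x_rel` above the noise feature.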

### Derandomizing Knockoffs

Model-X knockoffs is a general procedure that can leverage any feature importance measure to produce a variable selection algorithm, which discovers true effects while rigorously controlling the number or fraction of false positives.
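The model-X knockoff filter that this line summarizes can be sketched in a special case where exact knockoffs are trivial: for i.i.d. standard-normal features, fresh independent draws are valid knockoffs. The importance statistic (Lasso coefficient differences) and all data constants below are illustrative assumptions, not the paper's setup.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(4)

n, p, q = 300, 30, 0.2                     # q = target FDR level
X = rng.standard_normal((n, p))            # i.i.d. N(0, 1) features
beta = np.zeros(p)
beta[:5] = 3.0
y = X @ beta + rng.standard_normal(n)

# For independent standard-normal features, exact model-X knockoffs
# are simply fresh independent N(0, 1) draws.
X_tilde = rng.standard_normal((n, p))

# Feature-importance statistics: Lasso on [X, X_tilde], then
# W_j = |coef of feature j| - |coef of its knockoff|.
coef = Lasso(alpha=0.1).fit(np.hstack([X, X_tilde]), y).coef_
W = np.abs(coef[:p]) - np.abs(coef[p:])

# Knockoff+ threshold: smallest t with estimated FDP <= q.
tau = np.inf
for t in np.sort(np.abs(W[W != 0])):
    fdp = (1 + np.sum(W <= -t)) / max(1, np.sum(W >= t))
    if fdp <= q:
        tau = t
        break

selected = np.flatnonzero(W >= tau)
```

The symmetry of the null $W_j$ (a null feature and its knockoff are exchangeable) is what makes the numerator of `fdp` an honest estimate of the number of false positives above the threshold.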