Search Results for author: Johan Larsson

Found 5 papers, 4 papers with code

Coordinate Descent for SLOPE

1 code implementation • 26 Oct 2022 • Johan Larsson, Quentin Klopfenstein, Mathurin Massias, Jonas Wallin

The lasso is the most famous sparse regression and feature selection method.

feature selection
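For context on SLOPE, which the entry above solves with coordinate descent, the usual sorted l1-penalized least-squares formulation is sketched below. The notation (design matrix X, response y, non-increasing penalty sequence lambda) is standard rather than taken from this listing; the lasso is the special case of a constant penalty sequence.

    \min_{\beta \in \mathbb{R}^p} \; \tfrac{1}{2} \lVert y - X\beta \rVert_2^2
        + \sum_{j=1}^{p} \lambda_j \, \lvert \beta \rvert_{(j)},
    \qquad \lambda_1 \ge \lambda_2 \ge \dots \ge \lambda_p \ge 0,

where |beta|_(1) >= ... >= |beta|_(p) denote the coefficient magnitudes sorted in decreasing order.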

The Strong Screening Rule for SLOPE

1 code implementation • NeurIPS 2020 • Johan Larsson, Małgorzata Bogdan, Jonas Wallin

We develop a screening rule for SLOPE by examining its subdifferential and show that this rule is a generalization of the strong rule for the lasso.
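As a rough illustration of the lasso strong rule that this paper generalizes (a sketch of the sequential strong rule of Tibshirani et al., 2012, not the SLOPE rule itself): along a decreasing path lambda_1 > lambda_2 > ..., predictor j is discarded at the new lambda when its absolute correlation with the residual at the previous solution falls below 2*lambda_new - lambda_prev. The function name below is hypothetical, and standardized columns of X are assumed.

    import numpy as np

    def strong_rule_candidates(X, y, beta_prev, lam_new, lam_prev):
        """Sequential strong rule for the lasso (sketch).

        Keeps predictor j only if |x_j' r| >= 2*lam_new - lam_prev,
        where r is the residual at the previous solution beta_prev.
        Discarded predictors should still be checked against the
        KKT conditions after fitting.
        """
        r = y - X @ beta_prev                # residual at the previous lambda
        c = np.abs(X.T @ r)                  # absolute correlations
        keep = c >= 2 * lam_new - lam_prev   # strong-rule threshold
        return np.flatnonzero(keep)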

The Hessian Screening Rule

1 code implementation • 27 Apr 2021 • Johan Larsson, Jonas Wallin

Predictor screening rules, which discard predictors before fitting a model, have had considerable impact on the speed with which sparse regression problems, such as the lasso, can be solved.

regression
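To make the role of screening rules concrete, here is a generic screen-fit-check loop of the kind many such rules plug into (a sketch of the general pattern, not the Hessian rule from the entry above): predictors are discarded before the fit, the reduced problem is solved, and the KKT conditions are then verified so that any wrongly discarded predictor is added back. Both `screen` and `fit_lasso_on` are hypothetical callables supplied by the user.

    import numpy as np

    def fit_path_with_screening(X, y, lambdas, screen, fit_lasso_on):
        """Generic screen-fit-check loop for a lasso path (sketch).

        screen(X, y, beta, lam_new, lam_prev) -> candidate indices
        fit_lasso_on(X_sub, y, lam)           -> coefficients for X_sub
        """
        n, p = X.shape
        beta = np.zeros(p)
        path = []
        lam_prev = lambdas[0]
        for lam in lambdas:
            candidates = screen(X, y, beta, lam, lam_prev)
            while True:
                beta_sub = fit_lasso_on(X[:, candidates], y, lam)
                beta = np.zeros(p)
                beta[candidates] = beta_sub
                # KKT check: discarded predictors must satisfy |x_j' r| <= lam
                r = y - X @ beta
                viol = np.flatnonzero(np.abs(X.T @ r) > lam + 1e-10)
                new = np.setdiff1d(viol, candidates)
                if new.size == 0:
                    break
                candidates = np.union1d(candidates, new)  # add violators back
            path.append(beta)
            lam_prev = lam
        return path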

Look-Ahead Screening Rules for the Lasso

no code implementations • 12 May 2021 • Johan Larsson

The lasso is a popular method to induce shrinkage and sparsity in the solution vector (coefficients) of regression problems, particularly when there are many predictors relative to the number of observations.
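A minimal illustration of that point (a sketch using scikit-learn, not code from the paper): with far more predictors than observations, the lasso still returns a mostly zero coefficient vector. The data, the choice alpha=0.5, and the sparsity level are arbitrary assumptions for the demo.

    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(0)
    n, p = 50, 200                      # many more predictors than observations
    X = rng.standard_normal((n, p))
    beta_true = np.zeros(p)
    beta_true[:5] = 2.0                 # only 5 predictors truly matter
    y = X @ beta_true + rng.standard_normal(n)

    model = Lasso(alpha=0.5).fit(X, y)  # alpha controls the shrinkage strength
    print("nonzero coefficients:", np.count_nonzero(model.coef_))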
