Search Results for author: Małgorzata Bogdan

Found 2 papers, 1 paper with code

The Strong Screening Rule for SLOPE

1 code implementation • NeurIPS 2020 • Johan Larsson, Małgorzata Bogdan, Jonas Wallin

We develop a screening rule for SLOPE by examining its subdifferential and show that this rule is a generalization of the strong rule for the lasso.
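For context (this is the classical rule that the paper generalizes, not the SLOPE rule itself): the strong rule for the lasso discards predictor $j$ at a new penalty $\lambda$ when $\vert x_j^\top r(\lambda_{\text{prev}})\vert < 2\lambda-\lambda_{\text{prev}}$, where $r(\lambda_{\text{prev}})$ is the residual at the previous point on the penalty path. A minimal NumPy sketch, with illustrative function and argument names:

```python
import numpy as np

def lasso_strong_rule(X, resid_prev, lam_new, lam_prev):
    """Sequential strong rule for the lasso (the rule the SLOPE paper generalizes).

    X          : (n, p) design matrix
    resid_prev : residual y - X @ b at the previous penalty lam_prev
    lam_new    : next (smaller) penalty on the path
    Returns a boolean mask of predictors to keep in the working set.
    """
    c = np.abs(X.T @ resid_prev)        # gradient magnitudes at lam_prev
    return c >= 2 * lam_new - lam_prev  # discard j when strictly below the bound
```

Because the strong rule can, in rare cases, discard an active predictor, screened-out variables are checked against the KKT conditions after fitting and added back if they violate them.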

SLOPE - Adaptive variable selection via convex optimization

no code implementations • 14 Jul 2014 • Małgorzata Bogdan, Ewout van den Berg, Chiara Sabatti, Weijie Su, Emmanuel J. Candès

SLOPE, short for Sorted L-One Penalized Estimation, is the solution to \[\min_{b\in\mathbb{R}^p}\frac{1}{2}\Vert y-Xb\Vert_{\ell_2}^2+\lambda_1\vert b\vert_{(1)}+\lambda_2\vert b\vert_{(2)}+\cdots+\lambda_p\vert b\vert_{(p)},\] where $\lambda_1\ge\lambda_2\ge\cdots\ge\lambda_p\ge0$ and $\vert b\vert_{(1)}\ge\vert b\vert_{(2)}\ge\cdots\ge\vert b\vert_{(p)}$ are the decreasing absolute values of the entries of $b$.
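As an illustration only (the names prox_sorted_l1 and slope_pgd are invented for this sketch, not taken from the paper's reference implementation), the objective above can be minimized by proximal gradient descent, where the proximal operator of the sorted-ℓ1 penalty is computed by sorting $\vert v\vert$ in decreasing order, subtracting $\lambda$, and running a pool-adjacent-violators pass:

```python
import numpy as np

def prox_sorted_l1(v, lam):
    """Prox of b -> sum_i lam_i * |b|_(i); lam is sorted decreasing and nonnegative."""
    sign, absv = np.sign(v), np.abs(v)
    order = np.argsort(-absv)               # indices sorting |v| in decreasing order
    z = absv[order] - lam
    # pool-adjacent-violators pass: fit a nonincreasing sequence to z
    blocks = []                             # stack of (start, length, mean)
    for i, zi in enumerate(z):
        start, length, mean = i, 1, zi
        while blocks and blocks[-1][2] <= mean:
            s, l, m = blocks.pop()          # merge with the previous block
            mean = (l * m + length * mean) / (l + length)
            start, length = s, l + length
        blocks.append((start, length, mean))
    x = np.empty_like(z)
    for start, length, mean in blocks:
        x[start:start + length] = max(mean, 0.0)   # clip at zero
    out = np.empty_like(v, dtype=float)
    out[order] = x                          # undo the sort
    return sign * out                       # restore signs

def slope_pgd(X, y, lam, n_iter=500):
    """Plain proximal-gradient loop for the SLOPE objective above."""
    b = np.zeros(X.shape[1])
    step = 1.0 / np.linalg.norm(X, 2) ** 2  # 1 / Lipschitz constant of the gradient
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y)
        b = prox_sorted_l1(b - step * grad, step * lam)
    return b
```

Reference implementations typically use accelerated or more specialized solvers; this plain loop is only meant to make the role of the sorted-ℓ1 prox concrete.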

Methodology
