1 code implementation • 26 Oct 2022 • Johan Larsson, Quentin Klopfenstein, Mathurin Massias, Jonas Wallin
The lasso is the most famous sparse regression and feature selection method.
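A minimal illustration (not the authors' code) of the lasso's feature-selection behavior, using scikit-learn on synthetic data: the l1 penalty drives most coefficients exactly to zero, so the surviving predictors form the selected feature set.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
beta = np.zeros(20)
beta[:3] = [3.0, -2.0, 1.5]              # only 3 truly active predictors
y = X @ beta + 0.1 * rng.standard_normal(100)

model = Lasso(alpha=0.1).fit(X, y)
selected = np.flatnonzero(model.coef_)    # indices of non-zero coefficients
print("selected predictors:", selected)
```

Increasing `alpha` shrinks more coefficients to zero, yielding a sparser model; the value 0.1 here is an arbitrary choice for the sketch.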
1 code implementation • NeurIPS 2021 • David Bolin, Jonas Wallin
Methods for inference and simulation of linearly constrained Gaussian Markov Random Fields (GMRFs) are computationally prohibitive when the number of constraints is large.

1 code implementation • 27 Apr 2021 • Johan Larsson, Jonas Wallin
Predictor screening rules, which discard predictors before fitting a model, have had considerable impact on the speed with which sparse regression problems, such as the lasso, can be solved.
1 code implementation • NeurIPS 2020 • Johan Larsson, Małgorzata Bogdan, Jonas Wallin
We develop a screening rule for SLOPE by examining its subdifferential and show that this rule is a generalization of the strong rule for the lasso.
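For context, a hedged NumPy sketch (assumed, not the paper's SLOPE rule) of the sequential strong rule for the plain lasso that the paper generalizes: moving from penalty `lam_prev` to a smaller `lam`, predictor j is discarded when its absolute correlation with the previous residual falls below `2*lam - lam_prev`.

```python
import numpy as np

def strong_rule_keep(X, residual, lam, lam_prev):
    """Indices of predictors surviving the sequential strong rule at lam."""
    return np.flatnonzero(np.abs(X.T @ residual) >= 2 * lam - lam_prev)

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 200))
y = X[:, :5] @ np.full(5, 2.0) + 0.1 * rng.standard_normal(50)

# Smallest penalty at which the lasso solution is all-zero,
# so the residual at lam_max is just y itself.
lam_max = np.max(np.abs(X.T @ y))
keep = strong_rule_keep(X, y, 0.9 * lam_max, lam_max)
print(f"kept {keep.size} of {X.shape[1]} predictors")
```

Only the few predictors highly correlated with the response survive; the discarded ones need never be touched by the solver at this penalty (subject to a post-hoc check of the KKT conditions, since the strong rule is heuristic and can in rare cases discard an active predictor).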
1 code implementation • 7 Apr 2018 • Özgür Asar, David Bolin, Peter J. Diggle, Jonas Wallin
We consider the analysis of continuous repeated measurement outcomes that are collected through time, also known as longitudinal data.
no code implementations • 7 Jul 2016 • Anders Hildeman, David Bolin, Jonas Wallin, Adam Johansson, Tufve Nyholm, Thomas Asklund, Jun Yu
The amount of data needed to train a model for s-CT generation is of the order of 100 million voxels.