Sparse Learning

28 papers with code • 3 benchmarks • 3 datasets


Greatest papers with code

The State of Sparsity in Deep Neural Networks

google-research/google-research 25 Feb 2019

We rigorously evaluate three state-of-the-art techniques for inducing sparsity in deep neural networks on two large-scale learning tasks: Transformer trained on WMT 2014 English-to-German, and ResNet-50 trained on ImageNet.

Model Compression Sparse Learning

Feature Selection: A Data Perspective

jundongl/scikit-feature 29 Jan 2016

To facilitate and promote the research in this community, we also present an open-source feature selection repository that consists of most of the popular feature selection algorithms (\url{http://featureselection.asu.edu/}).

Feature Selection Sparse Learning

Sparse Networks from Scratch: Faster Training without Losing Performance

TimDettmers/sparse_learning ICLR 2020

We demonstrate the possibility of what we call sparse learning: accelerated training of deep neural networks that maintain sparse weights throughout training while achieving dense performance levels.

Image Classification Sparse Learning
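The prune-and-regrow cycle behind this style of sparse training can be sketched in a few lines of NumPy. This is a simplified illustration of the general idea (magnitude pruning plus regrowth under a fixed weight budget), not the paper's sparse momentum implementation; all names here are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def prune_and_regrow(w, mask, drop_frac=0.3):
    """One sparse-training update: drop the smallest-magnitude active
    weights and regrow the same number at inactive positions, so the
    total number of nonzeros (the sparsity budget) stays constant."""
    active = np.flatnonzero(mask)
    n_drop = int(drop_frac * active.size)
    # Prune: remove the n_drop active weights with smallest |w|.
    drop = active[np.argsort(np.abs(w.flat[active]))[:n_drop]]
    mask.flat[drop] = False
    w.flat[drop] = 0.0
    # Regrow: activate n_drop currently inactive positions. Here they
    # are chosen at random; the paper instead grows where the momentum
    # magnitude is largest. Regrown weights start at zero and receive
    # gradients from the next step onward.
    inactive = np.flatnonzero(~mask)
    grow = rng.choice(inactive, size=n_drop, replace=False)
    mask.flat[grow] = True
    return w, mask

# A 10x10 layer held at 80% sparsity: only 20 weights are ever nonzero.
w = rng.standard_normal((10, 10))
mask = np.zeros((10, 10), dtype=bool)
mask.flat[rng.choice(100, size=20, replace=False)] = True
w *= mask
w, mask = prune_and_regrow(w, mask)
```

Interleaving this cycle with ordinary gradient steps keeps the parameter count fixed throughout training, which is what distinguishes sparse learning from dense-then-prune pipelines.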

Variational Dropout Sparsifies Deep Neural Networks

senya-ashukha/variational-dropout-sparsifies-dnn ICML 2017

We explore a recently proposed Variational Dropout technique that provided an elegant Bayesian interpretation to Gaussian Dropout.

Sparse Learning

Rigging the Lottery: Making All Tickets Winners

google-research/rigl ICML 2020

There is a large body of work on training dense networks to yield sparse networks for inference, but this limits the size of the largest trainable sparse model to that of the largest trainable dense model.

Image Classification Language Modelling +1

A General Iterative Shrinkage and Thresholding Algorithm for Non-convex Regularized Optimization Problems

icc2115/Neural-GC 18 Mar 2013

A commonly used approach is Multi-Stage (MS) convex relaxation (or DC programming), which relaxes the original non-convex problem to a sequence of convex problems.

Sparse Learning
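For the convex L1 special case, one pass of the underlying iterative shrinkage-thresholding scheme is just a gradient step on the smooth loss followed by soft-thresholding. The sketch below is generic ISTA for the lasso, not the paper's GIST algorithm for non-convex penalties:

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1: shrink each entry toward zero by t."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista(X, y, lam, step, n_iter=500):
    """Minimize 0.5*||Xw - y||^2 + lam*||w||_1 by proximal gradient."""
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y)              # gradient of the smooth part
        w = soft_threshold(w - step * grad, step * lam)
    return w

# Recover a 2-sparse ground truth from noisy measurements.
rng = np.random.default_rng(1)
X = rng.standard_normal((50, 20))
w_true = np.zeros(20)
w_true[[2, 7]] = [3.0, -2.0]
y = X @ w_true + 0.01 * rng.standard_normal(50)
step = 1.0 / np.linalg.norm(X, 2) ** 2        # 1/L with L = ||X||_2^2
w_hat = ista(X, y, lam=0.5, step=step)
```

Non-convex penalties keep the same gradient-then-proximal structure but replace the soft-threshold with the penalty's own (possibly non-unique) proximal map, which is the setting the paper's algorithm addresses.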

Picasso: A Sparse Learning Library for High Dimensional Data Analysis in R and Python

jasonge27/picasso 27 Jun 2020

We describe a new library named picasso, which implements a unified framework of pathwise coordinate optimization for a variety of sparse learning problems (e.g., sparse linear regression, sparse logistic regression, sparse Poisson regression and scaled sparse linear regression) combined with efficient active set selection strategies.

Sparse Learning
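For the sparse linear regression case, pathwise coordinate optimization amounts to cyclic coordinate descent solved along a decreasing sequence of regularization values, warm-starting each problem from the previous solution. A minimal illustration of that idea (not picasso's actual implementation, which adds active-set selection):

```python
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def cd_lasso(X, y, lam, w, n_sweeps=100):
    """Cyclic coordinate descent for (1/2n)*||y - Xw||^2 + lam*||w||_1."""
    n = X.shape[0]
    col_sq = (X ** 2).sum(axis=0) / n
    r = y - X @ w                       # running residual
    for _ in range(n_sweeps):
        for j in range(X.shape[1]):
            r += X[:, j] * w[j]         # remove coordinate j's contribution
            rho = X[:, j] @ r / n
            w[j] = soft_threshold(rho, lam) / col_sq[j]
            r -= X[:, j] * w[j]         # add the updated contribution back
    return w

def lasso_path(X, y, lams):
    """Solve along a decreasing lambda path, warm-starting each solve."""
    w = np.zeros(X.shape[1])
    path = []
    for lam in sorted(lams, reverse=True):
        w = cd_lasso(X, y, lam, w.copy())
        path.append(w)
    return path

rng = np.random.default_rng(2)
X = rng.standard_normal((100, 10))
w_true = np.zeros(10)
w_true[[1, 4]] = [2.0, -1.5]
y = X @ w_true
path = lasso_path(X, y, lams=[1.0, 0.1, 0.001])
```

Warm starting is what makes the pathwise scheme cheap: each solve begins near its solution, so later (harder, less regularized) problems need few sweeps.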

Fast Best Subset Selection: Coordinate Descent and Local Combinatorial Optimization Algorithms

hazimehh/L0Learn 5 Mar 2018

In spite of the usefulness of $L_0$-based estimators and generic MIO solvers, there is a steep computational price to pay when compared to popular sparse learning algorithms (e.g., based on $L_1$ regularization).

Combinatorial Optimization Feature Selection +2
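To make the $L_0$ problem concrete: a cheap first-order approximation is iterative hard thresholding, which keeps only the $k$ largest-magnitude coefficients after each gradient step. This is a generic IHT sketch under a noiseless toy setup, not the coordinate descent or local combinatorial search algorithms in L0Learn:

```python
import numpy as np

def hard_threshold(w, k):
    """Keep the k largest-magnitude entries of w, zero out the rest."""
    out = np.zeros_like(w)
    keep = np.argsort(np.abs(w))[-k:]
    out[keep] = w[keep]
    return out

def iht(X, y, k, step, n_iter=300):
    """Approximate  min ||y - Xw||^2  subject to  ||w||_0 <= k."""
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        w = hard_threshold(w + step * X.T @ (y - X @ w), k)
    return w

# A 3-sparse signal recovered from noiseless Gaussian measurements.
rng = np.random.default_rng(3)
X = rng.standard_normal((80, 30))
w_true = np.zeros(30)
w_true[[3, 11, 20]] = [1.5, -2.0, 1.0]
y = X @ w_true
step = 1.0 / np.linalg.norm(X, 2) ** 2     # conservative 1/L step size
w_hat = iht(X, y, k=3, step=step)
```

Unlike soft-thresholding for $L_1$, the hard-threshold projection introduces no shrinkage bias on the surviving coefficients, which is part of why $L_0$-based estimators can be statistically attractive despite their computational cost.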

The Tradeoff Between Privacy and Accuracy in Anomaly Detection Using Federated XGBoost

Raymw/Federated-XGBoost 16 Jul 2019

Our proposed federated XGBoost algorithm incorporates data aggregation and sparse federated update processes to balance the tradeoff between privacy and learning performance.

Anomaly Detection Federated Learning +1

Event Enhanced High-Quality Image Recovery

ShinyWang33/eSL-Net ECCV 2020

To recover high-quality intensity images, one should address both denoising and super-resolution problems for event cameras.

Denoising Sparse Learning +1