Sparse Learning
43 papers with code • 3 benchmarks • 3 datasets
Latest papers
Building explainable graph neural network by sparse learning for the drug-protein binding prediction
Because the graph is built from chemical substructures, any subgraph of a drug identified by our SLGNN is guaranteed to be a chemically valid structure.
HyperSparse Neural Networks: Shifting Exploration to Exploitation through Adaptive Regularization
Sparse neural networks are a key factor in developing resource-efficient machine learning applications.
Resource Constrained Model Compression via Minimax Optimization for Spiking Neural Networks
We propose an improved end-to-end Minimax optimization method for this sparse learning problem to better balance model performance and computational efficiency.
Learning to Super-Resolve Blurry Images with Events
Super-Resolution from a single motion Blurred image (SRB) is a severely ill-posed problem due to the joint degradation of motion blurs and low spatial resolution.
Video-Text Retrieval by Supervised Sparse Multi-Grained Learning
While recent progress in video-text retrieval has been driven by better representation learning, in this paper we present a novel multi-grained sparse learning framework, S3MA, that learns an aligned sparse space shared between the video and the text for video-text retrieval.
Lottery Aware Sparsity Hunting: Enabling Federated Learning on Resource-Limited Edge
A possible solution to this problem is to utilize off-the-shelf sparse learning algorithms at the clients to meet their resource budgets.
Controlled Sparsity via Constrained Optimization or: How I Learned to Stop Tuning Penalties and Love Constraints
The performance of trained neural networks is robust to harsh levels of pruning.
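The pruning this excerpt refers to is most simply illustrated by global magnitude pruning, in which the smallest-magnitude weights of a trained network are zeroed until a target sparsity is reached. The sketch below is a generic illustration of that idea, not the method of the paper above; the function name and toy weights are made up for the example.

```python
def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude entries of `weights`.

    weights  -- flat list of floats (e.g. a trained layer, flattened)
    sparsity -- fraction of entries to set to zero, in [0, 1]
    """
    n_prune = int(len(weights) * sparsity)
    if n_prune == 0:
        return list(weights)
    # Threshold = magnitude of the n_prune-th smallest weight.
    threshold = sorted(abs(w) for w in weights)[n_prune - 1]
    return [0.0 if abs(w) <= threshold else w for w in weights]

w = [0.9, -0.05, 0.4, 0.01, -0.7, 0.02]
print(magnitude_prune(w, 0.5))  # -> [0.9, 0.0, 0.4, 0.0, -0.7, 0.0]
```

In practice such pruning is applied to each layer (or globally across layers) and is often followed by a short fine-tuning phase to recover any lost accuracy.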
AdaMix: Mixture-of-Adaptations for Parameter-efficient Model Tuning
Standard fine-tuning of large pre-trained language models (PLMs) for downstream tasks requires updating hundreds of millions to billions of parameters and storing a large copy of the PLM weights for every task, increasing the cost of storing, sharing, and serving the models.
APP: Anytime Progressive Pruning
With the latest advances in deep learning, there has been much focus on the online learning paradigm due to its relevance in practical settings.
L0Learn: A Scalable Package for Sparse Learning using L0 Regularization
We present L0Learn: an open-source package for sparse linear regression and classification using $\ell_0$ regularization.
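To make the $\ell_0$ penalty concrete (this sketch is NOT the L0Learn API, and the function name is invented for illustration): in the orthogonal-design special case, the $\ell_0$-penalized least-squares problem decouples per coordinate into $\min_{\beta_j} \tfrac{1}{2}(y_j - \beta_j)^2 + \lambda\,\mathbb{1}[\beta_j \neq 0]$, whose exact solution is hard thresholding: keep $y_j$ iff $\tfrac{1}{2}y_j^2 > \lambda$, i.e. $|y_j| > \sqrt{2\lambda}$.

```python
import math

def hard_threshold(y, lam):
    """Exact solution of the decoupled l0-penalized problem:
    keep y_j iff |y_j| > sqrt(2 * lam), else set it exactly to zero."""
    t = math.sqrt(2.0 * lam)
    return [yj if abs(yj) > t else 0.0 for yj in y]

print(hard_threshold([3.0, 0.5, -2.0, 0.1], lam=1.0))
# -> [3.0, 0.0, -2.0, 0.0]  (entries below sqrt(2) are zeroed)
```

Unlike the soft thresholding induced by $\ell_1$ (lasso) regularization, hard thresholding does not shrink the surviving coefficients, which is one motivation for $\ell_0$-based sparse learning; packages like L0Learn solve the much harder general (non-orthogonal) case with specialized algorithms.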