Feature Compression

12 papers with code • 0 benchmarks • 0 datasets

Feature compression compresses data for machine consumption rather than for human perception: the goal is to preserve only the information a downstream task needs, not perceptual fidelity.
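This idea can be illustrated with a toy sketch (hypothetical example, not taken from any of the papers below): if the downstream task only depends on the sign of each feature, a 1-bit-per-dimension code discards everything a human viewer would want while leaving the task's output unchanged.

```python
# Toy illustration of task-oriented compression (hypothetical task,
# not from any listed paper): the downstream "classifier" only uses
# feature signs, so a 1-bit code per dimension is lossless *for the task*.

def classify(features):
    """Downstream task: predict 1 if the majority of features are positive."""
    return 1 if sum(1 if f > 0 else -1 for f in features) > 0 else 0

def compress_for_task(features):
    """1-bit-per-dimension code: keeps exactly what the task needs."""
    return [1.0 if f > 0 else -1.0 for f in features]

x = [0.93, -0.17, 2.4, 0.08, -1.1]
# Task output is preserved even though the data is heavily distorted.
assert classify(x) == classify(compress_for_task(x))
```

The same principle underlies the learned codecs in the papers below, where the bit allocation is optimized end-to-end for the task loss instead of hand-picked.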

Most implemented papers

Supervised Compression for Resource-Constrained Edge Computing Systems

yoshitomo-matsubara/supervised-compression 21 Aug 2021

There has been much interest in deploying deep learning algorithms on low-powered devices, including smartphones, drones, and medical sensors.

Context-aware Deep Feature Compression for High-speed Visual Tracking

jongwon20000/TRACA CVPR 2018

We propose a new context-aware correlation filter based tracking framework to achieve both high computational speed and state-of-the-art performance among real-time trackers.

BottleNet++: An End-to-End Approach for Feature Compression in Device-Edge Co-Inference Systems

shaojiawei07/BottleNetPlusPlus 31 Oct 2019

By exploiting the strong sparsity and the fault-tolerant property of the intermediate feature in a deep neural network (DNN), BottleNet++ achieves a much higher compression ratio than existing methods.
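The sparsity argument can be sketched in a few lines. BottleNet++ itself uses a learned lightweight encoder/decoder pair at the split point; the sketch below only illustrates why mostly-zero intermediate activations (e.g., after a ReLU) are cheap to transmit, using plain index/value coding with 8-bit quantization.

```python
# Minimal sketch of the sparsity argument (not BottleNet++'s actual
# learned codec): a ReLU feature map is mostly zeros, so storing
# (index, 8-bit value) pairs for the nonzeros is far smaller than
# sending the dense tensor.

def compress_sparse(feature):
    """Encode a mostly-zero 1-D feature as (scale, [(index, int8), ...])."""
    nz = [(i, v) for i, v in enumerate(feature) if v != 0.0]
    if not nz:
        return 0.0, []
    scale = max(abs(v) for _, v in nz)
    return scale, [(i, round(v / scale * 127)) for i, v in nz]

def decompress_sparse(n, scale, pairs):
    out = [0.0] * n
    for i, q in pairs:
        out[i] = q / 127 * scale
    return out

# A ReLU-style feature map with 97% zeros.
feat = [0.0] * 100
feat[3], feat[40], feat[77] = 1.5, 0.25, 3.0
scale, pairs = compress_sparse(feat)
rec = decompress_sparse(len(feat), scale, pairs)
# Only 3 (index, value) pairs are stored instead of 100 floats,
# and the quantization error is bounded by scale / 254 per entry.
```

Fault tolerance enters the same way: because the downstream layers are robust to small perturbations of these activations, coarse quantization costs little task accuracy.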

Lossy Compression for Lossless Prediction

YannDubs/lossyless NeurIPS 2021

Most data is automatically collected and only ever "seen" by algorithms.

Context-Aware Compilation of DNN Training Pipelines across Edge and Cloud

dixiyao/Context-Aware-Compilation-of-DNN-Training-Pipelines-across-Edge-and-Cloud Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 2021

Experimental results show that our system not only adapts to but also exploits varying contexts, delivering a practical and efficient solution to edge-cloud model training.

SC2: Supervised Compression for Split Computing

yoshitomo-matsubara/sc2-benchmark 16 Mar 2022

Split computing distributes the execution of a neural network (e.g., for a classification task) between a mobile device and a more powerful edge server.
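The execution pattern can be sketched as follows (toy fully connected layers, not the SC2 benchmark's actual models): the layers before the split point run on the device, the rest on the server, and only the intermediate feature crosses the link, which is where supervised compression applies.

```python
# Minimal sketch of split computing with toy layers (hypothetical model,
# not from the SC2 benchmark): device runs the head, server runs the tail,
# and only the intermediate feature is transmitted.

def layer(weights):
    """A toy fully connected layer with ReLU, as a closure over its weights."""
    def apply(xs):
        return [max(0.0, sum(w * x for w, x in zip(row, xs))) for row in weights]
    return apply

layers = [
    layer([[1.0, -1.0], [0.5, 0.5]]),  # runs on the mobile device
    layer([[2.0, 0.0], [0.0, 2.0]]),   # runs on the edge server
]
split = 1  # split point: layers[:split] on-device, layers[split:] on-server

def device_side(x):
    for f in layers[:split]:
        x = f(x)
    return x  # this intermediate feature is what gets (compressed and) sent

def server_side(x):
    for f in layers[split:]:
        x = f(x)
    return x

# End-to-end result matches running the whole network in one place.
assert server_side(device_side([3.0, 1.0])) == layers[1](layers[0]([3.0, 1.0]))
```

Choosing the split point trades device compute against transmitted feature size; supervised compression shrinks that feature with respect to the task loss rather than a reconstruction loss.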

Multi-Agent Collaborative Inference via DNN Decoupling: Intermediate Feature Compression and Edge Learning

Hao840/MAHPPO 24 May 2022

In this paper, we study the multi-agent collaborative inference scenario, where a single edge server coordinates the inference of multiple user equipments (UEs).

Compressing Features for Learning with Noisy Labels

yingyichen-cyy/Nested-Co-teaching 27 Jun 2022

This decomposition provides three insights: (i) it shows that over-fitting is indeed an issue for learning with noisy labels; (ii) through an information bottleneck formulation, it explains why the proposed feature compression helps in combating label noise; (iii) it gives explanations on the performance boost brought by incorporating compression regularization into Co-teaching.

Supervised Feature Compression based on Counterfactual Analysis

ceciliasalvatore/sfcca 17 Nov 2022

Counterfactual explanations are becoming a de facto standard in post-hoc interpretable machine learning.

Efficient Feature Compression for Edge-Cloud Systems

duanzhiihao/edge-cloud-rac 17 Nov 2022

Optimizing computation in an edge-cloud system is an important yet challenging problem.