Dynamic neural networks

15 papers with code • 0 benchmarks • 0 datasets

Dynamic neural networks are adaptable models that can change their structure or parameters during training or inference based on input complexity or computational constraints. They offer benefits like improved efficiency, adaptability, and scalability compared to static architectures.
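The core idea of input-adaptive computation can be illustrated with an early-exit model: a cheap classifier answers easy inputs, and only hard inputs pay for the deeper path. The sketch below is a toy illustration of this principle with random weights, not any specific paper's implementation; the class name, threshold, and architecture are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

class EarlyExitNet:
    """Toy dynamic network: a cheap early classifier exits when its
    confidence clears a threshold; otherwise a deeper, costlier stage runs."""

    def __init__(self, dim, n_classes, threshold=0.9):
        self.threshold = threshold
        self.w_early = rng.normal(size=(dim, n_classes))   # cheap head
        self.w_deep1 = rng.normal(size=(dim, dim))         # expensive stage
        self.w_deep2 = rng.normal(size=(dim, n_classes))

    def forward(self, x):
        early = softmax(x @ self.w_early)
        if early.max() >= self.threshold:
            return early, "early"            # easy input: skip the deep stage
        h = np.tanh(x @ self.w_deep1)        # hard input: run the full model
        return softmax(h @ self.w_deep2), "deep"

net = EarlyExitNet(dim=8, n_classes=3)
probs, path = net.forward(rng.normal(size=8))
print(path, probs.round(3))
```

The per-input branch is what distinguishes dynamic networks from static ones: the compute spent depends on the input, so average inference cost can be traded against accuracy by tuning the threshold.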

Most implemented papers

Nonlinear Systems Identification Using Deep Dynamic Neural Networks

lakehanne/FARNN 5 Oct 2016

Neural networks are known to be effective function approximators.

AMPNet: Asynchronous Model-Parallel Training for Dynamic Neural Networks

facebookresearch/fairscale ICLR 2018

Through an implementation on multi-core CPUs, we show that AMP training converges to the same accuracy as conventional synchronous training algorithms in a similar number of epochs, but utilizes the available hardware more efficiently even for small minibatch sizes, resulting in significantly shorter overall training times.

Dynamic Dual Gating Neural Networks

lfr-0531/dgnet ICCV 2021

In particular, dynamic dual gating can provide 59.7% saving in computing of ResNet50 with 76.41% top-1 accuracy on ImageNet, which has advanced the state-of-the-art.
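The general mechanism behind such gating can be sketched as a tiny gate that decides, per input, which channels of a feature map to compute; skipped channels contribute zero and save their share of FLOPs. This is a hedged toy sketch of channel gating in general, not the DGNet architecture; the gate weights, pooling choice, and threshold are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def channel_gate(features, gate_w, tau=0.0):
    """Toy channel gating: a small gate predicts which channels to run.
    Masked channels are skipped, saving their share of the computation."""
    pooled = features.mean(axis=(1, 2))            # (C,) global average pool
    scores = pooled @ gate_w                       # gate logit per channel
    mask = (scores > tau).astype(features.dtype)   # hard on/off decisions
    kept = mask.sum() / mask.size                  # fraction of channels run
    return features * mask[:, None, None], kept

x = rng.normal(size=(16, 8, 8))      # (C, H, W) feature map
w = rng.normal(size=(16, 16))        # hypothetical gate weights
gated, kept = channel_gate(x, w)
print(f"channels kept: {kept:.2%}")
```

Because the mask depends on the input's own pooled statistics, different inputs execute different channel subsets, which is the source of the per-input compute savings.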

Learning Task-Oriented Communication for Edge Inference: An Information Bottleneck Approach

shaojiawei07/VL-VFE 8 Feb 2021

Extensive experiments demonstrate that the proposed task-oriented communication system achieves a better rate-distortion tradeoff than baseline methods and significantly reduces the feature transmission latency in dynamic channel conditions.

DynaShare: Dynamic Neural Networks for Multi-Task Learning

BorealisAI/DynaShare-MTL 29 Sep 2021

Parameter sharing approaches for deep multi-task learning share a common intuition: for a single network to perform multiple prediction tasks, the network needs to support multiple specialized execution paths.

Temporal Domain Generalization with Drift-Aware Dynamic Neural Networks

baithebest/drain 21 May 2022

Temporal domain generalization is a promising yet extremely challenging area whose goal is to learn models under temporally changing data distributions and generalize to unseen distributions that follow the trend of the change.

SATBench: Benchmarking the speed-accuracy tradeoff in object recognition by humans and dynamic neural networks

ajaysub110/satbench 16 Jun 2022

Using FLOPs as an analog for reaction time, we compare networks with humans on curve-fit error, category-wise correlation, and curve steepness, and conclude that cascaded dynamic neural networks are a promising model of human reaction time in object recognition tasks.

Boosted Dynamic Neural Networks

SHI-Labs/Boosted-Dynamic-Networks 30 Nov 2022

To optimize the model, these prediction heads together with the network backbone are trained on every batch of training data.
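The boosting view of multi-exit training can be illustrated with a toy regression analogue: each "exit" is a weak predictor fitted to the residual left by the sum of earlier exits, so early exits answer coarsely and later ones refine. This is a simplified sketch of the boosting principle in general, not the paper's training procedure; restricting each exit to a single feature is a hypothetical way to make it weak.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy data: a linear target with noise over four features.
x = rng.normal(size=(200, 4))
y = x @ np.array([1.0, -2.0, 0.5, 3.0]) + 0.1 * rng.normal(size=200)

exits, residual, errors = [], y.copy(), []
for j in range(3):                          # exit j only sees feature j (weak)
    xj = x[:, [j]]
    w, *_ = np.linalg.lstsq(xj, residual, rcond=None)
    exits.append((j, w))
    residual = residual - xj @ w            # later exits fit what is left
    errors.append(float((residual ** 2).mean()))

print([round(e, 3) for e in errors])        # training error shrinks per exit
```

Each least-squares fit projects out part of the remaining residual, so the training error is non-increasing across exits, mirroring how later exits in a boosted dynamic network refine the predictions of earlier ones.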

HADAS: Hardware-Aware Dynamic Neural Architecture Search for Edge Performance Scaling

halimabouzidi/hadas 6 Dec 2022

Dynamic neural networks (DyNNs) have become viable techniques to enable intelligence on resource-constrained edge devices while maintaining computational efficiency.

Fixing Overconfidence in Dynamic Neural Networks

aaltoml/calibrated-dnn 13 Feb 2023

Dynamic neural networks are a recent technique that promises a remedy for the increasing size of modern deep learning models by dynamically adapting their computational cost to the difficulty of the inputs.