Gradient Checkpointing is a method for reducing the memory footprint when training deep neural networks, at the cost of a small increase in computation time. Instead of storing every intermediate activation for the backward pass, only a subset of activations is kept, and the missing ones are recomputed during backpropagation; the source paper shows that checkpointing roughly every sqrt(n)-th layer of an n-layer network reduces activation memory from O(n) to O(sqrt(n)) at the cost of about one extra forward pass.
Source: Training Deep Nets with Sublinear Memory Cost
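The page itself includes no code, but as a minimal sketch of the idea, assuming PyTorch's `torch.utils.checkpoint.checkpoint` API (the model class and sizes below are purely illustrative, not from the source paper):

```python
import torch
from torch.utils.checkpoint import checkpoint

class CheckpointedMLP(torch.nn.Module):
    """Illustrative stack of blocks, each wrapped in a checkpoint."""

    def __init__(self, width=1024, depth=8):
        super().__init__()
        self.blocks = torch.nn.ModuleList(
            torch.nn.Sequential(torch.nn.Linear(width, width), torch.nn.ReLU())
            for _ in range(depth)
        )

    def forward(self, x):
        for block in self.blocks:
            # Activations inside `block` are not stored for the backward
            # pass; they are recomputed on the fly, trading compute for memory.
            x = checkpoint(block, x, use_reentrant=False)
        return x

model = CheckpointedMLP()
x = torch.randn(32, 1024, requires_grad=True)
model(x).sum().backward()  # backward re-runs each block's forward
```

Here every block is checkpointed for simplicity; checkpointing only every sqrt(n)-th block instead recovers the paper's O(sqrt(n)) memory bound with less recomputation.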
| Task | Papers | Share |
|---|---|---|
| Image Classification | 2 | 11.76% |
| Mamba | 1 | 5.88% |
| Video Understanding | 1 | 5.88% |
| Computational Efficiency | 1 | 5.88% |
| Music Generation | 1 | 5.88% |
| Diversity | 1 | 5.88% |
| Image Captioning | 1 | 5.88% |
| Machine Translation | 1 | 5.88% |
| MRI Reconstruction | 1 | 5.88% |