Dynamic sparse training methods train neural networks sparsely from the start: training begins with an initial sparse connectivity mask, and the mask is periodically updated during training according to some criterion (for example, pruning low-magnitude weights and regrowing new connections).
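As a concrete illustration of the prune-and-regrow mask update, here is a minimal NumPy sketch in the style of Sparse Evolutionary Training (SET). The function name `update_mask`, the `prune_frac` parameter, and the uniform-random regrowth rule are illustrative choices, not the exact procedure of any specific paper; real implementations differ in how they score weights and select regrowth sites.

```python
import numpy as np

def update_mask(weights, mask, prune_frac=0.3, rng=None):
    """One SET-style prune-and-regrow step (illustrative sketch).

    Drops the `prune_frac` fraction of active connections with the
    smallest weight magnitude, then regrows the same number of
    connections at random among the inactive positions, so the
    overall sparsity level stays constant.
    """
    rng = rng or np.random.default_rng(0)
    flat_mask = mask.ravel().copy()
    active = np.flatnonzero(flat_mask)
    n_prune = int(prune_frac * active.size)
    # Prune: deactivate the active weights with the smallest magnitude.
    mags = np.abs(weights.ravel()[active])
    pruned = active[np.argsort(mags)[:n_prune]]
    flat_mask[pruned] = 0
    # Regrow: activate an equal number of currently inactive
    # connections, chosen uniformly at random (SET's regrowth rule).
    inactive = np.flatnonzero(flat_mask == 0)
    regrown = rng.choice(inactive, size=n_prune, replace=False)
    flat_mask[regrown] = 1
    return flat_mask.reshape(mask.shape)

# Toy layer: an 8x8 weight matrix at roughly 25% density.
rng = np.random.default_rng(42)
w = rng.normal(size=(8, 8))
m = (rng.random((8, 8)) < 0.25).astype(int)
m2 = update_mask(w, m, prune_frac=0.3, rng=rng)
assert m2.sum() == m.sum()  # sparsity level is preserved
```

In a training loop, this update would be applied every few hundred steps, with newly regrown weights initialized to zero and then trained like any other active connection.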
Source: Scalable Training of Artificial Neural Networks with Adaptive Sparse Connectivity inspired by Network Science
| Task | Papers | Share |
|---|---|---|
| Dialogue State Tracking | 124 | 32.72% |
| Task-Oriented Dialogue Systems | 30 | 7.92% |
| Language Modelling | 16 | 4.22% |
| Decoder | 12 | 3.17% |
| Question Answering | 9 | 2.37% |
| Slot Filling | 8 | 2.11% |
| In-Context Learning | 7 | 1.85% |
| Reading Comprehension | 7 | 1.85% |
| Response Generation | 6 | 1.58% |