Galactica is a language model that uses a Transformer architecture in a decoder-only setup, with the following modifications (per the Galactica paper):

  * **GeLU activations** in place of ReLU
  * **Context window** of 2,048 tokens
  * **No bias terms** in the dense kernels or layer norms
  * **Learned positional embeddings**
  * **A 50k-token BPE vocabulary**

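The defining piece of a decoder-only setup is causal self-attention: each position may attend only to itself and earlier positions. A minimal single-head NumPy sketch of this mechanism (illustrative only, with made-up dimensions and random weights; not Galactica's actual multi-head implementation):

```python
import numpy as np

def causal_self_attention(x, Wq, Wk, Wv):
    """Single-head scaled dot-product attention with a causal mask.

    x: (seq_len, d_model) token representations
    Wq, Wk, Wv: (d_model, d_model) projection matrices (hypothetical sizes)
    """
    T, d = x.shape
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(d)
    # Mask out future positions so each token sees only itself and the past.
    mask = np.triu(np.ones((T, T), dtype=bool), k=1)
    scores[mask] = -np.inf
    # Row-wise softmax over the unmasked scores.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

rng = np.random.default_rng(0)
d = 8
x = rng.normal(size=(4, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out = causal_self_attention(x, Wq, Wk, Wv)
print(out.shape)  # (4, 8)
```

Because of the mask, the first token's output is exactly its own value vector; a full decoder block would wrap this in multi-head projections, residual connections, layer norms, and a GeLU feed-forward layer.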
Task | Papers | Share |
---|---|---|
Question Answering | 5 | 13.51% |
Language Modelling | 3 | 8.11% |
Large Language Model | 2 | 5.41% |
Domain Adaptation | 2 | 5.41% |
Common Sense Reasoning | 2 | 5.41% |
Fairness | 2 | 5.41% |
Document Classification | 1 | 2.70% |
General Knowledge | 1 | 2.70% |
Philosophy | 1 | 2.70% |