Galactica is a language model that uses a Transformer architecture in a decoder-only setup, with the following modifications (a minimal sketch of these choices follows the list):

- **GeLU Activation**: GeLU activations are used for all model sizes.
- **Context Window**: a 2048-token context window is used for all model sizes.
- **No Biases**: following PaLM, no biases are used in any of the dense kernels or layer norms.
- **Learned Positional Embeddings**: learned positional embeddings are used.
- **Vocabulary**: a vocabulary of 50k tokens is constructed using BPE.
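The list above maps onto a handful of concrete configuration choices. Below is a minimal, hypothetical PyTorch sketch of a decoder-only model with these modifications; the class names, layer count, and hidden sizes are illustrative assumptions, not the released implementation (note that `bias=False` on `nn.LayerNorm` requires PyTorch ≥ 2.1).

```python
import torch
import torch.nn as nn

class DecoderLayer(nn.Module):
    """One illustrative decoder-only block: pre-LayerNorm, no biases, GeLU MLP."""

    def __init__(self, d_model: int = 768, n_heads: int = 12):
        super().__init__()
        # No biases in any dense kernels or layer norms, following the paper.
        self.ln_1 = nn.LayerNorm(d_model, bias=False)
        self.attn = nn.MultiheadAttention(d_model, n_heads, bias=False, batch_first=True)
        self.ln_2 = nn.LayerNorm(d_model, bias=False)
        self.mlp = nn.Sequential(
            nn.Linear(d_model, 4 * d_model, bias=False),
            nn.GELU(),  # GeLU activation for all model sizes
            nn.Linear(4 * d_model, d_model, bias=False),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Causal mask: True entries (future positions) are blocked from attention.
        n = x.size(1)
        mask = torch.triu(torch.ones(n, n, dtype=torch.bool, device=x.device), diagonal=1)
        h = self.ln_1(x)
        attn_out, _ = self.attn(h, h, h, attn_mask=mask, need_weights=False)
        x = x + attn_out
        return x + self.mlp(self.ln_2(x))

class DecoderOnlyLM(nn.Module):
    """Decoder-only LM with learned positional embeddings and a 50k vocabulary."""

    def __init__(self, vocab_size: int = 50_000, max_len: int = 2048,
                 d_model: int = 768, n_layers: int = 12, n_heads: int = 12):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, d_model)  # 50k BPE vocabulary
        self.pos_emb = nn.Embedding(max_len, d_model)     # learned positions, 2048 context
        self.layers = nn.ModuleList([DecoderLayer(d_model, n_heads) for _ in range(n_layers)])
        self.ln_f = nn.LayerNorm(d_model, bias=False)
        self.head = nn.Linear(d_model, vocab_size, bias=False)

    def forward(self, ids: torch.Tensor) -> torch.Tensor:
        pos = torch.arange(ids.size(1), device=ids.device)
        x = self.tok_emb(ids) + self.pos_emb(pos)
        for layer in self.layers:
            x = layer(x)
        return self.head(self.ln_f(x))

model = DecoderOnlyLM()
ids = torch.randint(0, 50_000, (1, 16))  # dummy batch of token ids
logits = model(ids)                      # shape: (1, 16, 50000)
```

The released Galactica checkpoints themselves are distributed through the Hugging Face Hub (e.g. facebook/galactica-125m) and load with the standard transformers OPT classes, so this sketch serves only to illustrate the architectural choices.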
| Task | Papers | Share |
|---|---|---|
| Question Answering | 5 | 11.36% |
| Language Modeling | 3 | 6.82% |
| Language Modelling | 3 | 6.82% |
| Large Language Model | 2 | 4.55% |
| Domain Adaptation | 2 | 4.55% |
| Common Sense Reasoning | 2 | 4.55% |
| Fairness | 2 | 4.55% |
| Document Classification | 1 | 2.27% |
| General Knowledge | 1 | 2.27% |