Language Models

BLOOM is a decoder-only Transformer language model that was trained on the ROOTS corpus, a dataset comprising hundreds of sources in 46 natural and 13 programming languages (59 in total).

Source: BLOOM: A 176B-Parameter Open-Access Multilingual Language Model
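
Since BLOOM's weights are openly released, the model can be loaded with standard causal-LM tooling. The sketch below is an illustration, not part of the original release: it assumes the Hugging Face transformers library is installed and uses the smaller bigscience/bloom-560m checkpoint from the same family (swap in bigscience/bloom for the full 176B model if you can host it).

```python
# Minimal sketch: text generation with a BLOOM checkpoint via Hugging Face
# transformers (assumed installed). "bigscience/bloom-560m" is a smaller
# member of the BLOOM family used here for convenience.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "bigscience/bloom-560m"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# BLOOM is decoder-only, so generation is plain left-to-right decoding.
prompt = "BLOOM is a multilingual language model that"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```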

Tasks

Task | Papers | Share
Language Modelling | 15 | 10.27%
Question Answering | 8 | 5.48%
Machine Translation | 7 | 4.79%
Text Generation | 6 | 4.11%
Translation | 6 | 4.11%
Quantization | 5 | 3.42%
Large Language Model | 4 | 2.74%
Retrieval | 3 | 2.05%
Decoder | 3 | 2.05%

Categories