dbo:abstract
|
- BigScience Large Open-science Open-access Multilingual Language Model (BLOOM) is a transformer-based language model. It was created by over 1000 AI researchers to provide a free large language model for everyone who wants to try it. With around 176 billion parameters, trained from March through July 2022, it is considered an alternative to OpenAI's GPT-3, which has 175 billion parameters. BLOOM uses a decoder-only transformer architecture modified from Megatron-LM GPT-2. The BLOOM project was started by a co-founder of Hugging Face. Six main groups were involved: Hugging Face's BigScience team, the Microsoft DeepSpeed team, the NVIDIA Megatron-LM team, the IDRIS/GENCI team, the PyTorch team, and the volunteers in the BigScience Engineering workgroup. (en)
|
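Since the abstract describes BLOOM as a free, openly accessible decoder-only model, a minimal sketch of loading and querying it through the Hugging Face transformers library is given below. The checkpoint name "bigscience/bloom-560m" (a small BLOOM variant chosen to keep the example lightweight) and the generation settings are assumptions for illustration, not data from this page.

    # Minimal sketch: text generation with a BLOOM checkpoint via Hugging Face transformers.
    # Assumes the `transformers` and `torch` packages are installed and that the
    # "bigscience/bloom-560m" checkpoint (a small BLOOM variant) is available on the Hub.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_name = "bigscience/bloom-560m"  # the full 176B model is published as "bigscience/bloom"
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(model_name)

    # BLOOM is a decoder-only (autoregressive) model, so generation proceeds
    # token by token from a text prompt.
    inputs = tokenizer("BLOOM is a multilingual language model that", return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=40, do_sample=False)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))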
dbo:wikiPageID
| |
dbo:wikiPageLength
|
- 1419 (xsd:nonNegativeInteger)
|
dbo:wikiPageRevisionID
| |
dbo:wikiPageWikiLink
| |
dbp:wikiPageUsesTemplate
| |
dcterms:subject
| |
rdfs:comment
|
- BigScience Large Open-science Open-access Multilingual Language Model (BLOOM) is a transformer-based language model. It was created by over 1000 AI researchers to provide a free large language model for everyone who wants to try it. With around 176 billion parameters, trained from March through July 2022, it is considered an alternative to OpenAI's GPT-3, which has 175 billion parameters. BLOOM uses a decoder-only transformer architecture modified from Megatron-LM GPT-2. (en)
|
rdfs:label
|
- BLOOM (language model) (en)
|
owl:sameAs
| |
prov:wasDerivedFrom
| |
foaf:isPrimaryTopicOf
| |
is dbo:wikiPageWikiLink
of | |
is foaf:primaryTopic
of | |