About: BLOOM (language model)

An entity of type owl:Thing, within the dbpedia.org data space, associated with source document:
http://dbpedia.org/describe/?url=http%3A%2F%2Fdbpedia.org%2Fresource%2FBLOOM_%28language_model%29

The BigScience Large Open-science Open-access Multilingual Language Model (BLOOM) is a transformer-based language model. It was created by over 1,000 AI researchers to provide a free large language model for anyone who wants to use one. With 176 billion parameters, trained from March through July 2022, it is considered an alternative to OpenAI's GPT-3, which has 175 billion parameters. BLOOM uses a decoder-only transformer architecture modified from Megatron-LM GPT-2. The BLOOM project was started by a co-founder of Hugging Face. Six main groups were involved: Hugging Face's BigScience team, the Microsoft DeepSpeed team, the NVIDIA Megatron-LM team, the IDRIS/GENCI team, the PyTorch team, and the volunteers of the BigScience engineering working group.
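BLOOM's weights are published openly on the Hugging Face Hub. As a minimal sketch of trying the model with the Hugging Face transformers library (the bigscience/bloom-560m checkpoint used here is a small published variant chosen for illustration; the full 176-billion-parameter bigscience/bloom checkpoint requires several hundred GB of memory):

    # Minimal sketch: load a small BLOOM variant and generate text.
    # Requires: pip install transformers torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("bigscience/bloom-560m")
    model = AutoModelForCausalLM.from_pretrained("bigscience/bloom-560m")

    # Tokenize a prompt and decode a short greedy continuation.
    inputs = tokenizer("BLOOM is a multilingual language model that", return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=30)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))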

Attributes / Values
rdfs:label
  • BLOOM (language model) (en)
dcterms:subject
Wikipage page ID
Wikipage revision ID
Link from a Wikipage to another Wikipage
sameAs
dbp:wikiPageUsesTemplate
has abstract
prov:wasDerivedFrom
page length (characters) of wiki page
foaf:isPrimaryTopicOf
is Link from a Wikipage to another Wikipage of
is foaf:primaryTopic of
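The attribute-value pairs above are also served as machine-readable RDF. As a minimal sketch of retrieving this resource's English abstract from DBpedia's public SPARQL endpoint (the query uses DBpedia's dbo:abstract property; the requests call and result handling are illustrative):

    # Minimal sketch: query DBpedia's SPARQL endpoint for the English abstract.
    # Requires: pip install requests
    import requests

    query = """
    PREFIX dbo: <http://dbpedia.org/ontology/>
    SELECT ?abstract WHERE {
      <http://dbpedia.org/resource/BLOOM_(language_model)> dbo:abstract ?abstract .
      FILTER (lang(?abstract) = "en")
    }
    """
    resp = requests.get(
        "https://dbpedia.org/sparql",
        params={"query": query, "format": "application/sparql-results+json"},
        timeout=30,
    )
    resp.raise_for_status()
    for row in resp.json()["results"]["bindings"]:
        print(row["abstract"]["value"])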