About: DeepSpeed

An Entity of Type: software, from Named Graph: http://dbpedia.org, within Data Space: dbpedia.org

DeepSpeed is an open-source deep learning optimization library for PyTorch. The library is designed to reduce computing power and memory use and to train large distributed models with better parallelism on existing computer hardware. DeepSpeed is optimized for low-latency, high-throughput training. It includes the Zero Redundancy Optimizer (ZeRO) for training models with 1 trillion or more parameters. Features include mixed-precision training; single-GPU, multi-GPU, and multi-node training; and custom model parallelism. The DeepSpeed source code is licensed under the MIT License and is available on GitHub.
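The features named above (ZeRO, mixed-precision training, multi-GPU training) are typically enabled through a JSON-style configuration passed to the library's initializer. A minimal sketch, assuming the documented `deepspeed.initialize` entry point and the `fp16`/`zero_optimization` config keys from DeepSpeed's JSON schema; the specific values are illustrative, not a recommended setup:

```python
# Illustrative DeepSpeed configuration (sketch, not a tuned setup).
# The keys below (train_batch_size, fp16, zero_optimization) follow
# DeepSpeed's documented JSON config schema.
ds_config = {
    "train_batch_size": 16,
    "fp16": {"enabled": True},          # mixed-precision training
    "zero_optimization": {"stage": 2},  # ZeRO: partition optimizer state and gradients
}

def build_engine(model):
    """Wrap a plain PyTorch model in a DeepSpeed engine.

    Hypothetical helper: assumes the `deepspeed` package (and a
    CUDA-capable environment) is available at call time.
    """
    import deepspeed  # imported lazily so the config above stands alone

    engine, optimizer, _, _ = deepspeed.initialize(
        model=model,
        model_parameters=model.parameters(),
        config=ds_config,
    )
    return engine, optimizer
```

Training then proceeds through the returned engine (e.g. `engine.backward(loss)` and `engine.step()` in place of the usual PyTorch calls), which is how the library applies ZeRO partitioning and mixed precision without changes to the model definition.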

Property Value
dbo:abstract
  • DeepSpeed is an open-source deep learning optimization library for PyTorch. The library is designed to reduce computing power and memory use and to train large distributed models with better parallelism on existing computer hardware. DeepSpeed is optimized for low-latency, high-throughput training. It includes the Zero Redundancy Optimizer (ZeRO) for training models with 1 trillion or more parameters. Features include mixed-precision training; single-GPU, multi-GPU, and multi-node training; and custom model parallelism. The DeepSpeed source code is licensed under the MIT License and is available on GitHub. The team claimed to achieve up to a 6.2x throughput improvement, 2.8x faster convergence, and 4.6x less communication. (en)
dbo:author
dbo:developer
dbo:genre
dbo:latestReleaseDate
  • 2022-08-01 (xsd:date)
dbo:latestReleaseVersion
  • v0.7.0
dbo:license
dbo:programmingLanguage
dbo:releaseDate
  • 2020-05-18 (xsd:date)
dbo:thumbnail
dbo:wikiPageExternalLink
dbo:wikiPageID
  • 64396232 (xsd:integer)
dbo:wikiPageLength
  • 4228 (xsd:nonNegativeInteger)
dbo:wikiPageRevisionID
  • 1109254514 (xsd:integer)
dbo:wikiPageWikiLink
dbp:author
dbp:developer
dbp:genre
dbp:latestReleaseDate
  • 2022-08-01 (xsd:date)
dbp:latestReleaseVersion
  • v0.7.0 (en)
dbp:license
dbp:logo
  • DeepSpeed logo.svg (en)
dbp:name
  • DeepSpeed (en)
dbp:programmingLanguage
dbp:released
  • 2020-05-18 (xsd:date)
dbp:repo
dbp:wikiPageUsesTemplate
dct:subject
rdf:type
rdfs:comment
  • DeepSpeed is an open-source deep learning optimization library for PyTorch. The library is designed to reduce computing power and memory use and to train large distributed models with better parallelism on existing computer hardware. DeepSpeed is optimized for low-latency, high-throughput training. It includes the Zero Redundancy Optimizer (ZeRO) for training models with 1 trillion or more parameters. Features include mixed-precision training; single-GPU, multi-GPU, and multi-node training; and custom model parallelism. The DeepSpeed source code is licensed under the MIT License and is available on GitHub. (en)
rdfs:label
  • DeepSpeed (en)
owl:sameAs
prov:wasDerivedFrom
foaf:depiction
foaf:isPrimaryTopicOf
foaf:name
  • DeepSpeed (en)
is dbo:wikiPageWikiLink of
is foaf:primaryTopic of
This content was extracted from Wikipedia and is licensed under the Creative Commons Attribution-ShareAlike 3.0 Unported License