Mixture of experts (MoE) is a machine learning technique in which multiple expert networks (learners) are used to divide a problem space into homogeneous regions. It differs from ensemble techniques in that typically only one, or a few, expert models are run for each input, rather than combining the results of all models. An example from computer vision is combining one neural network model for human detection with another for pose estimation.
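To make the routing idea concrete, here is a minimal sketch of sparse MoE inference in Python/NumPy, assuming a softmax gating network and simple linear experts; the names, shapes, and `top_k` selection are illustrative assumptions, not taken from any particular library or from the article.

```python
# Minimal sketch of sparse mixture-of-experts routing (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
n_experts, d_in, d_out, top_k = 4, 8, 3, 2

# Each "expert" is its own learner; here just a linear map for brevity.
expert_weights = [rng.normal(size=(d_in, d_out)) for _ in range(n_experts)]

# The gating network scores how relevant each expert is to a given input.
gate_weights = rng.normal(size=(d_in, n_experts))

def moe_forward(x: np.ndarray) -> np.ndarray:
    """Route x through only the top-k experts, weighted by gate scores."""
    logits = x @ gate_weights                # one score per expert
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()                     # softmax gate
    top = np.argsort(probs)[-top_k:]         # indices of the k best experts
    # Only the selected experts are evaluated; the rest are skipped,
    # which is what distinguishes MoE from a full ensemble.
    out = np.zeros(d_out)
    for i in top:
        out += probs[i] * (x @ expert_weights[i])
    return out / probs[top].sum()            # renormalize over chosen experts

x = rng.normal(size=d_in)
print(moe_forward(x))
```

Setting `top_k = 1` would recover hard routing (exactly one expert per input), while evaluating all experts and summing their gated outputs would reduce this to an ordinary ensemble.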