An Entity of Type: Thing, from Named Graph: http://dbpedia.org, within Data Space: dbpedia.org

The entropic vector or entropic function is a concept arising in information theory. It represents the possible values of Shannon's information entropy that subsets of one set of random variables may take. Understanding which vectors are entropic is a way to represent all possible inequalities between entropies of various subsets. For example, for any two random variables X₁, X₂, their joint entropy H(X₁, X₂) (the entropy of the random variable representing the pair (X₁, X₂)) is at most the sum of the entropies of X₁ and of X₂:

H(X₁, X₂) ≤ H(X₁) + H(X₂)
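A quick numeric check of this subadditivity inequality, as a minimal Python sketch (not part of the DBpedia record; the joint distribution p_xy below is a made-up example):

import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector p."""
    p = p[p > 0]                     # drop zero-probability outcomes
    return -np.sum(p * np.log2(p))

# Hypothetical joint distribution of (X1, X2): rows index x1, columns x2.
p_xy = np.array([[0.25, 0.25],
                 [0.25, 0.25]])      # two independent fair bits

H_12 = entropy(p_xy.flatten())       # joint entropy H(X1, X2) = 2 bits
H_1  = entropy(p_xy.sum(axis=1))     # marginal entropy H(X1) = 1 bit
H_2  = entropy(p_xy.sum(axis=0))     # marginal entropy H(X2) = 1 bit

assert H_12 <= H_1 + H_2 + 1e-12     # subadditivity: H(X1,X2) <= H(X1) + H(X2)

Equality holds here because X₁ and X₂ are independent; any dependence between them makes the inequality strict.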

Property Value
dbo:abstract
  • The entropic vector or entropic function is a concept arising in information theory. It represents the possible values of Shannon's information entropy that subsets of one set of random variables may take. Understanding which vectors are entropic is a way to represent all possible inequalities between entropies of various subsets. For example, for any two random variables X₁, X₂, their joint entropy H(X₁, X₂) (the entropy of the random variable representing the pair (X₁, X₂)) is at most the sum of the entropies of X₁ and of X₂: H(X₁, X₂) ≤ H(X₁) + H(X₂). Other information-theoretic measures such as conditional information, mutual information, or total correlation can be expressed in terms of joint entropy and are thus related by the corresponding inequalities. Many inequalities satisfied by entropic vectors can be derived as linear combinations of a few basic ones, called Shannon-type inequalities. However, it has been proven that already for n = 4 variables, no finite set of linear inequalities is sufficient to characterize all entropic vectors. (en)
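Concretely, the entropic vector of n random variables collects the 2ⁿ − 1 subset entropies H(X_S) for every nonempty S ⊆ {1, …, n}. The Python sketch below computes this vector from a joint distribution; the helper entropic_vector and the XOR example are illustrative assumptions, not part of the DBpedia record:

import itertools
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector p."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def entropic_vector(joint):
    """joint: n-dimensional array with joint[i1, ..., in] = P(X1=i1, ..., Xn=in).
    Returns {subset of variable indices: H(X_subset)} for all nonempty subsets."""
    n = joint.ndim
    vec = {}
    for r in range(1, n + 1):
        for subset in itertools.combinations(range(n), r):
            # Marginalize out the variables NOT in `subset`.
            other = tuple(ax for ax in range(n) if ax not in subset)
            marginal = joint.sum(axis=other) if other else joint
            vec[subset] = entropy(marginal.flatten())
    return vec

# Classic example: X1, X2 independent fair bits, X3 = X1 XOR X2.
joint = np.zeros((2, 2, 2))
for x1 in (0, 1):
    for x2 in (0, 1):
        joint[x1, x2, x1 ^ x2] = 0.25

for subset, h in sorted(entropic_vector(joint).items(),
                        key=lambda kv: (len(kv[0]), kv[0])):
    print(subset, round(h, 3))
# Each single variable has entropy 1; every pair and the full triple have
# entropy 2, so (1, 1, 1, 2, 2, 2, 2) is an entropic vector for n = 3.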
dbo:wikiPageID
  • 17519721 (xsd:integer)
dbo:wikiPageLength
  • 14119 (xsd:nonNegativeInteger)
dbo:wikiPageRevisionID
  • 1103920183 (xsd:integer)
dbo:wikiPageWikiLink
dbp:wikiPageUsesTemplate
dcterms:subject
gold:hypernym
rdfs:comment
  • The entropic vector or entropic function is a concept arising in information theory. It represents the possible values of Shannon's information entropy that subsets of one set of random variables may take. Understanding which vectors are entropic is a way to represent all possible inequalities between entropies of various subsets. For example, for any two random variables X₁, X₂, their joint entropy H(X₁, X₂) (the entropy of the random variable representing the pair (X₁, X₂)) is at most the sum of the entropies of X₁ and of X₂: H(X₁, X₂) ≤ H(X₁) + H(X₂). (en)
rdfs:label
  • Entropic vector (en)
owl:sameAs
prov:wasDerivedFrom
foaf:isPrimaryTopicOf
is dbo:wikiPageRedirects of
is dbo:wikiPageWikiLink of
is foaf:primaryTopic of
This content was extracted from Wikipedia and is licensed under the Creative Commons Attribution-ShareAlike 3.0 Unported License