About: Entropic vector

An Entity of Type: owl:Thing, within Data Space: dbpedia.org, associated with source document(s):
http://dbpedia.org/describe/?url=http%3A%2F%2Fdbpedia.org%2Fresource%2FEntropic_vector

The entropic vector or entropic function is a concept arising in information theory. It represents the possible values of Shannon's information entropy that subsets of one set of random variables may take. Understanding which vectors are entropic is a way to represent all possible inequalities between entropies of various subsets. For example, for any two random variables X₁, X₂, their joint entropy H(X₁, X₂) (the entropy of the random variable representing the pair (X₁, X₂)) is at most the sum of the entropies of X₁ and of X₂: H(X₁, X₂) ≤ H(X₁) + H(X₂).
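To make the definition concrete, here is a minimal sketch (not part of the DBpedia record) that computes the entropic vector of a finite joint distribution and checks the subadditivity inequality above; the function names, variable names, and example distribution are all illustrative assumptions:

    # Illustrative sketch: entropic vector of a joint pmf over n variables.
    import itertools
    import math

    def entropy(pmf):
        """Shannon entropy in bits of a pmf given as {outcome: probability}."""
        return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

    def marginal(joint, keep):
        """Marginalize a joint pmf over tuple outcomes, keeping coordinates in `keep`."""
        out = {}
        for outcome, p in joint.items():
            key = tuple(outcome[i] for i in keep)
            out[key] = out.get(key, 0.0) + p
        return out

    def entropic_vector(joint, n):
        """Return {S: H(X_S)} for every nonempty subset S of the n variables."""
        vec = {}
        for r in range(1, n + 1):
            for subset in itertools.combinations(range(n), r):
                vec[subset] = entropy(marginal(joint, subset))
        return vec

    # Example: two correlated binary variables.
    joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
    v = entropic_vector(joint, 2)
    # Subadditivity, the inequality quoted above (tolerance for float rounding):
    assert v[(0, 1)] <= v[(0,)] + v[(1,)] + 1e-12

For this distribution the vector (H(X₁), H(X₂), H(X₁, X₂)) is approximately (1, 1, 1.72), so subadditivity holds with slack because the two variables are correlated; independent variables would give equality.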

Attributes / Values
rdfs:label
  • Entropic vector (en)
dcterms:subject
Wikipage page ID
Wikipage revision ID
Link from a Wikipage to another Wikipage
sameAs
dbp:wikiPageUsesTemplate
has abstract
  • The entropic vector or entropic function is a concept arising in information theory. It represents the possible values of Shannon's information entropy that subsets of one set of random variables may take. Understanding which vectors are entropic is a way to represent all possible inequalities between entropies of various subsets. For example, for any two random variables X₁, X₂, their joint entropy H(X₁, X₂) (the entropy of the random variable representing the pair (X₁, X₂)) is at most the sum of the entropies of X₁ and of X₂: H(X₁, X₂) ≤ H(X₁) + H(X₂). Other information-theoretic measures such as conditional information, mutual information, or total correlation can be expressed in terms of joint entropy and are thus related by the corresponding inequalities. Many inequalities satisfied by entropic vectors can be derived as linear combinations of a few basic ones, called Shannon-type inequalities. However, it has been proven that already for four variables, no finite set of linear inequalities is sufficient to characterize all entropic vectors. (en)
gold:hypernym
prov:wasDerivedFrom
page length (characters) of wiki page
foaf:isPrimaryTopicOf
is Link from a Wikipage to another Wikipage of
is Wikipage redirect of
is foaf:primaryTopic of
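As the abstract notes, other information-theoretic measures reduce to joint entropies. For completeness, the standard identities (general information-theory facts, not taken from this record) in the same notation as above are:

    I(X₁; X₂) = H(X₁) + H(X₂) − H(X₁, X₂)
    H(X₁ | X₂) = H(X₁, X₂) − H(X₂)

Subadditivity is therefore equivalent to I(X₁; X₂) ≥ 0, the simplest Shannon-type inequality.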