An Entity of Type: Thing, from Named Graph: http://dbpedia.org, within Data Space: dbpedia.org

The chain rule for Kolmogorov complexity is an analogue of the chain rule for information entropy, which states: H(X,Y) = H(X) + H(Y|X). That is, the combined randomness of two sequences X and Y is the sum of the randomness of X plus whatever randomness is left in Y once we know X. This follows immediately from the definitions of conditional and joint entropy, and the fact from probability theory that the joint probability is the product of the marginal and conditional probability: P(X,Y) = P(X) P(Y|X). The equivalent statement for Kolmogorov complexity does not hold exactly; it is true only up to a logarithmic term: K(x,y) = K(x) + K(y|x) + O(log K(x,y)).
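For reference, the identities the abstract alludes to can be written out explicitly. The LaTeX block below restates the entropy chain rule, the probability identity it follows from, and the Kolmogorov-complexity analogue with its logarithmic error term; this is a standard restatement of well-known formulas, not text from the extracted page.

\begin{align*}
  H(X,Y) &= H(X) + H(Y \mid X) &&\text{(chain rule for entropy)}\\
  P(X,Y) &= P(X)\,P(Y \mid X) &&\text{(joint = marginal $\times$ conditional)}\\
  K(x,y) &= K(x) + K(y \mid x) + O(\log K(x,y)) &&\text{(chain rule for Kolmogorov complexity)}
\end{align*}

A compression-based numerical illustration of the same idea appears after the property listing below.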

Property Value
dbo:abstract
  • The chain rule for Kolmogorov complexity is an analogue of the chain rule for information entropy, which states: H(X,Y) = H(X) + H(Y|X). That is, the combined randomness of two sequences X and Y is the sum of the randomness of X plus whatever randomness is left in Y once we know X. This follows immediately from the definitions of conditional and joint entropy, and the fact from probability theory that the joint probability is the product of the marginal and conditional probability: P(X,Y) = P(X) P(Y|X). The equivalent statement for Kolmogorov complexity does not hold exactly; it is true only up to a logarithmic term: K(x,y) = K(x) + K(y|x) + O(log K(x,y)). (An exact version, KP(x,y) = KP(x) + KP(y|x*) + O(1), holds for the prefix complexity KP, where x* is a shortest program for x.) It states that the shortest program printing X and Y is obtained by concatenating a shortest program printing X with a program printing Y given X, plus at most a logarithmic factor. The result implies that algorithmic mutual information, an analogue of mutual information for Kolmogorov complexity, is symmetric: I(x:y) = I(y:x) + O(log K(x,y)) for all x, y. (en)
dbo:wikiPageID
  • 8566056 (xsd:integer)
dbo:wikiPageLength
  • 4738 (xsd:nonNegativeInteger)
dbo:wikiPageRevisionID
  • 1123649363 (xsd:integer)
dbo:wikiPageWikiLink
dbp:wikiPageUsesTemplate
dcterms:subject
rdfs:comment
  • The chain rule for Kolmogorov complexity is an analogue of the chain rule for information entropy, which states: H(X,Y) = H(X) + H(Y|X). That is, the combined randomness of two sequences X and Y is the sum of the randomness of X plus whatever randomness is left in Y once we know X. This follows immediately from the definitions of conditional and joint entropy, and the fact from probability theory that the joint probability is the product of the marginal and conditional probability: P(X,Y) = P(X) P(Y|X). The equivalent statement for Kolmogorov complexity does not hold exactly; it is true only up to a logarithmic term: K(x,y) = K(x) + K(y|x) + O(log K(x,y)). (en)
rdfs:label
  • Chain rule for Kolmogorov complexity (en)
owl:sameAs
prov:wasDerivedFrom
foaf:isPrimaryTopicOf
is dbo:wikiPageDisambiguates of
is dbo:wikiPageRedirects of
is dbo:wikiPageWikiLink of
is foaf:primaryTopic of
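Kolmogorov complexity itself is uncomputable, so the chain rule cannot be checked directly; the intuition can still be illustrated with a computable stand-in. The Python sketch below is an illustration under that assumption, not material from the extracted page: it uses zlib-compressed length C(.) as a crude upper-bound proxy for K(.), and C(x+y) - C(x) as a proxy for the conditional term K(y|x). Note that with this proxy the chain identity holds exactly by construction; for true Kolmogorov complexity it holds only up to the logarithmic term stated in the abstract.

import os
import zlib

def C(data: bytes) -> int:
    # Compressed length in bytes: a computable upper-bound proxy for K(data).
    return len(zlib.compress(data, 9))

def C_cond(y: bytes, x: bytes) -> int:
    # Proxy for the conditional complexity K(y|x): the extra compressed bytes
    # that y costs once x has already been seen. Only meaningful while x + y
    # fits inside zlib's 32 KB back-reference window.
    return C(x + y) - C(x)

x = os.urandom(10_000)   # incompressible on its own
y = x                    # y carries no information beyond x

print("C(x)   =", C(x))          # ~10000: random bytes barely compress
print("C(y)   =", C(y))          # same: y alone looks just as random
print("C(y|x) =", C_cond(y, x))  # small: given x, y is almost free
print("C(x,y) =", C(x + y))      # close to C(x) + C(y|x), far below C(x) + C(y)

The point of choosing y identical to x is that C(y) alone is large while the conditional proxy C(y|x) is tiny, mirroring the abstract's phrasing that the joint complexity is the complexity of X plus "whatever randomness is left in Y once we know X".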
This content was extracted from Wikipedia and is licensed under the Creative Commons Attribution-ShareAlike 3.0 Unported License