In information theory, the information projection or I-projection of a probability distribution q onto a set of distributions P is $p^* = \arg\min_{p \in P} D_{\mathrm{KL}}(p \| q)$, where $D_{\mathrm{KL}}(p \| q)$ is the Kullback–Leibler divergence from q to p. Viewing the Kullback–Leibler divergence as a measure of distance, the I-projection $p^*$ is the "closest" distribution to q of all the distributions in P. The I-projection is useful in setting up information geometry, notably because of the following inequality, valid when P is convex: $D_{\mathrm{KL}}(p \| q) \geq D_{\mathrm{KL}}(p \| p^*) + D_{\mathrm{KL}}(p^* \| q)$ for all $p \in P$. The reverse I-projection, also known as moment projection or M-projection, is $p^* = \arg\min_{p \in P} D_{\mathrm{KL}}(q \| p)$.
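To make the definition concrete, here is a minimal numerical sketch (not part of the original article; all names and numbers are illustrative). It computes the I-projection of a distribution q on a four-letter alphabet onto the convex set P of distributions with a fixed mean, using scipy's general-purpose constrained optimizer, and then checks the inequality above for another member of P.

```python
import numpy as np
from scipy.optimize import minimize

x = np.arange(4)                      # support points {0, 1, 2, 3}
q = np.array([0.4, 0.3, 0.2, 0.1])    # distribution being projected

def kl(p, q):
    """D_KL(p || q) on a finite alphabet, with 0 log 0 taken as 0."""
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# P = { p : sum(p) = 1, E_p[X] = 2.0 } -- a closed, convex set.
constraints = [
    {"type": "eq", "fun": lambda p: p.sum() - 1.0},   # normalization
    {"type": "eq", "fun": lambda p: p @ x - 2.0},     # mean constraint
]
res = minimize(kl, x0=np.full(4, 0.25), args=(q,),
               bounds=[(1e-12, 1.0)] * 4, constraints=constraints)
p_star = res.x

print("I-projection p* :", p_star.round(4))
print("D_KL(p*||q)     :", round(kl(p_star, q), 4))

# Numerical check of the Pythagorean-style inequality for another
# member of P (p_test also has mean 2.0). For a linear family like
# this one the inequality in fact holds with equality.
p_test = np.array([0.1, 0.2, 0.3, 0.4])
print("LHS  D(p||q)          :", round(kl(p_test, q), 4))
print("RHS  D(p||p*)+D(p*||q):", round(kl(p_test, p_star) + kl(p_star, q), 4))
```

Because P here is convex and closed, the minimizer is unique; for linear constraints such as a fixed mean, the I-projection is known to take the form of an exponential tilting of q, i.e. $p^*(x) \propto q(x) e^{\lambda x}$ for some $\lambda$.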
| Attributes | Values |
| --- | --- |
| rdfs:label | Information projection (en) |
| has abstract | In information theory, the information projection or I-projection of a probability distribution q onto a set of distributions P is $p^* = \arg\min_{p \in P} D_{\mathrm{KL}}(p \| q)$, where $D_{\mathrm{KL}}(p \| q)$ is the Kullback–Leibler divergence from q to p. Viewing the Kullback–Leibler divergence as a measure of distance, the I-projection $p^*$ is the "closest" distribution to q of all the distributions in P. The I-projection is useful in setting up information geometry, notably because of the following inequality, valid when P is convex: $D_{\mathrm{KL}}(p \| q) \geq D_{\mathrm{KL}}(p \| p^*) + D_{\mathrm{KL}}(p^* \| q)$ for all $p \in P$. This inequality can be interpreted as an information-geometric version of Pythagoras' theorem, where KL divergence is viewed as squared distance in a Euclidean space. It is worthwhile to note that since $D_{\mathrm{KL}}(p \| q) \geq 0$ and is continuous in p, if P is closed and non-empty, then there exists at least one minimizer to the optimization problem framed above. Furthermore, if P is convex, then the optimal distribution $p^*$ is unique. The reverse I-projection, also known as moment projection or M-projection, is $p^* = \arg\min_{p \in P} D_{\mathrm{KL}}(q \| p)$. Since the KL divergence is not symmetric in its arguments, the I-projection and the M-projection exhibit different behavior. For the I-projection, $p^*(x)$ will typically under-estimate the support of $q(x)$ and will lock onto one of its modes; this is because $p^*(x) = 0$ whenever $q(x) = 0$ is required to keep the KL divergence finite. For the M-projection, $p^*(x)$ will typically over-estimate the support of $q(x)$; this is because $p^*(x) > 0$ is required whenever $q(x) > 0$ to keep the KL divergence finite (see the numerical sketch after the table). The concept of information projection can be extended to arbitrary statistical f-divergences and other divergences. (en) |
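The mode-seeking versus mass-covering contrast described in the abstract can be observed numerically. The sketch below (illustrative, not from the article) takes a bimodal target q and grid-searches a family of discretized Gaussians, once minimizing $D_{\mathrm{KL}}(p \| q)$ (I-projection) and once $D_{\mathrm{KL}}(q \| p)$ (M-projection).

```python
import numpy as np

grid = np.linspace(-6, 6, 601)

def normalize(w):
    return w / w.sum()

def gaussian(mu, sigma):
    """Discretized Gaussian N(mu, sigma^2) on the grid."""
    return normalize(np.exp(-0.5 * ((grid - mu) / sigma) ** 2))

# Bimodal target: two well-separated bumps at -2 and +2.
q = normalize(gaussian(-2.0, 0.5) + gaussian(2.0, 0.5))

def kl(a, b):
    mask = a > 0
    return np.sum(a[mask] * np.log(a[mask] / b[mask]))

# Brute-force search over the (non-convex) Gaussian family.
best_i = (np.inf, None)   # argmin_p D(p||q): I-projection
best_m = (np.inf, None)   # argmin_p D(q||p): M-projection
for mu in np.linspace(-3, 3, 61):
    for sigma in np.linspace(0.3, 3.0, 28):
        p = gaussian(mu, sigma)
        if kl(p, q) < best_i[0]:
            best_i = (kl(p, q), (mu, sigma))
        if kl(q, p) < best_m[0]:
            best_m = (kl(q, p), (mu, sigma))

print("I-projection (mu, sigma):", best_i[1])   # locks onto one mode
print("M-projection (mu, sigma):", best_m[1])   # spreads over both modes
```

The I-projection settles near one mode with a small sigma, under-estimating the support of q, while the M-projection centers between the modes with a large sigma, covering the full support: exactly the asymmetry described in the abstract.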