In mathematical optimization, the proximal operator is an operator associated with a proper, lower semi-continuous convex function $f$ from a Hilbert space $\mathcal{X}$ to $[-\infty, +\infty]$, and is defined by

$$\operatorname{prox}_f(v) = \arg\min_{x \in \mathcal{X}} \left( f(x) + \tfrac{1}{2}\|x - v\|_{\mathcal{X}}^2 \right).$$

For any function in this class, the minimizer of the right-hand side above is unique, hence making the proximal operator well-defined. The proximal operator of a function $f$ enjoys several useful properties for optimization, enumerated below. Note that all of these items require $f$ to be proper (i.e. not identically $+\infty$, and never taking the value $-\infty$), convex, and lower semi-continuous.
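As a concrete illustration of the definition above (a minimal sketch, not part of the source page; the helper name `prox_l1` is illustrative), the proximal operator of $f(x) = \lambda\|x\|_1$ has the well-known closed form of coordinate-wise soft-thresholding, which the NumPy snippet below evaluates:

```python
import numpy as np

def prox_l1(v, lam):
    """Proximal operator of f(x) = lam * ||x||_1.

    The problem argmin_x lam*||x||_1 + 0.5*||x - v||^2 separates per
    coordinate and yields the soft-thresholding closed form below.
    """
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

v = np.array([3.0, -0.2, 1.5, -4.0])
print(prox_l1(v, lam=1.0))  # each component is shrunk toward 0 by lam: [2., -0., 0.5, -3.]
```

Closed forms like this are a large part of why proximal methods are practical: the operator can often be evaluated exactly even when $f$ is non-differentiable.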
| Attributes | Values |
|---|---|
| rdfs:label | Proximal operator (en) |
| rdfs:comment | - In mathematical optimization, the proximal operator is an operator associated with a proper, lower semi-continuous convex function $f$ from a Hilbert space $\mathcal{X}$ to $[-\infty, +\infty]$, and is defined by $\operatorname{prox}_f(v) = \arg\min_{x \in \mathcal{X}} \bigl( f(x) + \tfrac{1}{2}\|x - v\|_{\mathcal{X}}^2 \bigr)$. For any function in this class, the minimizer of the right-hand side above is unique, hence making the proximal operator well-defined. The proximal operator of a function $f$ enjoys several useful properties for optimization, enumerated below. Note that all of these items require $f$ to be proper (i.e. not identically $+\infty$, and never taking the value $-\infty$), convex, and lower semi-continuous. (en) |
| dcterms:subject | |
| Wikipage page ID | |
| Wikipage revision ID | |
| Link from a Wikipage to another Wikipage | |
| Link from a Wikipage to an external page | |
| sameAs | |
| dbp:wikiPageUsesTemplate | |
| has abstract | - In mathematical optimization, the proximal operator is an operator associated with a proper, lower semi-continuous convex function $f$ from a Hilbert space $\mathcal{X}$ to $[-\infty, +\infty]$, and is defined by $\operatorname{prox}_f(v) = \arg\min_{x \in \mathcal{X}} \bigl( f(x) + \tfrac{1}{2}\|x - v\|_{\mathcal{X}}^2 \bigr)$. For any function in this class, the minimizer of the right-hand side above is unique, hence making the proximal operator well-defined. The proximal operator of a function $f$ enjoys several useful properties for optimization, enumerated below. Note that all of these items require $f$ to be proper (i.e. not identically $+\infty$, and never taking the value $-\infty$), convex, and lower semi-continuous. An operator $T$ is said to be firmly non-expansive if $\|Tx - Ty\|^2 \le \langle x - y,\, Tx - Ty \rangle$ for all $x, y \in \mathcal{X}$; $\operatorname{prox}_f$ is firmly non-expansive. Fixed points of $\operatorname{prox}_f$ are minimizers of $f$: $\{x \in \mathcal{X} : \operatorname{prox}_f x = x\} = \arg\min f$. Global convergence to a minimizer is defined as follows: if $\arg\min f \neq \varnothing$, then for any initial point $x_0 \in \mathcal{X}$, the recursion $x_{n+1} = \operatorname{prox}_f(x_n)$ yields convergence $x_n \to x^{*} \in \arg\min f$ as $n \to \infty$. This convergence may be weak if $\mathcal{X}$ is infinite-dimensional. The proximal operator is frequently used in optimization algorithms associated with non-differentiable optimization problems such as total variation denoising. If $f$ is the $0$-$\infty$ indicator function $\iota_C$ of a nonempty, closed, convex set $C$, then $\iota_C$ is lower semi-continuous, proper, and convex, and $\operatorname{prox}_{\iota_C}$ is the orthogonal projector onto $C$. (en) |
| prov:wasDerivedFrom | |
| page length (characters) of wiki page | |
| foaf:isPrimaryTopicOf | |
| is Link from a Wikipage to another Wikipage of | |
| is Wikipage redirect of | |
| is foaf:primaryTopic of | |
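The fixed-point and projection properties quoted in the abstract above can be checked numerically. The sketch below (illustrative only; `prox_l1` and `prox_ball_indicator` are hypothetical helper names, not from the source) runs the proximal point recursion $x_{n+1} = \operatorname{prox}_f(x_n)$ for $f(x) = \lambda\|x\|_1$, whose unique minimizer is $0$, and evaluates the prox of the $0$-$\infty$ indicator of the Euclidean unit ball, which is simply the orthogonal projection onto that ball:

```python
import numpy as np

def prox_l1(v, lam):
    """Soft-thresholding: prox of f(x) = lam * ||x||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def prox_ball_indicator(v, radius=1.0):
    """Prox of the 0-infinity indicator of the closed Euclidean ball of
    the given radius, i.e. the orthogonal projection onto that ball."""
    norm = np.linalg.norm(v)
    return v if norm <= radius else v * (radius / norm)

# Proximal point recursion x_{n+1} = prox_f(x_n): its fixed points are
# exactly the minimizers of f; here f = lam*||x||_1 is minimized at 0.
x = np.array([5.0, 3.0, 2.0])
for _ in range(20):
    x = prox_l1(x, lam=0.5)
print(x)  # [0. 0. 0.] -- the iterates have reached the minimizer of f

# Prox of an indicator function acts as an orthogonal projector.
print(prox_ball_indicator(np.array([3.0, 4.0])))  # [0.6 0.8]
```

In finite dimensions the recursion converges in the ordinary sense; as noted in the abstract, in an infinite-dimensional Hilbert space the convergence may only be weak.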