An Entity of Type: Thing, from Named Graph: http://dbpedia.org, within Data Space: dbpedia.org

In theoretical computer science, smoothed analysis is a way of measuring the complexity of an algorithm. Since its introduction in 2001, smoothed analysis has been used as a basis for considerable research, on problems ranging from mathematical programming and numerical analysis to machine learning and data mining. It can give a more realistic analysis of the practical performance of an algorithm (e.g., running time, success rate, approximation quality) than worst-case or average-case analysis.

Property Value
dbo:abstract
  • In theoretical computer science, smoothed analysis is a way of measuring the complexity of an algorithm. Since its introduction in 2001, smoothed analysis has been used as a basis for considerable research, on problems ranging from mathematical programming and numerical analysis to machine learning and data mining. It can give a more realistic analysis of the practical performance of an algorithm (e.g., running time, success rate, approximation quality) than worst-case or average-case analysis.

    Smoothed analysis is a hybrid of worst-case and average-case analysis that inherits advantages of both. It measures the expected performance of an algorithm under slight random perturbations of worst-case inputs (see the formal sketch after this list of abstracts). If the smoothed complexity of an algorithm is low, then it is unlikely that the algorithm will take a long time on practical instances whose data are subject to slight noise and imprecision. Smoothed complexity results are strong probabilistic results, roughly stating that, in every large enough neighbourhood of the space of inputs, most inputs are easily solvable. A low smoothed complexity thus means that the hardness of inputs is a "brittle" property.

    Although worst-case analysis has been widely successful in explaining the practical performance of many algorithms, this style of analysis gives misleading results for a number of problems. Worst-case complexity bounds the time needed to solve every input, even though hard-to-solve inputs might never come up in practice, so the worst-case running time can be much worse than the running time observed in practice. For example, the worst-case complexity of solving a linear program with the simplex algorithm is exponential, although the observed number of steps in practice is roughly linear. The simplex algorithm is in fact much faster than the ellipsoid method in practice, although the latter has polynomial worst-case complexity.

    Average-case analysis was introduced to overcome these limitations of worst-case analysis. However, the resulting average-case complexity depends heavily on the probability distribution chosen over the inputs, and the actual inputs and their distribution may differ in practice from the assumptions made during the analysis: a random input may be very unlike a typical input. Because of this choice of data model, a theoretical average-case result may say little about the practical performance of the algorithm.

    Smoothed analysis generalizes both worst-case and average-case analysis and inherits the strengths of both. It is intended to be much more general than average-case complexity, while still allowing low complexity bounds to be proven. (en)
  • In theoretical computer science, smoothed analysis (analyse lisse d'algorithme) is a way of measuring the complexity of an algorithm, that is, its performance. It complements and refines the classical complexity measures: worst-case complexity, average-case complexity, and amortized complexity. It was introduced in the early 2000s by Daniel Spielman and Shang-Hua Teng. (fr)
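
  The abstract above describes smoothed complexity informally as the expected performance under slight random perturbations of worst-case inputs. A minimal formal sketch of that definition, in the common Gaussian-perturbation model, follows; the symbols T (running time), n (input size), and σ (perturbation magnitude) are the standard notation, not fixed by the abstract itself.

    % Smoothed complexity under Gaussian perturbations (a common formulation,
    % following the informal description in the abstract above).
    \[
      C_{\mathrm{worst}}(n) = \max_{x \in \mathbb{R}^n,\ \lVert x \rVert \le 1} T(x),
      \qquad
      C_{\mathrm{smooth}}(n,\sigma) = \max_{x \in \mathbb{R}^n,\ \lVert x \rVert \le 1}
        \mathbb{E}_{g \sim \mathcal{N}(0,\sigma^2 I_n)}\bigl[T(x+g)\bigr].
    \]
    % As \sigma -> 0 the perturbation vanishes and the smoothed complexity
    % approaches the worst case; as \sigma grows the noise dominates the input
    % and it approaches an average-case complexity over Gaussian inputs, which
    % is the sense in which smoothed analysis interpolates between the two.

  For the simplex example mentioned in the abstract, the original Spielman-Teng result bounds the smoothed number of pivot steps by a polynomial in the problem dimensions and 1/σ, consistent with the roughly linear step counts observed in practice despite the exponential worst case.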
dbo:wikiPageID
  • 18208194 (xsd:integer)
dbo:wikiPageLength
  • 13460 (xsd:nonNegativeInteger)
dbo:wikiPageRevisionID
  • 1106909130 (xsd:integer)
rdfs:comment
  • In theoretical computer science, smoothed analysis (analyse lisse d'algorithme) is a way of measuring the complexity of an algorithm, that is, its performance. It complements and refines the classical complexity measures: worst-case complexity, average-case complexity, and amortized complexity. It was introduced in the early 2000s by Daniel Spielman and Shang-Hua Teng. (fr)
  • In theoretical computer science, smoothed analysis is a way of measuring the complexity of an algorithm. Since its introduction in 2001, smoothed analysis has been used as a basis for considerable research, on problems ranging from mathematical programming and numerical analysis to machine learning and data mining. It can give a more realistic analysis of the practical performance of an algorithm (e.g., running time, success rate, approximation quality) than worst-case or average-case analysis. (en)
rdfs:label
  • Analyse lisse d'algorithme (fr)
  • Smoothed analysis (en)
This content was extracted from Wikipedia and is licensed under the Creative Commons Attribution-ShareAlike 3.0 Unported License