This HTML5 document contains 62 embedded RDF statements represented using HTML+Microdata notation.

The embedded RDF content will be recognized by any processor of HTML5 Microdata.

Namespace Prefixes

Prefix        IRI
dcterms       http://purl.org/dc/terms/
dbo           http://dbpedia.org/ontology/
foaf          http://xmlns.com/foaf/0.1/
dbpedia-es    http://es.dbpedia.org/resource/
n15           https://global.dbpedia.org/id/
dbt           http://dbpedia.org/resource/Template:
dbpedia-uk    http://uk.dbpedia.org/resource/
rdfs          http://www.w3.org/2000/01/rdf-schema#
rdf           http://www.w3.org/1999/02/22-rdf-syntax-ns#
owl           http://www.w3.org/2002/07/owl#
wikipedia-en  http://en.wikipedia.org/wiki/
dbp           http://dbpedia.org/property/
dbc           http://dbpedia.org/resource/Category:
prov          http://www.w3.org/ns/prov#
xsd           http://www.w3.org/2001/XMLSchema#
wikidata      http://www.wikidata.org/entity/
dbr           http://dbpedia.org/resource/

Statements

Subject Item
dbr:List_of_mathematical_abbreviations
dbo:wikiPageWikiLink
dbr:Swish_function
Subject Item
dbr:Sigmoid-weighted_Linear_Unit
dbo:wikiPageWikiLink
dbr:Swish_function
dbo:wikiPageRedirects
dbr:Swish_function
Subject Item
dbr:Sigmoid-weighted_linear_unit
dbo:wikiPageWikiLink
dbr:Swish_function
dbo:wikiPageRedirects
dbr:Swish_function
Subject Item
dbr:Swish-beta
dbo:wikiPageWikiLink
dbr:Swish_function
dbo:wikiPageRedirects
dbr:Swish_function
Subject Item
dbr:Swish-beta_function
dbo:wikiPageWikiLink
dbr:Swish_function
dbo:wikiPageRedirects
dbr:Swish_function
Subject Item
dbr:Swish-β
dbo:wikiPageWikiLink
dbr:Swish_function
dbo:wikiPageRedirects
dbr:Swish_function
Subject Item
dbr:Swish-β_function
dbo:wikiPageWikiLink
dbr:Swish_function
dbo:wikiPageRedirects
dbr:Swish_function
Subject Item
dbr:Swish_(function)
dbo:wikiPageWikiLink
dbr:Swish_function
dbo:wikiPageRedirects
dbr:Swish_function
Subject Item
dbr:Swish
dbo:wikiPageWikiLink
dbr:Swish_function
dbo:wikiPageDisambiguates
dbr:Swish_function
Subject Item
dbr:Swish_function
rdfs:label
Función Swish (es), Swish function (en), Swish функція (uk)
rdfs:comment
(en) The swish function is a mathematical function defined as f(x) = x · sigmoid(βx), where β is either constant or a trainable parameter depending on the model. For β = 1, the function becomes equivalent to the Sigmoid Linear Unit or SiLU, first proposed alongside the GELU in 2016. The SiLU was later rediscovered in 2017 as the Sigmoid-weighted Linear Unit (SiL) function used in reinforcement learning. The SiLU/SiL was then rediscovered as the swish over a year after its initial discovery, originally proposed without the learnable parameter β, so that β implicitly equalled 1. The swish paper was then updated to propose the activation with the learnable parameter β, though researchers usually let β = 1 and do not use it. For β = 0, the function turns into the scaled linear function f(x) = x/2.
(uk) The Swish function is a mathematical function described by the expression f(x) = x · sigmoid(βx), where β is a constant or a parameter that depends on the type of model; its derivative is f′(x) = βf(x) + sigmoid(βx)(1 − βf(x)).
(es) The swish function is a mathematical function defined by the formula f(x) = x · sigmoid(βx), where β can be a constant or a trainable parameter depending on the model. For β = 1, the function is equivalent to the sigmoid-weighted linear unit used in reinforcement learning (Sigmoid-weighted Linear Unit, SiL), while for β = 0 swish becomes the linear function f(x) = x/2. As β → ∞, the sigmoid component approaches a unit step function, so swish tends towards the ReLU function. It can therefore be viewed as a nonlinear interpolation between a linear function and ReLU.
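As a worked restatement of the definition and the limiting cases given in the comment above (a sketch: the displayed equation is reconstructed from the special cases stated in the text, not taken verbatim from the extracted data):

  f_\beta(x) = x\,\sigma(\beta x) = \frac{x}{1 + e^{-\beta x}}, \qquad
  f_1(x) = \mathrm{SiLU}(x), \qquad
  f_0(x) = \frac{x}{2}, \qquad
  \lim_{\beta \to \infty} f_\beta(x) = \max(0, x) = \mathrm{ReLU}(x).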
dcterms:subject
dbc:Artificial_neural_networks dbc:Functions_and_mappings
dbo:wikiPageID
63822450
dbo:wikiPageRevisionID
1124205518
dbo:wikiPageWikiLink
dbr:Backpropagation dbr:Trainable_parameter dbr:Activation_function dbr:Artificial_neural_network dbr:Vanishing_gradient_problem dbr:Function_(mathematics) dbr:Google dbr:Interpolate dbc:Artificial_neural_networks dbr:ImageNet dbr:Reinforcement_learning dbr:Rectifier_(neural_networks) dbr:Sigmoid_function dbc:Functions_and_mappings dbr:ReLU dbr:Mish_(function)
owl:sameAs
wikidata:Q97358426 dbpedia-es:Función_Swish n15:CjAXA dbpedia-uk:Swish_функція
dbp:wikiPageUsesTemplate
dbt:Reflist dbt:Use_dmy_dates dbt:Short_description
dbp:cs1Dates
y
dbp:date
June 2020
dbo:abstract
(es) The swish function is a mathematical function defined by the formula f(x) = x · sigmoid(βx), where β can be a constant or a trainable parameter depending on the model. For β = 1, the function is equivalent to the sigmoid-weighted linear unit used in reinforcement learning (Sigmoid-weighted Linear Unit, SiL), while for β = 0 swish becomes the linear function f(x) = x/2. As β → ∞, the sigmoid component approaches a unit step function, so swish tends towards the ReLU function. It can therefore be viewed as a nonlinear interpolation between a linear function and ReLU.
(uk) The Swish function is a mathematical function described by the expression f(x) = x · sigmoid(βx), where β is a constant or a parameter that depends on the type of model; its derivative is f′(x) = βf(x) + sigmoid(βx)(1 − βf(x)).
(en) The swish function is a mathematical function defined as f(x) = x · sigmoid(βx) = x / (1 + e^(−βx)), where β is either constant or a trainable parameter depending on the model. For β = 1, the function becomes equivalent to the Sigmoid Linear Unit or SiLU, first proposed alongside the GELU in 2016. The SiLU was later rediscovered in 2017 as the Sigmoid-weighted Linear Unit (SiL) function used in reinforcement learning. The SiLU/SiL was then rediscovered as the swish over a year after its initial discovery, originally proposed without the learnable parameter β, so that β implicitly equalled 1. The swish paper was then updated to propose the activation with the learnable parameter β, though researchers usually let β = 1 and do not use it. For β = 0, the function turns into the scaled linear function f(x) = x/2. With β → ∞, the sigmoid component approaches a 0–1 step function, so swish approaches the ReLU function. Thus, it can be viewed as a smoothing function which nonlinearly interpolates between a linear function and the ReLU function. The function is non-monotonic, which may have influenced the proposal of other activation functions with this property, such as Mish. For positive values, Swish is a particular case of the sigmoid shrinkage function defined in (see the doubly parameterized sigmoid shrinkage form given by Equation (3) of this reference).
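A minimal Python sketch of the behaviour described in the abstract above (illustrative only; the function names, the chosen β values, and the printed comparison are not part of the DBpedia data):

import math

def swish(x, beta=1.0):
    # Swish / SiLU-style activation: x * sigmoid(beta * x) = x / (1 + exp(-beta * x))
    return x / (1.0 + math.exp(-beta * x))

def relu(x):
    return max(0.0, x)

# beta = 1 reproduces the SiLU, beta = 0 gives the scaled linear function x / 2,
# and a large beta makes swish approach ReLU, matching the special cases in the abstract.
for x in (-2.0, -0.5, 0.5, 2.0):
    print(f"x={x:+.1f}  swish(beta=1)={swish(x):+.4f}  "
          f"swish(beta=0)={swish(x, 0.0):+.4f}  x/2={x / 2:+.4f}  "
          f"swish(beta=50)={swish(x, 50.0):+.4f}  relu(x)={relu(x):+.4f}")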
prov:wasDerivedFrom
wikipedia-en:Swish_function?oldid=1124205518&ns=0
dbo:wikiPageLength
4395
foaf:isPrimaryTopicOf
wikipedia-en:Swish_function
Subject Item
dbr:Backpropagation
dbo:wikiPageWikiLink
dbr:Swish_function
Subject Item
dbr:Rectifier_(neural_networks)
dbo:wikiPageWikiLink
dbr:Swish_function
Subject Item
dbr:Sigmoid_function
dbo:wikiPageWikiLink
dbr:Swish_function
Subject Item
wikipedia-en:Swish_function
foaf:primaryTopic
dbr:Swish_function