This HTML5 document contains 45 embedded RDF statements represented using HTML+Microdata notation.

The embedded RDF content will be recognized by any processor of HTML5 Microdata.
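
As an aside (not part of the page data), a minimal sketch of consuming such a page from Python, assuming the third-party extruct library is installed and with a placeholder standing in for this document's own URL:

    import urllib.request

    import extruct  # assumption: third-party Microdata extractor (pip install extruct)

    url = "https://example.org/this-page"  # placeholder: this document's own URL
    html = urllib.request.urlopen(url).read().decode("utf-8")

    # Keep only the HTML5 Microdata items; extruct also handles other syntaxes.
    data = extruct.extract(html, base_url=url, syntaxes=["microdata"])
    for item in data["microdata"]:
        print(item)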

Namespace Prefixes

Prefix        IRI
dcterms       http://purl.org/dc/terms/
dbo           http://dbpedia.org/ontology/
foaf          http://xmlns.com/foaf/0.1/
n6            https://global.dbpedia.org/id/
dbt           http://dbpedia.org/resource/Template:
rdfs          http://www.w3.org/2000/01/rdf-schema#
rdf           http://www.w3.org/1999/02/22-rdf-syntax-ns#
owl           http://www.w3.org/2002/07/owl#
wikipedia-en  http://en.wikipedia.org/wiki/
dbp           http://dbpedia.org/property/
prov          http://www.w3.org/ns/prov#
dbc           http://dbpedia.org/resource/Category:
xsdh          http://www.w3.org/2001/XMLSchema#
wikidata      http://www.wikidata.org/entity/
dbr           http://dbpedia.org/resource/
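
For illustration (not part of the page itself), a prefixed name in the Statements section below expands to a full IRI by plain concatenation against this table; a minimal Python sketch with a subset of the prefixes:

    # Sketch: prefixed names expand to full IRIs by concatenation.
    PREFIXES = {
        "dbo": "http://dbpedia.org/ontology/",
        "dbr": "http://dbpedia.org/resource/",
        "dbc": "http://dbpedia.org/resource/Category:",
        "foaf": "http://xmlns.com/foaf/0.1/",
        # ...remaining prefixes as listed in the table above
    }

    def expand(qname: str) -> str:
        """Expand e.g. 'dbr:Oja's_rule' to its full IRI."""
        prefix, _, local = qname.partition(":")
        return PREFIXES[prefix] + local

    assert expand("dbr:Oja's_rule") == "http://dbpedia.org/resource/Oja's_rule"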

Statements

Subject Item
dbr:Sanger's_rule
    dbo:wikiPageWikiLink
        dbr:Generalized_Hebbian_algorithm
    dbo:wikiPageRedirects
        dbr:Generalized_Hebbian_algorithm

Subject Item
dbr:Generalized_Hebbian_Algorithm
    dbo:wikiPageWikiLink
        dbr:Generalized_Hebbian_algorithm
    dbo:wikiPageRedirects
        dbr:Generalized_Hebbian_algorithm

Subject Item
dbr:Generalized_Hebbian_algorithm
    rdfs:label
        Generalized Hebbian algorithm
    rdfs:comment
        The generalized Hebbian algorithm (GHA), also known in the literature as Sanger's rule, is a linear feedforward neural network model for unsupervised learning with applications primarily in principal components analysis. First defined in 1989, it is similar to Oja's rule in its formulation and stability, except it can be applied to networks with multiple outputs. The name originates because of the similarity between the algorithm and a hypothesis made by Donald Hebb about the way in which synaptic strengths in the brain are modified in response to experience, i.e., that changes are proportional to the correlation between the firing of pre- and post-synaptic neurons.
    dcterms:subject
        dbc:Hebbian_theory dbc:Artificial_neural_networks
    dbo:wikiPageID
        14402929
    dbo:wikiPageRevisionID
        1016476670
    dbo:wikiPageWikiLink
        dbr:Synaptic_weight
        dbr:Hebbian_learning
        dbr:Neural_network_model
        dbr:Factor_analysis
        dbr:Matrix_diagonalization
        dbr:Feedforward_neural_network
        dbr:Contrastive_Hebbian_learning
        dbr:Learning
        dbr:Donald_Hebb
        dbr:Learning_rate
        dbr:Artificial_intelligence
        dbc:Hebbian_theory
        dbr:Self-organizing_map
        dbr:Principal_components_analysis
        dbr:Backpropagation
        dbr:Unsupervised_learning
        dbc:Artificial_neural_networks
        dbr:Neurons
        dbr:Oja's_rule
        dbr:Gram-Schmidt_process
    owl:sameAs
        n6:4kU1k wikidata:Q5532437
    dbp:wikiPageUsesTemplate
        dbt:Short_description dbt:Reflist dbt:Math dbt:Hebbian_learning dbt:=
    dbo:abstract
        The generalized Hebbian algorithm (GHA), also known in the literature as Sanger's rule, is a linear feedforward neural network model for unsupervised learning with applications primarily in principal components analysis. First defined in 1989, it is similar to Oja's rule in its formulation and stability, except it can be applied to networks with multiple outputs. The name originates because of the similarity between the algorithm and a hypothesis made by Donald Hebb about the way in which synaptic strengths in the brain are modified in response to experience, i.e., that changes are proportional to the correlation between the firing of pre- and post-synaptic neurons.
    prov:wasDerivedFrom
        wikipedia-en:Generalized_Hebbian_algorithm?oldid=1016476670&ns=0
    dbo:wikiPageLength
        5155
    foaf:isPrimaryTopicOf
        wikipedia-en:Generalized_Hebbian_algorithm
Subject Item
dbr:Contrastive_Hebbian_learning
    dbo:wikiPageWikiLink
        dbr:Generalized_Hebbian_algorithm

Subject Item
dbr:Hebbian_theory
    dbo:wikiPageWikiLink
        dbr:Generalized_Hebbian_algorithm

Subject Item
dbr:Oja's_rule
    dbo:wikiPageWikiLink
        dbr:Generalized_Hebbian_algorithm

Subject Item
wikipedia-en:Generalized_Hebbian_algorithm
    foaf:primaryTopic
        dbr:Generalized_Hebbian_algorithm
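
The abstract above describes the algorithm only in prose. As a hedged illustration (not part of the page data), Sanger's rule updates an m-by-n weight matrix W on each zero-mean input x via dw_ij = eta * y_i * (x_j - sum_{k<=i} y_k * w_kj), with y = W x; a minimal NumPy sketch:

    import numpy as np

    def gha_step(W, x, eta=0.01):
        """One generalized Hebbian (Sanger's rule) update:
        dW = eta * (y x^T - tril(y y^T) W), where y = W x and tril
        keeps the lower triangle, diagonal included."""
        y = W @ x
        return W + eta * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)

    # Toy run: recover the top 2 principal directions of correlated 5-D data.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(2000, 5)) @ rng.normal(size=(5, 5))  # correlated samples
    X -= X.mean(axis=0)                 # GHA assumes zero-mean input
    W = 0.1 * rng.normal(size=(2, 5))   # 2 outputs -> first 2 principal components
    for x in X:
        W = gha_step(W, x)
    # Rows of W should now approximate, up to sign, the leading eigenvectors
    # of X's covariance matrix, each tending toward unit norm.

The lower-triangular term is what allows multiple outputs: output i learns from the input with the contributions of outputs 1..i already subtracted, a Gram-Schmidt-like deflation, which matches the abstract's comparison with Oja's rule (the single-output special case).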