2014
DOI: 10.1016/j.aej.2014.09.007

A novel Self-Organizing Map (SOM) learning algorithm with nearest and farthest neurons

Abstract: The Self-Organizing Map (SOM) has applications like dimension reduction, data clustering, image analysis, and many others. In conventional SOM, the weights of the winner and its neighboring neurons are updated regardless of their distance from the input vector. In the proposed SOM, the farthest and nearest neurons from among the 1-neighborhood of the winner neuron, and also the winning frequency of each neuron, are found and taken into account while updating the weights. This new SOM is applied to various in…
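The modification described in the abstract can be sketched in code. The following is a minimal illustration, not the paper's exact method: the frequency damping 1/(1 + freq) and the scaling factors applied to the nearest and farthest neurons of the winner's 1-neighborhood are assumptions of mine, chosen only to show the shape of such an update.

```python
import numpy as np

def update_with_neighbors(weights, freq, x, lr=0.5):
    """One step of a modified SOM update in the spirit of the abstract.

    weights: (rows, cols, dim) neuron weight grid
    freq:    (rows, cols) winning-frequency counters
    x:       input vector, shape (dim,)
    The 1/(1 + freq) damping and the 0.5/0.1 neighbor scalings are
    illustrative assumptions, not the paper's exact formulas.
    """
    rows, cols, _ = weights.shape
    # Winner: neuron whose weight vector is closest to x (Euclidean distance).
    d = np.linalg.norm(weights - x, axis=2)
    wi, wj = np.unravel_index(np.argmin(d), d.shape)
    freq[wi, wj] += 1
    # 1-neighborhood of the winner (4-connected, clipped at the grid border).
    neigh = [(wi + di, wj + dj)
             for di, dj in [(-1, 0), (1, 0), (0, -1), (0, 1)]
             if 0 <= wi + di < rows and 0 <= wj + dj < cols]
    # Nearest and farthest neighbors, measured from the input vector.
    nd = {ij: np.linalg.norm(weights[ij] - x) for ij in neigh}
    nearest = min(nd, key=nd.get)
    farthest = max(nd, key=nd.get)
    # Winner moves toward x, damped by how often it has already won.
    weights[wi, wj] += lr / (1.0 + freq[wi, wj]) * (x - weights[wi, wj])
    # Nearest neighbor gets a larger pull than the farthest (assumed ratio).
    weights[nearest] += 0.5 * lr * (x - weights[nearest])
    weights[farthest] += 0.1 * lr * (x - weights[farthest])
    return (wi, wj)
```

The point of the sketch is the asymmetry: unlike conventional SOM, the update distinguishes neighbors by their distance from the input and penalizes frequently winning neurons.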

Cited by 46 publications (19 citation statements)
References 20 publications
“…The SOM algorithm classifies data into the nearest neurons by calculating the distance between the input data of the input layer and the neurons of the output layer [28]. First, the Euclidean distance between the input data and the weight of each output neuron is computed to determine the nearest neuron as the winner neuron W, as follows [29]:…”
Section: Clustering Methodology
confidence: 99%
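The winner-selection step this quote describes is the standard competitive step of SOM. A minimal sketch (the function and variable names are mine, not from [29]):

```python
import numpy as np

def find_winner(weights, x):
    """Return the grid index of the winner neuron W: the output neuron
    whose weight vector has the smallest Euclidean distance to input x."""
    dists = np.linalg.norm(weights - x, axis=-1)  # ||omega_ij - x|| per neuron
    return np.unravel_index(np.argmin(dists), dists.shape)
```

With a (rows, cols, dim) weight grid, `find_winner(weights, x)` returns the (row, col) index of W for input vector `x`.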
“…where ω_ij(t) is the output neuron weight vector at iteration t, and x(t) is the input data. The determined winner neuron W is updated by the following steps [29]:…”
Section: Clustering Methodology
confidence: 99%
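For context, in the classical SOM the winner W and its neighbours are updated with the Kohonen rule ω_ij(t+1) = ω_ij(t) + η(t)·h_ij(t)·(x(t) − ω_ij(t)). A sketch using a Gaussian neighbourhood function; the learning rate and σ here are illustrative assumptions, and the paper's modified update differs by also distinguishing the nearest/farthest neighbours and the winning frequency:

```python
import numpy as np

def kohonen_update(weights, winner, x, lr=0.1, sigma=1.0):
    """Classical SOM update: every neuron moves toward x, scaled by a
    Gaussian neighbourhood h of its grid distance to the winner."""
    rows, cols, _ = weights.shape
    ii, jj = np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij")
    grid_d2 = (ii - winner[0]) ** 2 + (jj - winner[1]) ** 2
    h = np.exp(-grid_d2 / (2.0 * sigma ** 2))      # neighbourhood function
    weights += lr * h[..., None] * (x - weights)   # omega(t+1) in place
    return weights
```

In practice both `lr` and `sigma` are decayed over iterations so the map first orders globally and then fine-tunes locally.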
“…For example, only the winner neuron's weights are updated in competitive neural processing (Abdipoor et al 2013). Likewise, the weights of the winner neuron and of specified topological neighbour neurons are updated in Kohonen learning or maps (Chaudhary et al 2014; Hasan and Shamsuddin 2011). Hence, it is possible to deduce which neurons respond optimally to which patterns or stimuli.…”
Section: Investigated Hypotheses
confidence: 99%
“…In one of these studies, Chaudhary et al (2014) modified the classical SOM so that, in addition to the farthest and nearest neurons from the winner neuron, the winning frequency of each neuron was taken into account when updating the weights [3]. In another study, Ghaseminezhad and Karami (2011) presented a novel SOM-based algorithm for clustering discrete groups of data and showed that the classic SOM algorithm could not cluster discrete data correctly [4].…”
Section: Introduction
confidence: 99%