2008
DOI: 10.2528/pier08051403

Weights Optimization of Neural Network via Improved BCO Approach

Abstract: Feed-forward neural networks (FNNs) have been widely applied in many fields because of their ability to approximate an unknown function to any desired degree of accuracy. Back-propagation (BP) is the most common learning algorithm, but it is subject to convergence to local optima and poor out-of-sample forecasting performance even on simple problems. Thus, we propose an improved Bacterial Chemotaxis Optimization (BCO) approach as a possible alternative to the problematic BP algorithm, along with a…
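The abstract contrasts gradient-based BP with a chemotaxis-style stochastic search over the network weights. As a rough illustration of the idea (not the authors' exact algorithm — the step rule, network size, and function names below are all hypothetical), a bacterium-like search keeps moving in a direction while the error improves ("run") and picks a fresh random direction when it does not ("tumble"):

```python
import numpy as np

rng = np.random.default_rng(0)

def fnn_forward(w, x):
    """Tiny 1-hidden-layer FNN: w packs a 1x4 hidden layer and a 4x1 output layer."""
    W1, b1 = w[:4].reshape(1, 4), w[4:8]
    W2, b2 = w[8:12].reshape(4, 1), w[12]
    h = np.tanh(x @ W1 + b1)          # hidden activations
    return h @ W2 + b2                # linear output

def mse(w, X, y):
    return float(np.mean((fnn_forward(w, X).ravel() - y) ** 2))

def chemotaxis_optimize(X, y, dim=13, steps=2000, step_size=0.1):
    """Chemotaxis-style random search over the weight vector:
    'run' while the error keeps dropping, otherwise 'tumble' to a new direction."""
    w = rng.normal(size=dim)
    best = mse(w, X, y)
    direction = rng.normal(size=dim)
    for _ in range(steps):
        cand = w + step_size * direction / np.linalg.norm(direction)
        err = mse(cand, X, y)
        if err < best:                # run: keep the same direction
            w, best = cand, err
        else:                         # tumble: draw a new random direction
            direction = rng.normal(size=dim)
    return w, best

# Fit a 1-D toy target without any gradient computation
X = np.linspace(-1, 1, 20).reshape(-1, 1)
y = np.sin(np.pi * X).ravel()
w, err = chemotaxis_optimize(X, y)
```

Because the search never computes a gradient, it can escape shallow local minima that trap BP, at the cost of many more error evaluations.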

Cited by 71 publications (20 citation statements)
References 23 publications
“…One category is supervised classification, including support vector machines (SVM) [12] and k-nearest neighbors (k-NN) [13]. The other category is unsupervised classification [14], including self-organizing feature maps (SOFM) [12] and fuzzy c-means [15]. While all these methods achieved good results, supervised classifiers perform better than unsupervised classifiers in terms of classification accuracy (successful classification rate).…”
Section: Introduction
confidence: 99%
“…The feed-forward neural network (FNN) [13] was chosen as the classifier because it is a powerful tool among supervised classifiers: it can classify nonlinearly separable patterns and approximate an arbitrary continuous function [14]. However, finding the optimal parameters of an FNN is difficult because search algorithms are easily trapped in local extrema.…”
Section: Introduction
confidence: 99%
“…Neural networks are widely used in pattern classification since they do not need any information about the probability distributions or the a priori probabilities of the different classes [26]. A single-hidden-layer back-propagation neural network is adopted, with sigmoid neurons in the hidden layer and a linear neuron in the output layer.…”
Section: Our Proposed Algorithm (ACPSO)
confidence: 99%
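The last excerpt names a concrete architecture: one hidden layer of sigmoid neurons feeding a single linear output neuron. A minimal forward-pass sketch of that shape (sizes and names are illustrative, not taken from the citing paper):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W1, b1, w2, b2):
    """Single-hidden-layer net: sigmoid hidden layer, one linear output neuron."""
    h = sigmoid(x @ W1 + b1)   # hidden activations, squashed to (0, 1)
    return h @ w2 + b2         # linear output: no squashing at the output

rng = np.random.default_rng(1)
W1 = rng.normal(size=(2, 5))   # 2 inputs -> 5 sigmoid hidden neurons
b1 = rng.normal(size=5)
w2 = rng.normal(size=5)        # 5 hidden neurons -> 1 linear output
b2 = 0.0
score = forward(np.array([0.5, -0.2]), W1, b1, w2, b2)
```

The linear output neuron leaves the score unbounded, which suits regression-style training targets; a threshold or argmax is applied afterwards for classification.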