MICAI 2007: Advances in Artificial Intelligence
DOI: 10.1007/978-3-540-76631-5_16
Temperature Cycling on Simulated Annealing for Neural Network Learning

Cited by 23 publications (25 citation statements)
References 5 publications
“…Therefore, to be successful, the optimization process requires a starting point obtained from a global search. A robust training process needs both the initialization and optimization processes [38]. A schematic of the hybrid ANN algorithm is presented in Fig.…”
Section: Hybrid Simulated Annealing–Artificial Neural Network
confidence: 99%
See 1 more Smart Citation
“…Therefore, to be successful, the optimization process requires a starting point obtained from a global search. A robust training process needs both the initialization and optimization processes [38]. A schematic of the hybrid ANN algorithm is presented in Fig.…”
Section: Hybrid Simulated Annealing-artificial Neural Networkmentioning
confidence: 99%
“…Although ANNs typically build “black box” models, explicit formulas can be derived for a trained ANN model. A derivative-free optimization algorithm should be added to the training process of the ANN algorithm to avoid local minima, which lead to false convergence of the ANN model [38]. Some researchers have already combined ANN and global optimization algorithms to improve ANN efficiency (e.g., [41,46]).…”
Section: Introduction
confidence: 99%
“…With temperature cycling, the temperature goes down and up cyclically, refining the quality of the solution at each cycle. As indicated in (Ledesma et al., 2007), temperature cycling is beneficial for the training of auto-associative neural networks. Additionally, it has been pointed out (Reed & Marks, 1999) that a temperature reduction schedule inversely proportional to the logarithm of time will guarantee convergence (in probability) to a global minimum; in practice, however, this schedule takes too long, and it is often more efficient to repeat the algorithm a number of times using a faster schedule.…”
Section: Temperature Goes Down Too Quickly
confidence: 99%
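The excerpt above can be illustrated with a short sketch of simulated annealing with temperature cycling: the temperature cools geometrically within each cycle, then is re-heated (to a lower peak) at the start of the next cycle. This is an illustrative sketch, not the paper's exact algorithm; the function names (`anneal_cycles`) and the toy objective standing in for a network's training error are assumptions.

```python
import math
import random

def anneal_cycles(loss, x0, t_hi=1.0, t_lo=1e-3, cycles=3, steps=200, seed=0):
    """Minimize `loss` over a vector x by simulated annealing with
    temperature cycling: cool from t_hi to t_lo within each cycle,
    then re-heat to a lower peak for the next cycle.
    (Illustrative sketch; not the published algorithm.)"""
    rng = random.Random(seed)
    x = list(x0)
    best, best_loss = list(x), loss(x)
    for _ in range(cycles):
        t = t_hi
        alpha = (t_lo / t_hi) ** (1.0 / steps)  # geometric cooling factor
        for _ in range(steps):
            # Propose a random perturbation whose size shrinks with temperature.
            cand = [xi + rng.gauss(0.0, math.sqrt(t)) for xi in x]
            d = loss(cand) - loss(x)
            # Metropolis acceptance: always take improvements; accept
            # worse moves with probability exp(-d/t) to escape local minima.
            if d < 0 or rng.random() < math.exp(-d / t):
                x = cand
                if loss(x) < best_loss:
                    best, best_loss = list(x), loss(x)
            t *= alpha
        t_hi *= 0.5  # re-heat, but to a lower peak each cycle
    return best, best_loss

# A simple multimodal objective standing in for a network's error surface.
quad = lambda v: sum((vi - 1.0) ** 2 for vi in v) + 0.3 * math.sin(10 * v[0])
sol, val = anneal_cycles(quad, [4.0, -4.0])
```

Lowering the re-heat peak each cycle is what lets later cycles refine the solution rather than scatter it; the logarithmic schedule mentioned in the excerpt would guarantee convergence in probability but is far slower than repeating a faster geometric schedule like this one.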
“…For artificial neural network training, the implementation of simulated annealing requires a good random number generator; see (Press et al., 2002) for code implementing such generators. The authors have suggested simulated annealing using temperature cycling for neural network training (Ledesma et al., 2007). The free software Neural Lab is a powerful tool to simulate artificial neural networks, and it can be downloaded from http://www.fimee.ugto.mx/profesores/sledesma/.…”
Section: Artificial Neural Network Training
confidence: 99%
“…As the temperature decreases, each ANN has the chance to improve its skills. If the ANNs are required to incorporate new skills, temperature cycling can be used, see [6]. Specifically, an ANN may learn by a combination of SA and some sort of hands-on experience.…”
Section: Simulated Annealing Evolution
confidence: 99%