2008
DOI: 10.1007/978-3-540-70807-0_8

A Memetic-Neural Approach to Discover Resources in P2P Networks

Cited by 12 publications (4 citation statements)
References 0 publications
“…With respect to employing multiple optimizers, Neri et al. [28] studied the problem of optimizing the topology of an MLP neural network, and optimized its weights using an EA coupled with simulated annealing (SA) and Hooke-Jeeves optimizers. Caponio et al. [29] coupled an EA with Hooke-Jeeves and simplex optimizers for the problem of optimizing the performance of a motor.…”
Section: Related Studies: Algorithms Employing Multiple Metamodels Or…
mentioning
confidence: 99%
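As a rough illustration of the memetic pattern described in the statement above (an EA whose elite is refined by a Hooke-Jeeves-style local search), the following Python sketch couples a simple mutation-based EA with an exploratory pattern-search step on a toy objective. The objective, population sizes, and schedules are assumptions made for illustration; none of this is taken from [28] or [29].

```python
# Minimal memetic sketch: mutation-based EA + Hooke-Jeeves-style refinement.
# Toy objective stands in for the MLP training error used in the cited works.
import numpy as np

rng = np.random.default_rng(0)

def objective(w):
    # Hypothetical smooth-but-multimodal surrogate for a network error surface.
    return np.sum((w - 1.5) ** 2) + 0.1 * np.sum(np.sin(5 * w) ** 2)

def hooke_jeeves(w, step=0.5, shrink=0.5, tol=1e-3, max_iter=50):
    """Exploratory pattern search around w (local refinement operator)."""
    best, f_best = w.copy(), objective(w)
    for _ in range(max_iter):
        improved = False
        for i in range(len(best)):
            for delta in (step, -step):
                trial = best.copy()
                trial[i] += delta
                f_trial = objective(trial)
                if f_trial < f_best:
                    best, f_best, improved = trial, f_trial, True
        if not improved:
            step *= shrink          # shrink the pattern when no move helps
            if step < tol:
                break
    return best, f_best

# Evolutionary outer loop with periodic local refinement of the elite.
pop = rng.normal(size=(20, 10))
for gen in range(100):
    fitness = np.array([objective(ind) for ind in pop])
    elite = pop[np.argsort(fitness)[:5]]
    # Gaussian-mutation offspring from the elite (plain (mu, lambda)-style EA).
    pop = np.repeat(elite, 4, axis=0) + rng.normal(scale=0.3, size=(20, 10))
    if gen % 10 == 0:
        # Memetic step: refine the current best individual with pattern search.
        pop[0], _ = hooke_jeeves(elite[0])

print("best objective:", min(objective(ind) for ind in pop))
```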
“…The choice of these parameters is an optimization problem, see [23], [24], [25], [26], and [27]. Since this optimization problem is continuous, usually multivariate, and implicitly noisy, see [28], algorithms based on Particle Swarm Optimization (PSO) and Differential Evolution (DE) frameworks have been proposed in the literature on several occasions. Some examples of PSO application to neural network training are given in [29] and [30].…”
Section: Introduction
mentioning
confidence: 99%
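The statement above points to PSO and DE as frameworks for continuous, noisy parameter-tuning problems such as neural-network training. Below is a minimal, self-contained PSO sketch on a toy noisy objective; the coefficients and the objective are assumptions for illustration, not the setups used in the cited works.

```python
# Minimal PSO sketch for a continuous, noisy objective (illustrative only).
import numpy as np

rng = np.random.default_rng(1)
dim, n_particles = 10, 30

def noisy_objective(x):
    # Toy noisy objective; the cited studies optimize real training error.
    return np.sum(x ** 2) + rng.normal(scale=0.01)

pos = rng.uniform(-5, 5, size=(n_particles, dim))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([noisy_objective(p) for p in pos])
gbest = pbest[np.argmin(pbest_val)].copy()
gbest_val = pbest_val.min()

w, c1, c2 = 0.72, 1.49, 1.49   # common constriction-style PSO coefficients
for _ in range(200):
    r1, r2 = rng.random((n_particles, dim)), rng.random((n_particles, dim))
    # Velocity update: inertia + cognitive pull + social pull.
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([noisy_objective(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    if pbest_val.min() < gbest_val:
        gbest, gbest_val = pbest[np.argmin(pbest_val)].copy(), pbest_val.min()

print("best value found:", gbest_val)
```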
“…ventured into stochastic refinement to enhance search diversity in the neighborhood. In the combinatorial optimization domain, some of these include tabu search for finding low-autocorrelation binary sequences [53], simulated annealing to discover the optimal resources in peer-to-peer networks [134], etc. While many of these refinement methods have been designed for single-objective optimization, others specifically for multi-objective optimization have also emerged [75,204,92,83].…”
Section: Types Of Search Methods
mentioning
confidence: 99%
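Simulated annealing, cited above as a stochastic refinement operator, reduces to accepting worsening moves with a temperature-dependent probability. The sketch below shows that acceptance rule on a toy continuous problem; the cooling schedule, neighbourhood, and names are assumptions, not the scheme used in the cited P2P resource-discovery paper.

```python
# Illustrative simulated-annealing refinement loop (sketch, assumed parameters).
import math
import random

random.seed(0)

def simulated_annealing(start, neighbour, cost, t0=1.0, cooling=0.95, iters=500):
    current, best = start, start
    t = t0
    for _ in range(iters):
        candidate = neighbour(current)
        delta = cost(candidate) - cost(current)
        # Always accept improvements; accept worse moves with prob. exp(-delta/t).
        if delta <= 0 or random.random() < math.exp(-delta / t):
            current = candidate
        if cost(current) < cost(best):
            best = current
        t *= cooling  # geometric cooling schedule (an assumption)
    return best

# Toy usage: refine a scalar solution toward the minimum of a quadratic cost.
result = simulated_annealing(
    start=10.0,
    neighbour=lambda x: x + random.uniform(-1, 1),
    cost=lambda x: (x - 3.0) ** 2,
)
print("refined solution:", round(result, 3))
```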
“…nation of multiple refinement procedures, [23] monitored how close the average fitness is to that of the population elite. Others, on the other hand, chose to base the search adaptation on the sparseness of individuals [214,134] or on the super-fit individual [25] in the population. To prevent loss of diversity, additional populations using completely different fitness criteria have also been considered [161].…”
Section: Diversity Management
mentioning
confidence: 99%
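The first adaptation criterion mentioned in the statement above, monitoring how close the average fitness is to the elite, can be expressed as a simple normalized gap. The snippet below is a hypothetical illustration of such a trigger; the function name, threshold, and sample values are assumptions, not the rule from [23].

```python
# Hypothetical elite-proximity trigger for switching refinement behaviour.
import numpy as np

def refinement_pressure(fitness, eps=1e-12):
    """Normalized gap between average and best fitness (minimisation assumed).
    Values near 0 mean the population has converged toward the elite;
    larger values mean it is still spread out."""
    f_avg, f_best = np.mean(fitness), np.min(fitness)
    return abs(f_avg - f_best) / (abs(f_avg) + eps)

fitness = np.array([0.42, 0.40, 0.39, 0.38, 0.38])  # made-up sample values
if refinement_pressure(fitness) < 0.05:
    print("population close to elite: trigger diversity-preserving refinement")
else:
    print("population still diverse: keep exploitative refinement")
```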