1988
DOI: 10.1007/bf00363956
On the stability of the Travelling Salesman Problem algorithm of Hopfield and Tank

Abstract: The application of the method of Hopfield and Tank (1985) to the Travelling Salesman Problem has been re-examined in an effort to find a procedure for scaling to system sizes of real interest. As this failed, attempts were made to improve the algorithm, recognizing the importance of a "silicon implementation". No satisfactory refinement was found, and the reasons for the algorithm's failure have been identified.

Cited by 511 publications (110 citation statements)
References 3 publications
“…Mostly, two types of neural networks are applied for solving TSP. Hopfield neural network [21,22] performs weakly in solving big problems and the self-organizing map (SOM) [23] which exhibits better performance in the large-scale problems.…”
Section: Related Work
confidence: 99%
“…Bearing this in mind for mapping the TSP onto the Hopfield network, the Lyapunov function is set such that (10) The modified penalty function is given as (11) where the first term represents the constraint that each row has exactly one neuron on fire, and the second term represents the constraint that each column has exactly one neuron on fire. The scaling parameters and play the role of balancing the constraints.…”
Section: Enhanced Lyapunov Function for Mapping TSP
confidence: 99%
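The row/column constraints quoted above can be illustrated with a short sketch (my own illustrative code, not from the cited paper, whose exact symbols were lost in extraction): for an n-city TSP the Hopfield network state is an n×n matrix v, where v[x][i] = 1 means city x is visited at tour position i, and the penalty is zero exactly when every row and every column contains exactly one active neuron.

```python
import numpy as np

def constraint_penalty(v, a=1.0, b=1.0):
    """Row/column penalty for an n x n neuron-state matrix v.

    a and b are illustrative scaling parameters balancing the two
    constraints (the quoted text's parameter symbols did not survive
    extraction). The penalty is zero iff every row and every column
    of v sums to exactly 1, i.e. v is a permutation matrix encoding
    a valid tour.
    """
    v = np.asarray(v, dtype=float)
    row_violation = np.sum((v.sum(axis=1) - 1.0) ** 2)  # one neuron per row
    col_violation = np.sum((v.sum(axis=0) - 1.0) ** 2)  # one neuron per column
    return a * row_violation + b * col_violation

# A permutation matrix (valid tour) incurs zero penalty,
# while a fully active network is heavily penalized.
print(constraint_penalty(np.eye(4)))        # valid tour
print(constraint_penalty(np.ones((4, 4))))  # all neurons on
```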
“…A number of approaches based upon chaotic neural networks have also been proposed to solve the TSP, and excellent results with less local minima have been obtained due to the global search capability of the chaotic networks [6]-[10]. It has been widely recognized that the Hopfield-Tank (H-T) formulation [1] of energy function often causes infeasible solutions in solving the TSP [11]. The inter-relationship among the parameters suggests that the H-T formulation for TSP does not have a good scaling property and only a small range of parameter combinations will result in valid and stable solutions, as indicated by the small percentage of valid tours in approaches based on the H-T formulation [12], [13].…”
Section: Introduction
confidence: 99%
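The "percentage of valid tours" measure mentioned above amounts to checking whether the converged neuron state is a permutation matrix. A minimal sketch of such a check (function and parameter names are my own, not from the cited papers):

```python
import numpy as np

def is_valid_tour(v, threshold=0.5):
    """Check whether a converged n x n neuron state encodes a valid tour.

    Neurons with activation above `threshold` (an assumed cutoff) are
    treated as "on"; the state is a valid tour iff the binarized matrix
    has exactly one active neuron in every row and every column.
    """
    m = (np.asarray(v, dtype=float) > threshold).astype(int)
    return bool(np.all(m.sum(axis=0) == 1) and np.all(m.sum(axis=1) == 1))

# A near-permutation state binarizes to a valid tour; a saturated
# network (all neurons on) does not.
print(is_valid_tour(0.9 * np.eye(5)))   # valid
print(is_valid_tour(np.ones((5, 5))))   # invalid
```

Running many trials with random initial states and averaging this predicate over the converged results gives the valid-tour percentage used to compare parameter settings.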
“…They had logical conditions on both row and column plus data-terms from both the previous and next columns for each neuron. This allows us to avoid the problems with convergence and performance, as described by Wilson and Pawley [13]. In Figure 3 the convergence of a 155-neuron network with 31 rows and five columns for clustering 31 pieces of evidence into five subsets is shown. This leads here to a global optimum being found in 51 iterations.…”
Section: Neural Structure
confidence: 99%