2022
DOI: 10.1109/tnnls.2021.3089023

Finding Optimal Paths Using Networks Without Learning—Unifying Classical Approaches

Abstract: Trajectory or path planning is a fundamental issue in a wide variety of applications. In this article, we show that it is possible to solve path planning on a maze for multiple start and end points highly efficiently with a novel configuration of multilayer networks that use only weighted pooling operations, for which no network training is needed. These networks create solutions identical to those from classical algorithms such as breadth-first search (BFS), Dijkstra's algorithm, or TD(0). Diff…
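To make the mechanism described in the abstract concrete, the sketch below illustrates the general idea: a value wavefront is propagated from the goal over the free cells of a grid maze using only a discounted max-pooling update, with no training, and a shortest path is then read off by greedy ascent. This is a minimal illustration under assumed conventions (the function name plan_path, the 4-connected neighbourhood, and the discount gamma are choices made here), not the authors' released implementation.

import numpy as np

def plan_path(free, start, goal, gamma=0.95):
    """free: 2D bool array (True = traversable); start, goal: (row, col) tuples."""
    value = np.zeros(free.shape)
    value[goal] = 1.0
    shifts = [(-1, 0), (1, 0), (0, -1), (0, 1)]  # 4-connected neighbourhood

    # Wavefront propagation: every free cell pools the discounted maximum of its
    # neighbours; iterating this update to convergence mimics stacked pooling layers.
    for _ in range(free.size):
        pooled = np.zeros_like(value)
        for dr, dc in shifts:
            shifted = np.roll(value, (dr, dc), axis=(0, 1))
            # undo the wrap-around that np.roll introduces at the borders
            if dr == 1:
                shifted[0, :] = 0
            elif dr == -1:
                shifted[-1, :] = 0
            if dc == 1:
                shifted[:, 0] = 0
            elif dc == -1:
                shifted[:, -1] = 0
            pooled = np.maximum(pooled, gamma * shifted)
        new_value = np.where(free, pooled, 0.0)
        new_value[goal] = 1.0
        if np.array_equal(new_value, value):
            break
        value = new_value

    # Greedy ascent from the start: step to the highest-valued free neighbour,
    # which on an unweighted grid reproduces a BFS-shortest path.
    path, pos = [start], start
    while pos != goal:
        r, c = pos
        candidates = [(r + dr, c + dc) for dr, dc in shifts
                      if 0 <= r + dr < free.shape[0]
                      and 0 <= c + dc < free.shape[1]
                      and free[r + dr, c + dc]]
        if not candidates:
            return None
        nxt = max(candidates, key=lambda p: value[p])
        if value[nxt] <= 0.0:  # goal is unreachable from this cell
            return None
        path.append(nxt)
        pos = nxt
    return path

Folding per-cell or per-edge costs into the discount would give Dijkstra-like behaviour, and reading the update as a Bellman backup links it to TD(0), consistent with the unification the title refers to.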

Cited by 5 publications (4 citation statements)
References 36 publications
“…These approaches, however, may not always give optimal solutions or may fail to find solutions at all, especially for larger graphs. Some other bioinspired neural networks were proposed [22], [23], [24], [25], [26], [27], [28] for solving path-planning problems. These approaches work on grid structures, where activity in the network is propagated from the source neuron to the neighboring neurons until activity propagation within the whole network is finished.…”
Section: A. State of the Art (mentioning)
confidence: 99%
“…Some other bio-inspired neural networks were proposed [22], [23], [24], [25], [26], [27], [28] for solving path planning problems. These approaches work on grid structures, where activity in the network is propagated from the source neuron to the neighbouring neurons until activity propagation within the whole network is finished.…”
Section: State-of-the-Art (mentioning)
confidence: 99%
“…Herzog, S., see Kulvicius, T., TNNLS Dec. 2022…” (excerpt from the TNNLS 2022 author index)
mentioning
confidence: 99%
“…Kulvicius, T., TNNLS Dec. 2022 7877-7887…” (excerpt from the TNNLS 2022 author index)
mentioning
confidence: 99%