2022
DOI: 10.1108/aa-10-2021-0125

Towards extreme learning machine framework for lane detection on unmanned mobile robot

Abstract: Purpose – This paper focuses on lane detection for unmanned mobile robots. A mobile robot cannot afford to spend much time detecting the lane, so detecting the lane quickly in complex environments, such as poor illumination and shadows, becomes a challenge. Design/methodology/approach – A new learning framework based on an integration of extreme learning machine (ELM) and an inception structure, named multiscale ELM, is proposed, making full use of the advantage that ELM has faster convergence a…
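For orientation, the sketch below shows a plain single-hidden-layer ELM regressor, the base technique the abstract names. It is not the paper's multiscale/inception variant; the sigmoid activation, hidden-layer width and toy data are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_fit(X, T, n_hidden=64):
    """Fit output weights for a basic single-hidden-layer ELM.

    Hidden weights W and biases b are drawn at random and never trained;
    only the output weights beta are solved in closed form via the
    Moore-Penrose pseudoinverse, which is what gives ELM its fast
    convergence relative to iteratively trained networks.
    """
    n_features = X.shape[1]
    W = rng.standard_normal((n_features, n_hidden))
    b = rng.standard_normal(n_hidden)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))   # sigmoid hidden activations
    beta = np.linalg.pinv(H) @ T             # closed-form output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

# Toy usage: regress y = sin(x) from noisy samples.
X = rng.uniform(-3, 3, size=(200, 1))
T = np.sin(X) + 0.05 * rng.standard_normal((200, 1))
W, b, beta = elm_fit(X, T)
pred = elm_predict(X, W, b, beta)
print("train MSE:", float(np.mean((pred - T) ** 2)))
```

Because only beta is learned, training reduces to one pseudoinverse computation; the paper builds its multiscale framework on top of this property.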

Cited by 10 publications (2 citation statements)
References: 42 publications
“…In equation (8), the ambient temperature is normalized: x stands for the input and output data and y stands for the normalization result [20]. The ELM improves fitting speed at the expense of model accuracy.…”
Section: Input Layer
confidence: 99%
“…In modern technology, eye control technology has gradually matured. In scientific research, eye control systems are often used to study the relationship between human–computer interaction and psychology (Wijayasinghe et al., 2014); research results on eye control systems have been well received in medicine, robotics, artificial intelligence, games and other fields (Li et al., 2022b; Huang et al., 2020b; Dai et al., 2022; Li et al., 2021d).…”
Section: Introduction
confidence: 99%