2016
DOI: 10.3390/s16030362

Vision-Based Steering Control, Speed Assistance and Localization for Inner-City Vehicles

Abstract: Autonomous route following with road vehicles has gained popularity in the last few decades. In order to provide highly automated driver assistance systems, different types and combinations of sensors have been presented in the literature. However, most of these approaches apply quite sophisticated and expensive sensors, and hence, the development of a cost-efficient solution still remains a challenging problem. This work proposes the use of a single monocular camera sensor for an automatic steering control, s…

Cited by 18 publications (15 citation statements)
References 43 publications
“…Many of these approaches use monocular vision for this task. An example is the work in [ 9 ], where lines painted on the road are detected by a single monocular camera, and an automatic steering control, speed assistance for the driver and localization of the vehicle are presented. In [ 10 ], the authors go one step further, trying to predict pedestrian behavior based on the Gaussian process, dynamical models and probabilistic hierarchical trajectory matching.…”
Section: Previous Work
confidence: 99%
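The citation above describes steering a vehicle from lane lines detected by a monocular camera. As a minimal illustrative sketch (not code from the cited work), a detected lateral offset from the lane centre and a heading error can be combined into a bounded proportional steering command; the gains and limits below are hypothetical placeholders:

```python
# Illustrative sketch, not the cited paper's controller: a proportional
# steering law driven by a lane-centre offset as a monocular camera
# pipeline might estimate it. Gains k_offset/k_heading are made-up values.

def steering_command(lane_offset_m, heading_error_rad,
                     k_offset=0.5, k_heading=1.0, max_steer_rad=0.5):
    """Combine lateral offset and heading error into a bounded steering angle.

    lane_offset_m:     lateral distance from lane centre (positive = right)
    heading_error_rad: angle between vehicle heading and lane tangent
    """
    raw = -(k_offset * lane_offset_m + k_heading * heading_error_rad)
    # Saturate to the mechanical steering limit.
    return max(-max_steer_rad, min(max_steer_rad, raw))

# Vehicle drifting 0.4 m right of centre, aligned with the lane:
print(steering_command(0.4, 0.0))  # negative value: steer left
```

Real systems add lookahead and filtering, but the core idea — image-space lane measurements mapped to a saturated steering correction — is the same.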
“…The primary weakness of GNSS stems from the system’s vulnerability to radio frequency interference [20,21,22,23,24,25,26,27,28] and ionospheric effects [29,30,31,32]. The performance of the vision sensor [33,34] can be impeded by environmental factors such as light and weather conditions [35,36,37]. Because of these factors, detecting a driving lane is not a simple task for autonomous vehicles.…”
Section: Introduction
confidence: 99%
“…Most of the above control strategies do not take into account the time delays induced by sensors, which has a large impact on the quality and stability of lateral control. Vision-based sensors, such as monocular cameras, are widely used in lane detecting or vehicle localization due to their low cost, and the vehicle-lane information can be obtained reliably through visual algorithms [ 17 , 18 , 19 , 20 , 21 , 22 ]. However, the computational cost of the visual algorithm is relatively large.…”
Section: Introduction
confidence: 99%
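The last citation notes that sensor-induced time delays, such as the latency of a visual lane-detection algorithm, degrade the quality and stability of lateral control. A toy discrete-time simulation (not from the cited papers; all parameters are invented for illustration) makes the effect concrete — the same proportional controller that converges with a fresh measurement can oscillate and grow when it acts on a delayed one:

```python
# Illustrative sketch: effect of measurement delay on proportional lateral
# control. Dynamics y' = u, controller u = -k * y(t - delay), Euler steps.
# Gain k, step dt, and the delay are hypothetical values, not from the paper.

def simulate(delay_steps, k=5.0, dt=0.1, y0=1.0, n=30):
    """Return the lateral-error trajectory under a delayed measurement."""
    ys = [y0]
    for t in range(n):
        measured = ys[max(0, t - delay_steps)]  # stale camera estimate
        ys.append(ys[-1] - k * dt * measured)
    return ys

no_delay = simulate(0)   # fresh measurement: error halves every step
delayed = simulate(4)    # 0.4 s processing delay: response oscillates
print(abs(no_delay[-1]), abs(delayed[-1]))
```

With zero delay the error contracts geometrically toward zero, while four steps of delay push the closed loop toward instability at the same gain — which is why delay-aware controllers (or cheaper, faster vision algorithms) matter for lateral control.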