2016 IEEE International Conference on Consumer Electronics-Asia (ICCE-Asia)
DOI: 10.1109/icce-asia.2016.7804748
Near real-time ego-lane detection in highway and urban streets

Cited by 16 publications (9 citation statements)
References 10 publications
“…The transformation matrix H can be calculated using the equation above. The matrix H is then used to map the input images from the first-person view to the bird's-eye-view output through a pixel-by-pixel process [25].…”
Section: Bird's Eye View Transformation
confidence: 99%
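As a concrete illustration of the mapping described above, the sketch below estimates a homography H from four hand-picked road-plane correspondences and warps a frame to a bird's-eye view with OpenCV. The point coordinates, image size, and file names are placeholder assumptions, not values taken from the cited work.

```python
# Minimal sketch: estimate H from four point correspondences on the road
# plane, then remap the whole frame pixel-by-pixel to a bird's-eye view.
import cv2
import numpy as np

# Four points on the road plane in the first-person (camera) image ...
src = np.float32([[560, 460], [720, 460], [1100, 680], [200, 680]])
# ... and where they should land in the bird's-eye-view image.
dst = np.float32([[300, 0], [980, 0], [980, 720], [300, 720]])

# H satisfies dst ~ H @ src in homogeneous coordinates.
H = cv2.getPerspectiveTransform(src, dst)

frame = cv2.imread("frame.png")                    # placeholder input frame
bird = cv2.warpPerspective(frame, H, (1280, 720))  # pixel-by-pixel remapping
cv2.imwrite("bird_view.png", bird)
```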
“…Perspective transform is very useful for transforming the view of a lane as seen from the vehicle into a bird's-eye view [10]. This is important for correctly calculating lane curvature and for lane identification [11].…”
Section: Perspective Transform
confidence: 99%
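The curvature point can be made concrete with a small sketch: once the lane is rectified to a bird's-eye view, a second-order polynomial x = Ay² + By + C fitted to the lane pixels gives the radius of curvature in closed form. The synthetic pixel values and the function name below are illustrative assumptions only.

```python
# Hypothetical illustration: radius of curvature of a lane line fitted in
# the bird's-eye view, where the lane geometry is no longer foreshortened.
import numpy as np

def lane_curvature(xs, ys, y_eval):
    """Radius of curvature of x = f(y) at y_eval (pixel units)."""
    A, B, _ = np.polyfit(ys, xs, 2)  # fit x as a 2nd-order function of y
    return (1 + (2 * A * y_eval + B) ** 2) ** 1.5 / abs(2 * A)

# Synthetic lane pixels following a gentle curve (for demonstration only).
ys = np.linspace(0, 719, 100)
xs = 3e-4 * ys ** 2 + 0.1 * ys + 300
print(lane_curvature(xs, ys, y_eval=719))  # radius at the bottom of the image
```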
“…Before lane detection, the camera is generally installed behind the front windshield, so the captured images contain information that is not related to the lane, such as the sky, vehicles, and trees by the roadside. Through warp perspective mapping (WPM) [37] of the images collected by the camera, the resulting bird-view images mainly contain the road surface and the lane. Furthermore, in the bird-view images the lane lines appear as parallel lines, which makes subsequent processing more convenient.…”
Section: Lane Detection Algorithm
confidence: 99%
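To illustrate why parallel lane lines simplify subsequent processing, the rough sketch below assumes a binarized bird's-eye-view image is already available (e.g. from the WPM step plus a lane-marking threshold; the array and function names are hypothetical) and locates the bases of the two lane lines with a simple column histogram.

```python
# Rough sketch: in the bird's-eye view the two lane lines are near-vertical
# and parallel, so summing lane pixels per column yields two clear peaks.
import numpy as np

def lane_base_positions(bird_bin):
    """Return (left_x, right_x) column indices of the two lane-line bases."""
    h, w = bird_bin.shape
    hist = bird_bin[h // 2:, :].sum(axis=0)  # lane-pixel count per column,
                                             # lower half of the image only
    mid = w // 2
    left_x = int(np.argmax(hist[:mid]))        # strongest peak on the left
    right_x = int(np.argmax(hist[mid:]) + mid) # strongest peak on the right
    return left_x, right_x
```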