2012
DOI: 10.1016/j.robot.2011.08.009

Robust estimation of planar surfaces using spatio-temporal RANSAC for applications in autonomous vehicle navigation

Cited by 29 publications (12 citation statements)
References 61 publications (69 reference statements)
“…Similar to us, Mufti et al [MMH12] introduce a spatio‐temporal RANSAC algorithm for dynamic scan processing. Nevertheless, their setup consists of a moving ToF camera scanning a 3D outdoor scene and their goal is the detection of the planar (static) ground.…”
Section: Related Work (mentioning, confidence: 99%)
“…Matulic et al [8] developed an ubiquitous projection method to create an immersive interactive environment using RANSAC to extract planar surfaces. Mufti et al [9] used time information to estimate planar surfaces and overcome the low resolution and the measurement errors of infrared time-of-flight cameras for autonomous vehicle navigation coupled with RANSAC to find the planes.…”
Section: Detection Of Planar Surfaces (mentioning, confidence: 99%)
“…Oniga et al [21] utilized a random sample consensus (RANSAC) algorithm to detect a road surface and cluster obstacles based on the density of the sensed points, and Mufti et al [22] presented a spatio-temporal RANSAC framework to detect planar surfaces. Based on the planar features of the ground, the detected area was then segmented.…”
Section: Related Work (mentioning, confidence: 99%)
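
The excerpts above all build on RANSAC-based plane estimation for ground detection. As context, the sketch below shows a plain 3-point RANSAC plane fit over a NumPy point cloud. It is a minimal illustration only: it does not reproduce the spatio-temporal variant of Mufti et al., which additionally draws samples across time-indexed frames; the function name `ransac_plane` and the thresholds are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def ransac_plane(points, n_iters=500, inlier_thresh=0.05, rng=None):
    """Fit a plane (n, d) with n . p + d = 0 to a 3D point cloud via RANSAC.

    Illustrative only: a plain spatial RANSAC, not the spatio-temporal
    variant of Mufti et al., which also samples across frames.
    """
    rng = np.random.default_rng(rng)
    best_inliers = np.zeros(len(points), dtype=bool)
    best_model = None
    for _ in range(n_iters):
        # Minimal sample: 3 non-collinear points define a candidate plane.
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-9:          # degenerate (collinear) sample, skip
            continue
        normal /= norm
        d = -normal @ p0
        # Consensus set: points within the distance threshold of the plane.
        dist = np.abs(points @ normal + d)
        inliers = dist < inlier_thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, (normal, d)
    return best_model, best_inliers

if __name__ == "__main__":
    # Synthetic ground plane near z = 0 plus scattered obstacle points.
    rng = np.random.default_rng(0)
    ground = np.column_stack([rng.uniform(-5, 5, (2000, 2)),
                              rng.normal(0.0, 0.02, 2000)])
    clutter = rng.uniform(-5, 5, (300, 3)) + [0, 0, 1.5]
    pts = np.vstack([ground, clutter])
    (n, d), inl = ransac_plane(pts)
    print("plane normal:", np.round(n, 3), "offset:", round(d, 3),
          "inliers:", int(inl.sum()))
```

In the cited works this kind of consensus step is what separates the (static) ground plane from obstacle returns; the spatio-temporal extension reported by Mufti et al. further constrains the fit using measurements accumulated over time.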