2019 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR)
DOI: 10.1109/ssrr.2019.8848981
Bots2ReC: Radar Localization in Low Visibility Indoor Environments

Cited by 5 publications (8 citation statements)
References 11 publications

“…Further, in this scenario it is more important to establish a robust track than to initialize tracks faster (which would be the more general case). In prior work (Mandischer et al, 2019), we have indicated that radar has no impairments by smoke or dust (see also Fang et al, 2022; Zhang et al, 2022), therefore, we assume that the results are similar to the firefighting use-case. However, future tests will need to support this assumption.…”
Section: Discussion (mentioning)
confidence: 86%
“…In prior work (Mandischer et al, 2019), we observed that the high level of noise in radar scans deployed in indoor environments may be lowered by applying an adaptive threshold. Therefore, the radar scan is initially thresholded using the approach proposed by Otsu (1979).…”
Section: Radar Prefiltering (mentioning)
confidence: 99%
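The prefiltering idea referenced above (Otsu's method applied to radar returns) can be sketched as follows. This is a minimal illustration, assuming the scan is available as a 2D array of return intensities; the array shape and the Rayleigh placeholder data are not taken from the cited papers.

```python
import numpy as np

def otsu_threshold(intensities: np.ndarray, bins: int = 256) -> float:
    """Return the Otsu threshold that maximizes between-class variance."""
    hist, edges = np.histogram(intensities.ravel(), bins=bins)
    prob = hist.astype(float) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2.0

    w0 = np.cumsum(prob)                 # weight of the "noise" class
    w1 = 1.0 - w0                        # weight of the "target" class
    mu0 = np.cumsum(prob * centers)      # unnormalized class-0 means
    mu_total = mu0[-1]

    # Between-class variance for every candidate split; guard divisions by zero.
    with np.errstate(divide="ignore", invalid="ignore"):
        mean0 = mu0 / w0
        mean1 = (mu_total - mu0) / w1
        sigma_b = w0 * w1 * (mean0 - mean1) ** 2
    sigma_b = np.nan_to_num(sigma_b)
    return centers[np.argmax(sigma_b)]

# Hypothetical usage on a radar scan stored as a range x azimuth intensity grid.
scan = np.random.rayleigh(scale=1.0, size=(256, 128))  # placeholder data
thresh = otsu_threshold(scan)
filtered = np.where(scan >= thresh, scan, 0.0)          # suppress low-intensity noise
```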
“…Moreover, a multi-modality framework has been proposed to cope with difficulties in perceiving the environment around the vehicle at low visibility. In such frameworks, the robot's pose was estimated by using a multi-sensory data fusion technique, i.e., thermal imager with IMU [14], [201], event-based camera with IMU [196], [197], [221], thermal imager and LiDAR measurements [16], Radar and LiDAR [214], and more than two sensory data [199], [200], [219].…”
Section: Discussion and Future Research Directions (mentioning)
confidence: 99%
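As a generic illustration of the pose-fusion idea surveyed above (and not the filter used in any of the cited works), the sketch below blends a dead-reckoned heading from an IMU gyro with occasional absolute heading fixes from a second sensor; the gain, rates, and interfaces are assumptions.

```python
import numpy as np

class YawComplementaryFilter:
    """Toy heading fusion: integrate the gyro rate, correct with absolute yaw fixes.

    A generic multi-sensor fusion illustration; the gain and interfaces are assumed.
    """

    def __init__(self, gain: float = 0.05):
        self.yaw = 0.0      # current heading estimate [rad]
        self.gain = gain    # how strongly absolute fixes pull the estimate

    def predict(self, gyro_rate: float, dt: float) -> None:
        # Dead-reckoning step from the IMU yaw rate.
        self.yaw += gyro_rate * dt

    def correct(self, yaw_measured: float) -> None:
        # Blend in an absolute heading (e.g., from thermal or radar scan matching).
        error = np.arctan2(np.sin(yaw_measured - self.yaw),
                           np.cos(yaw_measured - self.yaw))  # wrap to [-pi, pi]
        self.yaw += self.gain * error

# Hypothetical usage: 100 Hz IMU prediction, one absolute correction per second.
f = YawComplementaryFilter()
for k in range(100):
    f.predict(gyro_rate=0.1, dt=0.01)
    if k % 100 == 99:
        f.correct(yaw_measured=0.12)
```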
“…Recently, two localization techniques in low visibility environments were proposed by Mandischer et al [214]: a novel radar-based SLAM and another radar-based localization strategy employing laser maps. These approaches are evaluated in indoor environments with heavy dust formation to emulate the visibility conditions of the grinding process.…”
Section: Localization Techniques in Low-Visibility Environments (mentioning)
confidence: 99%
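One way to picture radar-based localization against a prior laser map (a sketch only; the actual method of Mandischer et al. is not reproduced here) is to score candidate poses by how many radar detections fall onto occupied cells of an occupancy grid built from laser data. All function names and parameters below are illustrative.

```python
import numpy as np

def score_pose(radar_points, occ_grid, resolution, origin, pose):
    """Count radar detections that land on occupied cells of a prior laser map.

    radar_points : (N, 2) detections in the sensor frame [m]
    occ_grid     : 2D boolean occupancy grid built from laser data
    resolution   : cell size [m]; origin: world coordinates of cell (0, 0)
    pose         : (x, y, yaw) candidate robot pose in the map frame
    """
    x, y, yaw = pose
    c, s = np.cos(yaw), np.sin(yaw)
    world = radar_points @ np.array([[c, s], [-s, c]]) + np.array([x, y])
    cells = np.floor((world - origin) / resolution).astype(int)
    inside = ((cells >= 0) & (cells < np.array(occ_grid.shape)[::-1])).all(axis=1)
    cells = cells[inside]
    return occ_grid[cells[:, 1], cells[:, 0]].sum()  # row = y index, col = x index

def localize(radar_points, occ_grid, resolution, origin, candidates):
    """Brute-force search over candidate poses; return the best-scoring one."""
    scores = [score_pose(radar_points, occ_grid, resolution, origin, p)
              for p in candidates]
    return candidates[int(np.argmax(scores))]
```

In practice the candidate set would come from odometry plus a local search window; here it is simply an externally supplied list of (x, y, yaw) tuples.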
“…The system delivers a small change in distance (6% with outliers) during thick smoke, as well as during a light-smoke test at high temperature. Mandischer et al. [39] introduce a strategy to navigate in low visibility environments with radar and an efficient radar filtering process. The process consists of three steps and enables a consistent noise reduction.…”
Section: Radar (mentioning)
confidence: 99%
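The three filtering steps are not spelled out in the statement above, so the pipeline below is an assumption for illustration only (global intensity threshold, spatial median denoising, temporal accumulation over recent scans), not the actual process of Mandischer et al.

```python
import numpy as np
from scipy.ndimage import median_filter

def filter_radar_scan(scan: np.ndarray, history: list, keep: int = 3) -> np.ndarray:
    """Hypothetical three-stage noise reduction for a radar intensity grid.

    The stages (threshold -> spatial median -> temporal accumulation) are an
    assumption for illustration, not the pipeline of the cited work.
    """
    # 1) Global intensity threshold: drop weak returns (fixed fraction of the max).
    thresholded = np.where(scan >= 0.2 * scan.max(), scan, 0.0)

    # 2) Spatial denoising: a small median filter removes isolated speckle.
    denoised = median_filter(thresholded, size=3)

    # 3) Temporal accumulation: average the last `keep` scans to stabilize targets.
    history.append(denoised)
    del history[:-keep]
    return np.mean(history, axis=0)

# Hypothetical usage over a stream of scans.
history = []
for _ in range(5):
    scan = np.random.rayleigh(size=(256, 128))  # placeholder data
    stable = filter_radar_scan(scan, history)
```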