2014 IEEE/ION Position, Location and Navigation Symposium (PLANS 2014)
DOI: 10.1109/plans.2014.6851426
Chameleon on fire — Thermal infrared indoor positioning

Abstract: In this paper we present a system for positioning and mapping, primarily for use in smoke diver applications. The system is based on a stereo pair of thermal infrared cameras and is shown to produce trajectory and mapping estimates while used in environments with sufficient thermal contrast. The system is evaluated in, e.g., a test facility for smoke divers, with good results.

Cited by 14 publications (11 citation statements). References 5 publications.
“…The choice to compare against state‐of‐the‐art visual and visual‐inertial approaches was motivated by the observation that, to the authors’ best knowledge, of the few methods that utilize thermal cameras for odometry estimation, none of them utilized full radiometric data but instead chose to operate on rescaled thermal imagery. This, in turn, makes these methods, and also the very few approaches that fused inertial cues (Emilsson & Rydell, ; Papachristos et al, ), more in line with pure visual odometry‐based solutions. Hence, it is meaningful to directly compare against state‐of‐the‐art visual and visual‐inertial approaches, open‐sourced and available in their original implementations, operating on rescaled thermal images.…”
Section: Experimental Evaluation
Mentioning confidence: 86%
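The "rescaled thermal imagery" these citing works refer to can be illustrated with a minimal sketch: raw radiometric counts (e.g. 14-bit) are min-max normalized to 8-bit intensity, after which only relative thermal contrast survives and the frame can be fed to ordinary visual-odometry pipelines. The function name and the sample sensor values below are hypothetical, not from the paper:

```python
import numpy as np

def rescale_radiometric(frame: np.ndarray) -> np.ndarray:
    """Min-max rescale a raw radiometric thermal frame to 8-bit intensity.

    Absolute temperature information is discarded; only relative thermal
    contrast within the frame is preserved.
    """
    lo, hi = frame.min(), frame.max()
    if hi == lo:  # flat frame: no thermal contrast at all
        return np.zeros(frame.shape, dtype=np.uint8)
    scaled = (frame.astype(np.float64) - lo) / (hi - lo)
    return (scaled * 255.0).round().astype(np.uint8)

# Hypothetical 14-bit sensor counts for a 2x2 patch.
raw = np.array([[8000, 8100], [8200, 8400]], dtype=np.uint16)
img8 = rescale_radiometric(raw)
# Coldest pixel maps to 0, hottest to 255.
```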
“…The feasibility of utilizing thermal cameras alongside visual cameras for odometry estimation was demonstrated in Emilsson and Rydell () by the development of a hand‐held unit that could be carried by firefighters when navigating through smoke‐filled buildings. The underlying odometry estimation technique relied on scale‐invariant feature transform (SIFT) features (Lowe, ) to establish image correspondences; however, the authors reported lower feature matching performance on thermal images due to the scarcity and weakness of gradients in rescaled thermal images.…”
Section: Related Work
Mentioning confidence: 99%
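The reported weakness of gradients in rescaled thermal images can be made concrete: detectors such as SIFT keep only keypoints whose local contrast and gradient response exceed thresholds, so an image with the same structure but a compressed intensity range yields proportionally weaker gradients and, in practice, fewer usable features. A small sketch under that assumption (the 10x contrast ratio is illustrative only):

```python
import numpy as np

def mean_gradient_magnitude(img: np.ndarray) -> float:
    """Average gradient magnitude over an image; a rough proxy for how
    many contrast-thresholded features a detector could find."""
    gy, gx = np.gradient(img.astype(np.float64))
    return float(np.hypot(gx, gy).mean())

rng = np.random.default_rng(0)
base = rng.random((64, 64))
visual = base * 255.0   # full 8-bit contrast range
thermal = base * 25.0   # same structure, ~10x weaker contrast
# thermal gradients are ~10x weaker than visual gradients
```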
“…Furthermore, in case of a fire, the visual view is reduced or unavailable due to smoke particles. This results in a failure of standard SLAM (simultaneous localization and mapping) algorithms [31]. Although these limitations are widely recognized, some research has already been done on indoor location estimation during a fire.…”
Section: Location Estimation and Understanding
Mentioning confidence: 99%
“…An overview of these systems that can be used in emergency situations is described by Bastos et al [32] and Starr et al [33]. Furthermore, Emilsson et al [31] used a stereo-pair of thermal images with IMU (inertial measurement unit) information to create an overview map or a 3D model of visited environments. From our point of view, however, it is not necessary to create a detailed 3D or complex point cloud model for fire location understanding.…”
Section: Location Estimation and Understanding
Mentioning confidence: 99%
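The mapping from a stereo pair described above rests on standard pinhole triangulation: for a feature matched in both thermal images, depth follows Z = f·B/d, where f is the focal length in pixels, B the camera baseline, and d the disparity. A minimal sketch (all numeric values hypothetical, not taken from the paper):

```python
def stereo_depth(f_px: float, baseline_m: float, disparity_px: float) -> float:
    """Pinhole stereo depth: Z = f * B / d.

    A stereo pair of thermal cameras triangulates scene points the same
    way a visual pair does, provided matching features exist in both
    thermal images.
    """
    return f_px * baseline_m / disparity_px

# 500 px focal length, 0.2 m baseline, 10 px disparity -> 10 m depth.
depth = stereo_depth(500.0, 0.2, 10.0)
```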
“…Considering the need for an infrastructure-free localization system and the stringent size, weight, power and cost (SWaP-C) requirements, it is believed that the accuracy and availability requirements can only be fulfilled by embracing a multisensor fusion approach, utilizing sensors with complementary error characteristics [1]. Sensors and localization sub-systems that are being pursued by different research teams include foot-mounted inertial navigation systems (INS) [6][7], back-mounted pedestrian dead-reckoning systems [8][9][10][11], magnetometers, barometric sensors (using a reference sensor at a known height to counter effects from weather changes), imaging sensors (including visual [12] and thermal infra-red cameras [13]), Doppler radar [14], radio-based ranging [15] using synthetic aperture approaches [16], and cooperative localization approaches [17][18].…”
Section: Introduction
Mentioning confidence: 99%
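The benefit of fusing sensors with complementary error characteristics can be sketched with the simplest case: inverse-variance weighting of two independent estimates of the same quantity, whose fused variance is never worse than the better sensor's. This is an illustrative textbook sketch, not the fusion scheme of any cited system; the numbers are hypothetical:

```python
def fuse(x1: float, var1: float, x2: float, var2: float) -> tuple[float, float]:
    """Inverse-variance (maximum-likelihood) fusion of two independent
    scalar estimates. Returns the fused estimate and its variance:
        1/var = 1/var1 + 1/var2.
    """
    w1, w2 = 1.0 / var1, 1.0 / var2
    x = (w1 * x1 + w2 * x2) / (w1 + w2)
    return x, 1.0 / (w1 + w2)

# e.g. a drift-prone dead-reckoning fix (variance 4) combined with a
# noisier-looking but independent radio range fix (variance 1):
x, var = fuse(10.0, 4.0, 12.0, 1.0)
# fused estimate 11.6 with variance 0.8 (better than either input)
```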