2022
DOI: 10.3390/f13030383

A Vision-Based Detection and Spatial Localization Scheme for Forest Fire Inspection from UAV

Abstract: Forest fires are highly unpredictable and extremely destructive, which makes effective prevention and control difficult. Once a fire spreads, it causes devastating damage to natural resources and the ecological environment. To detect early forest fires in real time and provide firefighting assistance, we propose a vision-based detection and spatial localization scheme and develop a system carried on an unmanned aerial vehicle (UAV) with an OAK-D camera. D…
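The abstract is truncated, so as context only: a minimal sketch of how a detection plus a depth reading from a stereo camera such as the OAK-D can yield a 3D position via pinhole back-projection. The function name, intrinsics, and numbers are illustrative assumptions, not the paper's pipeline.

```python
# A minimal sketch of vision-based spatial localization: back-projecting the
# center of a detected fire bounding box into camera coordinates using a
# depth value and pinhole intrinsics. All values below are made up for
# illustration and are not taken from the paper.
def localize(bbox, depth_m, fx, fy, cx, cy):
    """bbox = (x_min, y_min, x_max, y_max) in pixels; returns (X, Y, Z) in meters."""
    u = (bbox[0] + bbox[2]) / 2.0   # pixel center of the detection
    v = (bbox[1] + bbox[3]) / 2.0
    X = (u - cx) * depth_m / fx     # pinhole back-projection
    Y = (v - cy) * depth_m / fy
    return X, Y, depth_m

# Example with hypothetical intrinsics for a 640x400 stereo pair:
print(localize((400, 120, 440, 160), depth_m=25.0,
               fx=450.0, fy=450.0, cx=320.0, cy=200.0))
# -> (5.56, -3.33, 25.0): the fire sits right and above the optical axis.
```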

Cited by 27 publications (16 citation statements). References 39 publications (50 reference statements).
“…The model proposed in this paper was compared against lightweight detection networks currently in wide use across multiple fields. The selected comparison networks included the lightweight one-stage detection model YOLOX [91], the two-stage lightweight detection network ThunderNet [92], the ultra-lightweight anchor-free detection network NanoDet [93], and detection networks that combine lightweight backbones such as ShuffleNetV2 [94] and MobileNetV3 [95] with the detectors proposed in this paper. The comparison experiments were subdivided into detection of targets shot horizontally versus vertically, and the results are shown in Figure 17, Figure 18, and Figure 19, respectively.…”
Section: Results Analysis and Discussion
confidence: 99%
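The kind of cross-model latency comparison described above can be reproduced with a simple timing harness; a minimal PyTorch sketch, where the stand-in model and input shape are assumptions and mAP evaluation is elided since it depends on dataset tooling.

```python
import time
import torch
import torch.nn as nn

def mean_latency_ms(model, input_shape=(1, 3, 416, 416), runs=50):
    """Average forward-pass latency in milliseconds for one model."""
    model.eval()
    x = torch.randn(*input_shape)
    with torch.no_grad():
        for _ in range(5):                      # warm-up passes
            model(x)
        start = time.perf_counter()
        for _ in range(runs):
            model(x)
    return (time.perf_counter() - start) / runs * 1e3

# Usage with a stand-in module; swap in YOLOX, NanoDet, etc. as loaded
# from their respective repositories to fill one row of a comparison table.
print(mean_latency_ms(nn.Conv2d(3, 16, 3, padding=1)))
```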
“…Spatial pyramid pooling is a multi-scale feature-fusion pooling method that preserves object features well [30,31], maintains the feature map shape, and outputs fixed-size features from inputs of any size. Pooling, one of the most basic image-processing operations in deep learning, shrinks the model's feature maps while retaining salient features, which reduces the computational cost, prevents overfitting, and improves the model's generalization ability.…”
Section: Improved Spatial Pyramidal Pooling Structure
confidence: 99%
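A minimal sketch of the shape-preserving SPP variant the quote describes, assuming a PyTorch setting; the pool sizes (5, 9, 13) follow the common YOLO-style design and are not taken from the paper.

```python
import torch
import torch.nn as nn

class SPP(nn.Module):
    def __init__(self, pool_sizes=(5, 9, 13)):
        super().__init__()
        # Stride-1 max pooling with "same" padding keeps the feature map
        # shape, so outputs at every scale can be concatenated channel-wise.
        self.pools = nn.ModuleList(
            nn.MaxPool2d(kernel_size=k, stride=1, padding=k // 2)
            for k in pool_sizes
        )

    def forward(self, x):
        # Concatenate the input with its multi-scale pooled versions:
        # channels grow by (len(pool_sizes) + 1)x while H and W are unchanged.
        return torch.cat([x] + [p(x) for p in self.pools], dim=1)

# Usage: a (1, 64, 52, 52) map becomes (1, 256, 52, 52).
print(SPP()(torch.randn(1, 64, 52, 52)).shape)
```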
“…$I(X, Y) = \sum_{X}\sum_{Y} P(X, Y)\,\log\frac{P(X, Y)}{P(X)\,P(Y)}$ (4). By maximizing the mutual information, the feature separability is also maximised. MIFS selects highly informative features, using mutual information as a way of measuring the relationship between two random variables.…”
confidence: 99%
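Equation (4) can be checked numerically; a minimal sketch, where the 2x2 joint distribution is made up for illustration.

```python
import numpy as np

def mutual_information(p_xy):
    """I(X;Y) = sum_x sum_y P(x,y) * log(P(x,y) / (P(x)P(y)))."""
    p_x = p_xy.sum(axis=1, keepdims=True)    # marginal P(X), shape (n, 1)
    p_y = p_xy.sum(axis=0, keepdims=True)    # marginal P(Y), shape (1, m)
    mask = p_xy > 0                          # convention: 0 * log 0 = 0
    return float(np.sum(p_xy[mask] * np.log(p_xy[mask] / (p_x @ p_y)[mask])))

joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])               # hypothetical P(X, Y)
print(mutual_information(joint))             # ~0.193 nats for this table
```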
“…However, establishing a precise and reliable fire detection method is challenging in terms of the localization, coverage range, and life span of a sensor network [3]. Vision-based techniques address this challenge by performing detection on aerial images or video, helping to notify the firefighting team of the exact fire location [4].…”
Section: Introduction
confidence: 99%