Hyperspectral remote sensing data can be used for civil and military applications to detect and classify target objects that cannot be reliably separated using broadband sensors. The comparatively low spatial resolution is compensated by the fact that small targets, even below image resolution, can still be classified. The goal of this paper is to determine the target-size-to-spatial-resolution ratio required for successful classification of different target and background materials. Airborne hyperspectral data is used to simulate data with known mixture ratios and to estimate the detection threshold for given false alarm rates. The data was collected in July 2014 over Greding, Germany, using airborne aisaEAGLE and aisaHAWK hyperspectral sensors. On the ground, various target materials were placed on natural background. The targets were four square molton patches with an edge length of 7 meters in the colors black, white, grey, and green. In addition, two different types of polyethylene (camouflage nets) with an edge length of approximately 5.5 meters were deployed. Synthetic data is generated from the original data using spectral mixtures: target signatures are linearly combined with different background materials in specific ratios. The simulated mixtures are appended to the original data, and the target areas are removed for evaluation. Commonly used detection algorithms, e.g. the Matched Filter and the Adaptive Cosine Estimator, are used to determine the detection limit. Fixed false alarm rates are employed to find and analyze regions where false alarms typically occur first. A combination of 18 targets and 12 backgrounds is analyzed for three VNIR and two SWIR data sets of the same area.
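The simulation and detection steps described in the abstract can be illustrated with a minimal sketch. The linear mixing model and the matched filter formula below are standard textbook forms, not the paper's exact implementation; all spectra and values are hypothetical toy data.

```python
import numpy as np

def mix_spectra(target, background, fill_fraction):
    """Linear spectral mixing: the pixel is a weighted sum of the target
    and background signatures, weighted by the sub-pixel fill fraction
    (0.0 = pure background, 1.0 = pure target)."""
    return fill_fraction * target + (1.0 - fill_fraction) * background

def matched_filter(pixel, target, mean, cov_inv):
    """Matched filter score: 0 at the background mean, 1 for a pure target.
    cov_inv is the inverse of the background covariance matrix."""
    d = target - mean
    w = cov_inv @ d / (d @ cov_inv @ d)
    return (pixel - mean) @ w

# Toy example with 5 spectral bands (illustrative reflectance values only).
target = np.array([0.80, 0.70, 0.60, 0.50, 0.40])
background = np.array([0.10, 0.20, 0.30, 0.20, 0.10])
mixed = mix_spectra(target, background, 0.25)

# With an identity (whitened) covariance and the background as mean, the
# matched filter score of a linear mixture equals its fill fraction.
score = matched_filter(mixed, target, background, np.eye(5))
print(score)  # ≈ 0.25
```

This linearity is exactly why synthetic mixtures allow the detection threshold to be read off as a minimum fill fraction at a given false alarm rate.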
This paper suggests a method for automatic in-flight boresight calibration of pushbroom scanner images, using an online system with broadband data downlink and near real-time georeferencing of the pushbroom image data. Georeferencing accuracy may decrease during long image acquisition flights due to unstable atmospheric conditions, which may lead to geometric changes in the flight platform. Orthorectification of (hyperspectral) pushbroom scanner data demands knowledge of the extrinsic orientation parameters for every exposure. The most crucial parameters for the transformation of the pose obtained by the inertial navigation system (INS) into the projection center of the imaging sensor are the boresight angles. Utilizing an efficient ray-tracing algorithm and a digital elevation model (DEM), these parameters can be estimated even while flying over uneven and uninhabited areas. Tie points for solving an extended collinearity equation are extracted automatically by the SURF algorithm.
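The role of the boresight angles can be sketched as composing the INS attitude with a small misalignment rotation to obtain the sensor's true orientation. This is a minimal illustration of the standard rotation-composition step, not the paper's calibration procedure; function names and the rotation convention (z-y-x Euler angles) are assumptions.

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def sensor_rotation(roll, pitch, yaw, boresight):
    """Compose the INS attitude with the (small) boresight misalignment to
    obtain the imaging sensor's orientation. Angles in radians;
    boresight = (d_roll, d_pitch, d_yaw)."""
    r_ins = rot_z(yaw) @ rot_y(pitch) @ rot_x(roll)
    r_bore = rot_z(boresight[2]) @ rot_y(boresight[1]) @ rot_x(boresight[0])
    return r_ins @ r_bore
```

In-flight calibration then amounts to estimating the three boresight angles so that rays traced through this rotation onto the DEM hit the automatically extracted tie points.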
Robust detection of vehicles in airborne data is a challenging task, since high variation in the object signatures - depending on data resolution - and often low contrast between objects and background lead to high false classification rates and missed detections. Despite these facts, many applications require reliable results that can be obtained in a short time. In this paper, an object-based approach for vehicle detection in airborne laser scans (ALS) and photogrammetrically reconstructed 2.5D data is described. The focus of this paper lies on a robust object segmentation algorithm as well as the identification of features for a reliable separation between vehicles and background (all non-vehicle objects) across different scenes. The described method is based on three consecutive steps, namely object segmentation, feature extraction, and supervised classification. In the first step, the 2.5D data is segmented and possible targets are identified. The segmentation process is based on morphological top-hat filtering, which retains areas that are smaller than a given filter size and higher (brighter) than their surroundings. This approach is chosen due to the low computational effort of the filter, which allows fast computation even for large areas. The next step is feature extraction. Based on the initial segmentation, features for every identified object are extracted. In addition to frequently used features like height above ground, object area, or point distribution, more complex features like object planarity, entropy in the intensity image, and lineness measures are used. The last step is the classification of each object. For this purpose, a random forest (RF) classifier using the normalized features extracted in the previous step is chosen. RFs are suitable for high-dimensional and nonlinear problems. In contrast to other approaches (e.g. the maximum likelihood classifier), RFs achieve good results even with relatively small training samples.
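The segmentation step can be sketched with a plain NumPy implementation of the white top-hat (image minus its morphological opening), which is what makes small, elevated objects stand out. This is a didactic sketch, not the paper's code; the grid, filter size, and thresholds are hypothetical.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def erode(img, k):
    """Grayscale erosion with a flat k x k structuring element (k odd)."""
    p = np.pad(img, k // 2, mode="edge")
    return sliding_window_view(p, (k, k)).min(axis=(2, 3))

def dilate(img, k):
    """Grayscale dilation with a flat k x k structuring element (k odd)."""
    p = np.pad(img, k // 2, mode="edge")
    return sliding_window_view(p, (k, k)).max(axis=(2, 3))

def white_tophat(img, k):
    """Top-hat = image minus its opening: keeps structures smaller than the
    k x k element and higher (brighter) than their surroundings, while
    suppressing larger elevated areas such as buildings."""
    return img - dilate(erode(img, k), k)

# Toy 2.5D height grid: a small vehicle-sized bump and a large plateau.
grid = np.zeros((40, 40))
grid[5:8, 5:8] = 2.0      # small object, 3 x 3 cells, 2 m high
grid[15:35, 15:35] = 2.0  # large elevated area, 20 x 20 cells

th = white_tophat(grid, 9)
# The small bump survives the top-hat; the large plateau is suppressed.
```

Thresholding `th` and labelling connected components then yields the candidate objects passed on to feature extraction.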
This paper presents three experiments from our HyperGreding'19 campaign that combine multitemporal hyperspectral data to address several essential questions in target detection. The experiments were conducted over Greding, Germany, using a Headwall VNIR/SWIR co-aligned sensor mounted on a drone with a flight altitude of 80 m. Additionally, high-resolution aerial RGB data, GPS measurements, and reference data from a field spectrometer were recorded to support the hyperspectral data pre-processing and the evaluation process for the individual experiments. The focus of the experiments is the detectability of camouflage materials and camouflaged objects. When the goal is to transfer hyperspectral analysis to a practical setting, the analysis must be robust to realistic and changing conditions. The first experiment investigates the SAM and SAMZID approaches for change detection to demonstrate their usefulness for target detection of moving objects within the recorded scene. The goal is to eliminate unwanted changes such as shadow areas. The second experiment evaluates the detection of different camouflage net types over two days. This includes camouflage nets lying in shadow during one flight and brightly illuminated in another due to the varying solar elevation angle over the day. We demonstrate the performance of typical hyperspectral target detection and classification approaches for robust detection under these conditions. Finally, the third experiment aims to detect objects and materials behind the cover of camouflage nets using a camouflage garage. We show that some materials can be detected using an unmixing approach.
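The SAM measure used in the first and second experiments can be illustrated in a few lines. The spectral angle is the textbook formulation; the toy spectra below are hypothetical and only demonstrate why the measure is attractive under changing illumination.

```python
import numpy as np

def spectral_angle(a, b):
    """Spectral Angle Mapper (SAM): angle in radians between two spectra.
    The angle is invariant to positive scaling of either spectrum, so a
    material keeps a small angle to its reference whether it lies in
    shadow or in full sunlight."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.arccos(np.clip(cos, -1.0, 1.0))

# A shadowed pixel modeled as a scaled-down copy of the sunlit spectrum
# still matches its reference, while a different material does not.
reference = np.array([0.6, 0.5, 0.4, 0.3])
shadowed = 0.3 * reference
other = np.array([0.1, 0.3, 0.5, 0.6])
print(spectral_angle(reference, shadowed))  # ≈ 0.0
```

This scale invariance is what allows illumination changes (e.g. shadow areas between flights) to be separated from genuine material changes in the multitemporal comparison.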