This paper presents a method for automatic in-flight boresight calibration of pushbroom scanner images, using an on-line system with a broadband data downlink and near-real-time georeferencing of the pushbroom image data. Georeferencing accuracy may degrade during long image acquisition flights due to unstable atmospheric conditions, which can cause geometric changes in the flight platform. Orthorectification of (hyperspectral) pushbroom scanner data requires knowledge of the exterior orientation parameters for every exposure. The most crucial parameters for transforming the pose obtained by the inertial navigation system (INS) into the projection center of the imaging sensor are the boresight angles. Using an efficient ray tracing algorithm and a digital elevation model (DEM), these parameters can be estimated even while flying over uneven and uninhabited areas. Tie points for solving an extended collinearity equation are extracted automatically by the SURF algorithm.
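The role of the boresight angles can be sketched as follows: a small rotation correcting the misalignment between the INS body frame and the sensor frame is composed with the INS attitude before an image ray is intersected with the terrain. This is a minimal illustration only, not the paper's implementation; the function names are hypothetical, and the full DEM ray tracing is replaced by intersection with a single horizontal elevation level.

```python
import numpy as np

def rotation_matrix(omega, phi, kappa):
    """Rotation matrix from three Euler angles (radians): R = Rz(kappa) @ Ry(phi) @ Rx(omega)."""
    co, so = np.cos(omega), np.sin(omega)
    cp, sp = np.cos(phi), np.sin(phi)
    ck, sk = np.cos(kappa), np.sin(kappa)
    Rx = np.array([[1, 0, 0], [0, co, -so], [0, so, co]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[ck, -sk, 0], [sk, ck, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def sensor_ray_to_ground(ins_angles, boresight_angles, ray_cam, center, dem_height):
    """Project one pixel's viewing ray to the ground.

    The INS attitude is corrected by the (small) boresight rotation before
    the camera-frame ray is transformed into the mapping frame. A flat
    elevation level stands in for the DEM ray tracing of the paper.
    """
    R_ins = rotation_matrix(*ins_angles)          # platform attitude from the INS
    R_bore = rotation_matrix(*boresight_angles)   # sensor-vs-INS misalignment
    ray_map = R_ins @ R_bore @ ray_cam            # ray direction in the mapping frame
    s = (dem_height - center[2]) / ray_map[2]     # scale to reach the terrain height
    return center + s * ray_map
```

Even a boresight error of one milliradian shifts the footprint by roughly one meter per kilometer of flying height, which is why these angles dominate the georeferencing error budget.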
Modern mission characteristics require the use of advanced imaging sensors in reconnaissance. In particular, imaging with high spatial and high spectral resolution provides promising data for many tasks such as classification and the detection of objects of military relevance, such as camouflaged units or improvised explosive devices (IEDs). Especially in asymmetric warfare with highly mobile forces, intelligence, surveillance and reconnaissance (ISR) results need to be available close to real time. This demands the use of unmanned aerial vehicles (UAVs) in combination with downlink capability. The system described in this contribution is integrated in a wing pod for ease of installation and calibration. It is designed for the real-time acquisition and analysis of hyperspectral data. The main component is a Specim AISA Eagle II hyperspectral sensor covering the visible and near-infrared (VNIR) spectral range with a spectral resolution of up to 1.2 nm and 1024 pixels across track, yielding a ground sampling distance below 1 m at typical altitudes. The pushbroom characteristic of the hyperspectral sensor demands an inertial navigation system (INS) for rectification and georeferencing of the image data. Additional sensors are a high-resolution RGB (HR-RGB) frame camera and a thermal imaging camera. For on-line application, the data is preselected, compressed and transmitted to the ground control station (GCS) by an existing system in a second wing pod. The final result after data processing in the GCS is an orthorectified hyperspectral GeoTIFF, which is filed in the ERDAS APOLLO geographic information system. APOLLO allows remote access to the data and offers web-based analysis tools. The system is quasi-operational and was successfully tested in May 2013 in Bremerhaven, Germany.
ABSTRACT: Active learning reduces training costs for supervised classification by acquiring ground truth data only for the most useful samples. We present a new concept for the analysis of active learning techniques. Our framework is split into an outer and an inner view to facilitate the assignment of different influences. The main contribution of this paper is a concept of a new compound analysis in the active learning loop. It comprises three sub-analyses (structural, oracle, and prediction), which are combined into a hypothesis of the usefulness of each unlabeled training sample. Although the analysis is at an early stage, several extensions are highlighted. Furthermore, we show how variations inside the framework lead to many techniques from the active learning literature. In this work we focus on remote sensing, but the proposed method can be applied to other fields as well.
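The active learning loop referred to above can be illustrated by a generic pool-based sketch: in each iteration a usefulness score is computed for every unlabeled sample, the highest-scoring sample is labeled by the oracle, and the classifier is retrained. This is one possible instantiation under simplifying assumptions, not the paper's compound analysis: the usefulness hypothesis is reduced here to a single prediction-based criterion (least confidence), and a plain k-NN probability estimate stands in for an arbitrary classifier. All function names are hypothetical.

```python
import numpy as np

def predict_proba_knn(X_train, y_train, X, k=3):
    """Simple k-NN class-probability estimate (stand-in for any classifier)."""
    classes = np.unique(y_train)
    probs = []
    for x in X:
        d = np.linalg.norm(X_train - x, axis=1)
        nn = y_train[np.argsort(d)[:k]]          # labels of the k nearest neighbors
        probs.append([(nn == c).mean() for c in classes])
    return np.array(probs)

def active_learning_loop(X_pool, oracle, seed_idx, n_queries=5, k=3):
    """Pool-based active learning with least-confidence sampling.

    In each iteration the sample whose current prediction is least
    confident is hypothesized to be the most useful one, labeled by the
    oracle, and added to the training set.
    """
    labeled = list(seed_idx)
    labels = {i: oracle(i) for i in labeled}
    for _ in range(n_queries):
        unlabeled = [i for i in range(len(X_pool)) if i not in labels]
        if not unlabeled:
            break
        X_tr = X_pool[labeled]
        y_tr = np.array([labels[i] for i in labeled])
        p = predict_proba_knn(X_tr, y_tr, X_pool[unlabeled], k)
        pick = unlabeled[int(np.argmin(p.max(axis=1)))]  # least-confident sample
        labels[pick] = oracle(pick)                      # query ground truth
        labeled.append(pick)
    return labeled, labels
```

In the terminology of the framework, swapping the selection line for structural criteria (e.g. cluster density) or oracle-related criteria (e.g. labeling cost) yields other techniques from the active learning literature.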
This paper presents a novel approach for reducing training costs in classification with co-registered hyperspectral (HS) and Light Detection and Ranging (LiDAR) data using an active classification framework. Fully automatic classification can be achieved by unsupervised learning, but it cannot be tailored to specific classes. On the other hand, supervised classification with predefined classes requires many training examples, which must be labeled with ground truth, usually at significant cost. The concept of active classification alleviates these problems through a selection strategy: only selected samples are labeled with ground truth and used as training data. One common selection strategy incorporates, in a first step, the current state of the classification algorithm and chooses only the examples for which the expected information gain is maximal. In a second step, a conventional classification algorithm is trained on this data. By alternating between these two steps, the algorithm reaches high classification accuracy with fewer training samples and therefore lower training costs. The approach presented in this paper involves the user in the active selection strategy, and the k-NN algorithm is chosen for classification. The results further benefit from fusing the heterogeneous information of HS and LiDAR data within the classification algorithm. For this purpose, several HS features, such as vegetation indices, and LiDAR features, such as relative height and roughness, are extracted. This increases the separability between different classes and reduces the dimensionality of the HS data. The practicability and performance of this framework are shown for the detection and separation of different kinds of vegetation, e.g. trees and grass, in an urban area of Berlin. The HS data was obtained by the SPECIM AISA Eagle 2 sensor, the LiDAR data by a Riegl LMS Q560.
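The feature fusion step described above can be sketched as a per-pixel stacking of HS-derived and LiDAR-derived features into one vector. This is a minimal illustration under assumed inputs, not the paper's feature set: a single vegetation index (NDVI) represents the HS features, the relative height (surface model minus terrain model) represents the LiDAR features, and the function names are hypothetical.

```python
import numpy as np

def ndvi(red, nir):
    """Normalized Difference Vegetation Index from a red and a near-infrared band."""
    return (nir - red) / (nir + red + 1e-12)

def fuse_features(hs_red, hs_nir, lidar_dsm, lidar_dtm):
    """Stack HS- and LiDAR-derived features into one vector per pixel.

    NDVI separates vegetation from non-vegetation; the relative height
    (DSM minus DTM) separates tall vegetation (trees) from low
    vegetation (grass).
    """
    rel_height = lidar_dsm - lidar_dtm
    feats = np.stack([ndvi(hs_red, hs_nir), rel_height], axis=-1)
    # z-score normalization so features with heterogeneous units are
    # comparable in the k-NN distance computation
    flat = feats.reshape(-1, feats.shape[-1])
    mu = flat.mean(axis=0)
    sd = flat.std(axis=0) + 1e-12
    return (feats - mu) / sd
```

The normalization matters for k-NN: without it, the relative height in meters would numerically dominate the dimensionless NDVI in the Euclidean distance.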