With recent advances in sensing, multimodal data are becoming readily available for various applications, especially in remote sensing (RS), where many data types such as multispectral imagery (MSI), hyperspectral imagery (HSI), and LiDAR are available. Effective fusion of these multisource datasets is becoming important, since such multimodal features have been shown to generate highly accurate land-cover maps. However, fusion in the context of RS is non-trivial given the redundancy in the data and the large domain differences among modalities. In addition, the feature extraction modules for different modalities rarely interact with one another, which further limits their semantic relatedness. As a remedy, in this paper we propose a feature fusion and extraction framework, FusAtNet, for collective land-cover classification of HSI and LiDAR data. The proposed framework effectively utilizes the HSI modality to generate an attention map through a "self-attention" mechanism that highlights its own spectral features. Simultaneously, a "cross-attention" approach harnesses a LiDAR-derived attention map that accentuates the spatial features of the HSI. These attentive spectral and spatial representations are then explored further, along with the original data, to obtain modality-specific feature embeddings. The modality-oriented joint spectro-spatial information thus obtained is subsequently utilized to carry out the land-cover classification task. Experimental evaluations on three HSI-LiDAR datasets show that the proposed method achieves state-of-the-art classification performance, including on the largest available HSI-LiDAR dataset, University of Houston (Data Fusion Contest 2013), opening new avenues in multimodal feature fusion for classification. * Equal Contribution; Corresponding Author. Figure 1.
Generic schematic of a multimodal fusion-based classification task. The objective is to effectively combine the two modalities (here, HSI and LiDAR) such that the resulting representation has rich, fused features that are relevant and robust enough for accurate classification.
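The gating-style fusion described in the abstract can be sketched in a few lines of NumPy. This is only an illustration of the idea, not the FusAtNet architecture itself: the function names, feature dimensions, and the use of a sigmoid to produce attention maps are all assumptions for demonstration purposes.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def attention_fuse(hsi_feat, lidar_feat, w_self, w_cross):
    """Illustrative fusion: attention maps (values in [0, 1]) gate the
    HSI features before the streams are concatenated for a classifier."""
    # "self-attention": map derived from the HSI features themselves
    self_att = sigmoid(hsi_feat @ w_self)       # shape (n, c)
    # "cross-attention": map derived from the LiDAR features
    cross_att = sigmoid(lidar_feat @ w_cross)   # shape (n, c)
    spectral = hsi_feat * self_att              # attentive spectral features
    spatial = hsi_feat * cross_att              # attentive spatial features
    # joint spectro-spatial representation, alongside the original data
    return np.concatenate([spectral, spatial, hsi_feat], axis=1)

rng = np.random.default_rng(0)
hsi = rng.standard_normal((4, 8))    # 4 pixels, 8 spectral features (toy sizes)
lidar = rng.standard_normal((4, 2))  # 4 pixels, 2 elevation features
fused = attention_fuse(hsi, lidar,
                       rng.standard_normal((8, 8)),
                       rng.standard_normal((2, 8)))
print(fused.shape)  # (4, 24): spectral + spatial + original streams
```

In the actual framework these attention maps are produced by learned convolutional modules; the sketch above just shows the gating-and-concatenation pattern the abstract describes.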
One of the vital growth nutrient parameters of crops is soil nitrogen content. The ability to accurately grasp soil nutrient information is a prerequisite for scientific fertilization in precision agriculture. Information about soil macronutrients such as nitrogen can be obtained quickly through hyperspectral imaging techniques. The objective of this research is to explore the use of a deep learning network to estimate the abundance of urea fertilizer in mixed soils from spectroradiometer data. The proposed approach was tested on silt-clay and loamy soil types. Spectral regions at 1899.2 nm for urea and 2195.1 nm for soils were identified as the optimal spectral absorption features. Accuracy was evaluated using a linear regression model between actual and estimated abundances. At 1899.2 nm, the coefficient of determination (R²) for mixed samples of urea and silt-clay soil was 0.945, while R² for urea-mixed loamy soil was 0.954. Similarly, at 2195.1 nm, R² was 0.953 for urea-mixed silt-clay soil and 0.944 for urea-mixed loamy soil. The results show that the estimated abundances obtained through the derivative analysis for spectral unmixing (DASU)-based deep learning network were more accurate than those from DASU alone. These results were then verified through conventional chemical analysis methods. The outcome of this study is the estimated abundance of urea in mixed soils; it is therefore inferred that hyperspectral imaging may be used in situ to assess the soil fertility status of agricultural land.
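The accuracy metric in this abstract, the coefficient of determination between actual and estimated abundances, can be computed as below. The abundance values here are hypothetical, chosen only to illustrate the calculation; they are not data from the study.

```python
import numpy as np

def r_squared(actual, estimated):
    """Coefficient of determination (R^2) between measured and
    model-estimated abundance fractions."""
    actual = np.asarray(actual, dtype=float)
    estimated = np.asarray(estimated, dtype=float)
    ss_res = np.sum((actual - estimated) ** 2)          # residual sum of squares
    ss_tot = np.sum((actual - actual.mean()) ** 2)      # total sum of squares
    return 1.0 - ss_res / ss_tot

# hypothetical urea abundance fractions, for illustration only
actual = [0.10, 0.25, 0.40, 0.55, 0.70]
estimated = [0.12, 0.23, 0.42, 0.53, 0.72]
print(round(r_squared(actual, estimated), 3))  # 0.991
```

An R² close to 1, as in the reported values around 0.94 to 0.95, indicates that the estimated abundances track the chemically measured ones closely.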