2024. DOI: 10.1109/jstars.2024.3378348
Multimodal Colearning Meets Remote Sensing: Taxonomy, State of the Art, and Future Works

Nhi Kieu, Kien Nguyen, Abdullah Nazib, et al.

Abstract: In remote sensing (RS), multiple data modalities are usually available, e.g., RGB, multispectral, hyperspectral, LiDAR, and SAR. Multimodal machine learning systems, which fuse these rich data modalities, have shown better performance than unimodal systems. Most multimodal research assumes that all modalities are present, aligned, and noiseless at training and test time. However, in real-world scenarios, it is common for one or more modalities to be missing, noisy, or non…
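The missing-modality problem the abstract raises can be illustrated with a minimal late-fusion sketch. This is not the paper's method, just an illustrative assumption: each modality (e.g., RGB, LiDAR) is encoded into a fixed-size embedding, and fusion averages whichever embeddings are present, so the system degrades gracefully when a modality is absent.

```python
import numpy as np

def late_fuse(features):
    """Average the embeddings of whichever modalities are present.

    `features` maps modality name -> embedding vector, or None if that
    modality is missing at inference time.
    """
    present = [f for f in features.values() if f is not None]
    if not present:
        raise ValueError("at least one modality must be present")
    return np.mean(present, axis=0)

# Hypothetical per-modality embeddings (stand-ins for the outputs of
# modality-specific encoders; names and values are illustrative only).
rgb = np.array([0.2, 0.8, 0.1])
lidar = np.array([0.4, 0.6, 0.3])

fused_full = late_fuse({"rgb": rgb, "lidar": lidar})  # both modalities
fused_miss = late_fuse({"rgb": rgb, "lidar": None})   # LiDAR missing
```

With both modalities, the fused vector is their element-wise mean; with LiDAR missing, fusion falls back to the RGB embedding alone, one simple way a multimodal system can remain usable under missing modalities.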

Cited by 0 publications.
References: 182 publications (235 reference statements).