2020
DOI: 10.1016/j.cviu.2019.102863
An Entropic Optimal Transport loss for learning deep neural networks under label noise in remote sensing images

Abstract: Deep neural networks are established as a powerful tool for large-scale supervised classification tasks. The state-of-the-art performance of deep neural networks is conditioned on the availability of a large number of accurately labeled samples. In practice, collecting large, accurately labeled datasets is a challenging and tedious task in most remote sensing image analysis scenarios, so cheap surrogate procedures are employed to label the dataset. Training deep neural networks on such datasets with…

Cited by 28 publications (14 citation statements)
References 47 publications
“…We see that entropy helps get slightly better results; however, when the entropic regularization is too high, the accuracy falls. We conjecture that entropic regularized OT regularizes the neural network because the target prediction is matched to a smoothed source label (see a similar discussion in Damodaran et al., 2019). And it is well known that label smoothing creates class clusters in the penultimate layer of the neural network (Müller et al., 2019).…”
Section: C3 Sensitivity Analysis
confidence: 81%
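The citation statement above links the entropic OT regularization to label smoothing: the prediction is matched against a softened version of the (possibly noisy) label. A minimal sketch of that smoothing step is shown below; the function name and the smoothing strength `eps` are illustrative choices, not values from the cited papers.

```python
import numpy as np

def smooth_labels(one_hot, eps=0.1):
    """Blend a one-hot target with the uniform distribution (label smoothing).

    With K classes, the true class gets probability 1 - eps + eps/K and every
    other class gets eps/K, so the target is no longer a hard 0/1 vector.
    `eps` here is a hypothetical smoothing strength for illustration.
    """
    k = one_hot.shape[-1]
    return (1.0 - eps) * one_hot + eps / k

# Toy example: a 3-class one-hot target softened with eps = 0.3
y = np.array([0.0, 1.0, 0.0])
print(smooth_labels(y, eps=0.3))  # [0.1, 0.8, 0.1]
```

Training against such softened targets penalizes overconfident predictions, which is consistent with the clustering effect in the penultimate layer reported by Müller et al. (2019).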
“…It is known that this could introduce label noise or mislabeled samples in the training set due to several factors such as misregistration, out-dated maps and databases, etc. Thus, in a future work we would like to consider label noise robust classification methods [42,43] to improve the classification performance, and also to incorporate more valuable information and efficient techniques to further reduce the computation time while still increasing the classification accuracy.…”
Section: Discussion
confidence: 99%
“…After observing the features of images and division of scene classes, asymmetric noise was generated by mapping chaparral → agricultural, runway ↔ airplane, tennis court → baseball diamond, river → beach, mobile home park → parking lot, freeway ↔ overpass, sparse residential → buildings, harbor → mobile home park, medium residential ↔ dense residential as shown in Figure 4. For NWPU-RESISC45, baseball diamond → medium residential, beach → river, dense residential ↔ medium residential, intersection → freeway, mobile home park ↔ dense residential, overpass ↔ intersection, tennis court → medium residential, runway → freeway, thermal power station → cloud, wetland → lake, rectangular farmland → meadow, church → palace, commercial area → dense residential are mapped, following [12]. Figure 5 shows representative images in this dataset.…”
Section: Methods
confidence: 99%
“…For AID, the classes are flipped by mapping bareland ↔ desert; center → storage tank; church → center, storage tank; dense residential ↔ medium residential; industrial → medium residential; meadow → farmland; playground → meadow, school; resort → medium residential; school → medium residential, playground; stadium → playground, following [12]. Figure 6 shows examples from this dataset.…”
Section: Methods
confidence: 99%
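The statements above describe asymmetric label noise: each class is flipped only to a fixed set of visually confusable classes, rather than uniformly at random. A minimal sketch of that injection procedure is shown below; the mapping shown is a small illustrative subset of the quoted class pairs, and the `noise_rate` parameter is an assumption, not a value reported in the cited work.

```python
import random

# Illustrative subset of the quoted asymmetric-noise mapping; class names
# are plain strings here, not the datasets' exact label identifiers.
FLIP_MAP = {
    "chaparral": ["agricultural"],
    "runway": ["airplane"],
    "airplane": ["runway"],
    "tennis court": ["baseball diamond"],
    "river": ["beach"],
    "church": ["center", "storage tank"],  # one-to-many flip, as in the AID mapping
}

def inject_asymmetric_noise(labels, flip_map, noise_rate=0.2, seed=0):
    """Flip each label to one of its mapped confusable classes with
    probability `noise_rate`; labels without an entry are left clean."""
    rng = random.Random(seed)
    noisy = []
    for y in labels:
        if y in flip_map and rng.random() < noise_rate:
            noisy.append(rng.choice(flip_map[y]))
        else:
            noisy.append(y)
    return noisy
```

Because the corruption follows plausible class confusions instead of uniform flips, it produces a structured noise pattern that is a harder and more realistic test for noise-robust losses than symmetric noise.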