CMW-Net: Learning a Class-Aware Sample Weighting Mapping for Robust Deep Learning
2023 · DOI: 10.1109/tpami.2023.3271451

Cited by 23 publications (25 citation statements) · References 43 publications
“…The method in Reference 12 reproduced nearly all of the prominent techniques for fine‐grained classification using webly supervised learning: NEIL [67], WSDG [68], Sukhbaatar et al. [69], Bergamo et al. [26], and Xiao et al. [70]. In addition, we directly use the performance of Decoupling [22] and Co‐teaching [23] on the CUB200‐2011 dataset as reported in Reference 71. We report the test accuracy of CMW‐Net‐SL [73] and rerun the code of SELC [72] for comparison.…”
Section: Methods
confidence: 99%
“…Much research has been done on the class-imbalance problem [15-20, 40], and different solutions have been proposed, including under-sampling and over-sampling [41, 42], rebalancing the loss function [15, 17, 43, 44], and learning paradigms such as self-supervised learning [16, 45], transfer learning [18], ensemble learning [46, 47], meta-learning [48], and metric learning [49]. All of these methods operate within a single domain, with data splits for all participants drawn from the same domain, whereas we extend the data-heterogeneity problem to multiple domains and imbalanced classes in a federated learning (FL) environment.…”
Section: Related Work 2.1 Class-Imbalance and Label Distribution
confidence: 99%
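One of the loss-rebalancing ideas listed in the excerpt above can be sketched with inverse-frequency class weights, a common baseline for reweighting the loss. This is a minimal illustration, not the exact method of any cited reference; `inverse_frequency_weights` is a hypothetical helper name.

```python
from collections import Counter

def inverse_frequency_weights(labels):
    """Weight each class inversely to its frequency, normalized so the
    weights average to 1 across classes; rare classes get large weights."""
    counts = Counter(labels)
    n_classes = len(counts)
    total = len(labels)
    return {c: total / (n_classes * counts[c]) for c in counts}

# A 9:1 imbalanced label set: class 1 is the minority.
labels = [0] * 90 + [1] * 10
w = inverse_frequency_weights(labels)
# The minority class receives a weight 9x larger than the majority class,
# so misclassifying a rare sample costs proportionally more in the loss.
```

In practice these weights would be passed to a weighted loss (e.g. the per-class weight argument of a cross-entropy implementation) rather than used standalone.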
“…During training, when there is a significant disparity between positive and negative samples, the model tends to pay more attention to the majority classes, which occur more frequently [30]. As a result, the representations of minority classes are insufficiently learned, and the model is inclined to predict the majority classes while ignoring the minority ones, which can lead to poor classification performance on minority classes.…”
Section: Journal of Chemical Information and Modeling
confidence: 99%
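The failure mode described above can be made concrete with a trivial baseline: a classifier that always predicts the majority class scores high accuracy yet never recovers a single minority sample. `majority_baseline_metrics` is a hypothetical helper used only for illustration.

```python
from collections import Counter

def majority_baseline_metrics(labels):
    """Accuracy and minority-class recall of a degenerate classifier
    that always predicts the most frequent class."""
    counts = Counter(labels)
    majority = max(counts, key=counts.get)
    accuracy = counts[majority] / len(labels)
    minority_recall = 0.0  # minority samples are never predicted
    return majority, accuracy, minority_recall

# A 95:5 imbalanced label set: always predicting class 0 already
# yields 95% accuracy while every minority sample is missed.
labels = [0] * 95 + [1] * 5
m, acc, rec = majority_baseline_metrics(labels)
```

This is why accuracy alone is a misleading metric under imbalance, and why the excerpts above turn to resampling and loss reweighting.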