2021
DOI: 10.1109/tcyb.2019.2916198
Learning Kernel for Conditional Moment-Matching Discrepancy-Based Image Classification

Abstract: Conditional Maximum Mean Discrepancy (CMMD) can capture the discrepancy between conditional distributions by drawing support from nonlinear kernel functions, and it has therefore been used successfully for pattern classification. However, CMMD does not work well on complex distributions, especially when the kernel function fails to correctly characterize the difference between intra-class similarity and inter-class similarity. In this paper, a new kernel learning method is proposed to improve the discrimination perform…


Cited by 14 publications (12 citation statements)
References 30 publications
“…The distribution alignment methods minimize the discrepancy of domains based on common statistics directly, e.g., the first-order statistic based on maximum mean discrepancies (MMD) (Sejdinovic et al 2013;Long et al 2015;Ren et al 2019) and the second-order statistic based on covariance matrices (Sun, Feng, and Saenko 2016;Chen et al 2019a). Inspired by the GANs (Goodfellow et al 2014), lots of adversarial approaches with different purposes are developed.…”
Section: Related Work
confidence: 99%
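The first-order alignment quoted above reduces to comparing kernel mean embeddings of two sample sets. A minimal numpy sketch of the biased empirical MMD² estimate with an RBF kernel (the sample sizes, bandwidth `gamma`, and Gaussian test data are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.1):
    # Pairwise RBF kernel: k(a, b) = exp(-gamma * ||a - b||^2).
    sq = (A ** 2).sum(1)[:, None] + (B ** 2).sum(1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq)

def mmd2(X, Y, gamma=0.1):
    # Biased empirical estimate of squared MMD between samples X and Y:
    # mean(Kxx) + mean(Kyy) - 2 * mean(Kxy); always >= 0 for this estimator.
    return (rbf_kernel(X, X, gamma).mean()
            + rbf_kernel(Y, Y, gamma).mean()
            - 2.0 * rbf_kernel(X, Y, gamma).mean())

rng = np.random.default_rng(0)
same = mmd2(rng.normal(0.0, 1.0, (200, 5)), rng.normal(0.0, 1.0, (200, 5)))
shifted = mmd2(rng.normal(0.0, 1.0, (200, 5)), rng.normal(3.0, 1.0, (200, 5)))
# A mean shift between the two samples yields a much larger MMD^2.
```

Covariance-based alignment replaces these kernel means with second-order statistics, but the comparison pattern is the same.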
“…Kernel-based pattern analysis has been validated in improving classification performance [48]. To make use of the kernel method for classification, one can directly map the input data onto a reproducing kernel Hilbert space (RKHS) and establish a classifier followed by a cross-entropy (CE) loss.…”
Section: Discriminant Learning via the CMMD Loss
confidence: 99%
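The "map onto an RKHS, then train a classifier under a cross-entropy loss" pipeline described in this statement can be sketched with an explicit random-Fourier-feature approximation of the RBF feature map. The feature dimension, classifier weights, and synthetic data below are illustrative assumptions, not the paper's construction:

```python
import numpy as np

rng = np.random.default_rng(0)

def rff_map(X, W, b):
    # Random Fourier features: a finite-dimensional approximation of the
    # RBF kernel's RKHS feature map, phi(x) = sqrt(2/D) * cos(Wx + b).
    return np.sqrt(2.0 / W.shape[1]) * np.cos(X @ W + b)

def cross_entropy(logits, y):
    # Numerically stable softmax cross-entropy, averaged over the batch.
    z = logits - logits.max(1, keepdims=True)
    logp = z - np.log(np.exp(z).sum(1, keepdims=True))
    return -logp[np.arange(len(y)), y].mean()

d, D, C = 5, 256, 3                         # input dim, features, classes
W = rng.normal(0.0, 1.0, (d, D))            # frequencies ~ N(0, I)
b = rng.uniform(0.0, 2.0 * np.pi, D)
X = rng.normal(0.0, 1.0, (100, d))
y = rng.integers(0, C, 100)
phi = rff_map(X, W, b)                      # approximate RKHS embedding
logits = phi @ rng.normal(0.0, 0.1, (D, C)) # linear classifier on embedding
loss = cross_entropy(logits, y)             # near ln(3) for random weights
```

A linear classifier on this embedding is the explicit-feature counterpart of a kernelized classifier trained with the CE loss.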
“…However, this manner relies on the selection of a suitable kernel function, which is expected to effectively capture the difference between intra-class similarity and inter-class one. To address this problem, Ren et al [48] propose a new kernel learning method for CMMD-based image classification. Motivated by its effectiveness in learning a representative kernel function through placing the CMMD loss in the representation learning phase, we design a CMMD loss in FLARE as follows.…”
Section: Discriminant Learning via the CMMD Loss
confidence: 99%
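The CMMD loss referenced in these statements compares empirical conditional embedding operators C_{Y|X} = Ψ(K + λI)⁻¹Φᵀ between two sample sets, expanded into kernel matrices. A hedged numpy sketch of that squared Hilbert–Schmidt distance; the one-hot label kernel, regularizer λ, bandwidth, and test data are illustrative choices, not the exact construction in Ren et al.:

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    sq = (A ** 2).sum(1)[:, None] + (B ** 2).sum(1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq)

def cmmd2(Xs, Ys, Xt, Yt, gamma=0.5, lam=1e-3):
    # Squared HS distance between the empirical conditional embedding
    # operators of (Xs, Ys) and (Xt, Yt), written in kernel matrices:
    # tr(Gs Ls Gs Ks) + tr(Gt Lt Gt Kt) - 2 tr(Gs Lst Gt Kts),
    # with G = (K + lam * I)^(-1).
    Ks, Kt = rbf_kernel(Xs, Xs, gamma), rbf_kernel(Xt, Xt, gamma)
    Ls, Lt = Ys @ Ys.T, Yt @ Yt.T              # linear kernel on one-hot labels
    Kts, Lst = rbf_kernel(Xt, Xs, gamma), Ys @ Yt.T
    Gs = np.linalg.inv(Ks + lam * np.eye(len(Xs)))
    Gt = np.linalg.inv(Kt + lam * np.eye(len(Xt)))
    return (np.trace(Gs @ Ls @ Gs @ Ks)
            + np.trace(Gt @ Lt @ Gt @ Kt)
            - 2.0 * np.trace(Gs @ Lst @ Gt @ Kts))

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, (60, 4))
y = rng.integers(0, 3, 60)
Y = np.eye(3)[y]
Y_shuf = np.eye(3)[rng.permutation(y)]         # shuffling labels changes P(Y|X)
same = cmmd2(X, Y, X, Y)                       # identical conditionals -> 0
diff = cmmd2(X, Y, X, Y_shuf)                  # mismatched conditionals -> > 0
```

Placing such a loss in the representation-learning phase, as the quoted statement describes, lets the learned features shape the kernel that the CMMD criterion measures.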