2015
DOI: 10.1007/978-3-319-23528-8_14
A Kernel-Learning Approach to Semi-supervised Clustering with Relative Distance Comparisons

Cited by 12 publications (7 citation statements)
References 15 publications
“…However, LDA fails if the conditional densities of the data are not multivariate Gaussian [42]. Therefore, we will consider more reliable methods, such as a relative distance comparison method [43] or a triplet method [44], to preserve the discriminant information of both domains.…”
Section: Discussion
confidence: 99%
“…In [21], Mahadevan et al. considered the Riemannian geometry of covariance matrices to minimize geometrical and statistical shifts between domains while learning a metric. Amid et al. [22] proposed an algorithm that learns a kernel matrix using the log-determinant divergence, subject to a set of relative distance constraints. Dai et al. proposed an algorithm called EigenTransfer [23], which learns the spectra of a graph obtained from the learning task, yielding eigenvectors that accurately capture the structure of the graph.…”
Section: Related Work
confidence: 99%
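The log-determinant (Burg) divergence mentioned in [22] measures how far one positive-definite kernel matrix is from another. A minimal sketch of this divergence, assuming SPD inputs (`logdet_div` is an illustrative helper, not code from the cited paper):

```python
import numpy as np

def logdet_div(K, K0):
    """Log-determinant (Burg) divergence between SPD matrices:
    D_ld(K, K0) = tr(K K0^{-1}) - log det(K K0^{-1}) - n.
    It is nonnegative and zero iff K == K0."""
    n = K.shape[0]
    M = K @ np.linalg.inv(K0)          # K K0^{-1}
    _, logdet = np.linalg.slogdet(M)   # stable log-determinant
    return np.trace(M) - logdet - n
```

Minimizing this divergence keeps the learned kernel close to an initial kernel while the relative distance constraints reshape it.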
“…The distances between data points in ML constraints should be small, while the distances between those in CL constraints must be large. However, according to [25,26], relative-distance constraints, such as the inequality constraint set (C neq ) and the equality constraint set (C eq ), are particularly effective in expressing structures at a finer level of detail than ML and CL constraints. Therefore, in our proposed work, we consider relative distance constraints as side information, in addition to the feature vector, for preserving the discriminant information.…”
Section: Introduction
confidence: 99%
“…In relative distance learning, the objective is to learn a novel distance function between data points, taking into account the feature vector and relative distance constraints. We depart from the existing literature [25] by eliciting every relative distance comparison constraint with the question: “Which one of the data points i, j, and k is the least similar (or most dissimilar) to the other two data points?”…”
Section: Introduction
confidence: 99%
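The elicitation question above can be answered mechanically for a given distance function: the "least similar" point is the one excluded by the closest pair. A minimal sketch, assuming plain Euclidean distance (the cited works would instead use a learned kernel or metric; `triplet_outlier` is a hypothetical helper name):

```python
import numpy as np

def triplet_outlier(X, i, j, k):
    """Answer: which of points i, j, k is least similar to the
    other two, under Euclidean distance? Returns the outlier index."""
    d_ij = np.linalg.norm(X[i] - X[j])
    d_ik = np.linalg.norm(X[i] - X[k])
    d_jk = np.linalg.norm(X[j] - X[k])
    # The outlier is the point excluded by the closest pair:
    # map each candidate outlier to the distance of the remaining pair.
    pair_dists = {k: d_ij, j: d_ik, i: d_jk}
    return min(pair_dists, key=pair_dists.get)
```

An answer "k is the outlier" then translates into the relative constraints d(i, j) < d(i, k) and d(i, j) < d(j, k), which is exactly the finer-grained side information the excerpt contrasts with ML/CL constraints.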