CVPR 2011
DOI: 10.1109/cvpr.2011.5995363

AdaBoost on low-rank PSD matrices for metric learning

Abstract: The problem of learning a proper distance or similarity metric arises in many applications such as content-based image retrieval. In this work, we propose a boosting algorithm, MetricBoost, to learn the distance metric that preserves the proximity relationships among object triplets: object i is more similar to object j than to object k. MetricBoost constructs a positive semi-definite (PSD) matrix that parameterizes the distance metric by combining rank-one PSD matrices. Different options of weak models and co…
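To make the construction in the abstract concrete, here is a minimal, hypothetical sketch of a MetricBoost-style learner in NumPy. It is not the authors' implementation: the weak-learner rule (top eigenvector of a weighted triplet matrix) and the AdaBoost-style coefficient are assumptions chosen for illustration.

```python
import numpy as np

def metricboost_sketch(X, triplets, n_rounds=20):
    """Hypothetical MetricBoost-style learner (illustrative, not the paper's code).

    Boosts rank-one PSD weak metrics so that, for each triplet (i, j, k),
    object i ends up closer to j than to k under M = sum_t alpha_t u_t u_t^T.
    """
    n, d = X.shape
    M = np.zeros((d, d))                              # boosted PSD metric
    w = np.full(len(triplets), 1.0 / len(triplets))   # triplet weights

    # Per triplet, A = (x_i - x_k)(x_i - x_k)^T - (x_i - x_j)(x_i - x_j)^T;
    # the triplet constraint holds when the margin tr(M A) is positive.
    A = []
    for i, j, k in triplets:
        dij = (X[i] - X[j])[:, None]
        dik = (X[i] - X[k])[:, None]
        A.append(dik @ dik.T - dij @ dij.T)

    for _ in range(n_rounds):
        # Weak learner (assumed rule): the rank-one Z = u u^T maximizing the
        # weighted margin u^T (sum_t w_t A_t) u is the top eigenvector.
        S = sum(wt * At for wt, At in zip(w, A))
        _, eigvecs = np.linalg.eigh(S)
        u = eigvecs[:, -1:]
        Z = u @ u.T

        margins = np.array([np.sum(Z * At) for At in A])   # tr(Z A_t)
        eps = np.clip(np.sum(w[margins <= 0]), 1e-12, 1 - 1e-12)
        alpha = max(0.5 * np.log((1 - eps) / eps), 0.0)    # AdaBoost-style step

        M += alpha * Z                                     # stays PSD
        w *= np.exp(-alpha * np.sign(margins))             # upweight violations
        w /= w.sum()
    return M
```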

Cited by 28 publications (20 citation statements) · References 19 publications
“…Such an ensemble of weak learners has been proven to be more powerful than a single complex classifier and has better generalization performance [7]. In [8] and [9], a boosting algorithm has been implemented for learning a full distance metric, which has motivated the proposed algorithm in this paper. Their important theorem on trace-one semi-definite matrices is central to the theoretical basis of our approach.…”
Section: Metric Learning Via Boosting
confidence: 99%
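The "important theorem on trace-one semi-definite matrices" referenced here, as stated in the BoostMetric line of work, is that any trace-one PSD matrix can be written as a convex combination of trace-one rank-one matrices. A quick numerical check of that decomposition via eigendecomposition (an illustration, not code from any of the cited papers):

```python
import numpy as np

# Any PSD matrix M with tr(M) = 1 decomposes as a convex combination of
# trace-one rank-one matrices: M = sum_i lambda_i v_i v_i^T, with
# lambda_i >= 0 and sum_i lambda_i = tr(M) = 1.
rng = np.random.default_rng(0)
B = rng.standard_normal((5, 5))
M = B @ B.T
M /= np.trace(M)                          # normalize to trace one

lam, V = np.linalg.eigh(M)                # eigendecomposition of PSD M
rank_ones = [np.outer(V[:, i], V[:, i]) for i in range(5)]
recon = sum(l * Z for l, Z in zip(lam, rank_ones))

assert np.allclose(recon, M)              # convex combination recovers M
assert np.isclose(lam.sum(), 1.0) and (lam >= -1e-12).all()
```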
“…The first and second weak learners are complementary, as they use different face regions for measuring distances, and the resulting same-labeled pairs with large distances are different. Results on AR-Scarf, and comparative convergence results with BoostMetric [29] and AdaboostMetric [4], are also reported. In the following, ESPAC-NS denotes the proposed method with no feature selection.…”
Section: Visualization Of Sparse Structures Of…
confidence: 99%
“…Therefore, weak metrics parameterized by rank-one PSD matrices are learned sequentially. In [4], Bi et al. proposed learning Mahalanobis metrics in an AdaBoost manner.…”
Section: Single-modal Metric Learning
confidence: 99%
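For context, the boosted Mahalanobis metric this statement describes has the standard form below (the notation is illustrative, not copied from [4]):

```latex
% Boosted Mahalanobis metric in its standard form (notation is
% illustrative, not copied from [4]):
\[
  d_M(\mathbf{x}, \mathbf{y})
    = (\mathbf{x} - \mathbf{y})^{\top} M (\mathbf{x} - \mathbf{y}),
  \qquad
  M = \sum_{t=1}^{T} \alpha_t\, \mathbf{u}_t \mathbf{u}_t^{\top},
  \quad \alpha_t \ge 0,
\]
% Each weak metric contributes the squared one-dimensional projection
% alpha_t * (u_t^T (x - y))^2, so M is PSD by construction.
```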
“…Recent works have shown how beneficial it is to learn an optimal metric for a particular task using Metric Learning (ML). Most approaches learn a Mahalanobis metric based on an objective function whose constraints come either from a set of labelled examples or, more frequently, from sets of positive (same class) and negative (different class) pairs. Among the recent works that use metric learning, those of Shen et al. [4] and Bi et al. [2], which introduce boosting-based algorithms, deserve particular attention due to the interesting properties they offer: i) they are efficient and scalable, since no semidefinite programming is required and, at each iteration, only the largest eigenvalue and its corresponding eigenvector are needed; ii) like AdaBoost, they have no parameters to tune and are easy to implement, since no sophisticated optimization techniques are involved. This contrasts with most commonly used ML methods, whose hyper-parameters, often introduced for regularization purposes, have to be adjusted by cross-validation.…”
mentioning
confidence: 99%
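The per-iteration spectral step this quote highlights, finding only the largest eigenvalue and its eigenvector instead of solving a semidefinite program, can be approximated with plain power iteration. The sketch below is illustrative; in practice one would call an existing routine such as scipy.sparse.linalg.eigsh.

```python
import numpy as np

def largest_eigenpair(S, n_iter=200, tol=1e-10):
    """Power iteration for the dominant eigenpair of a symmetric matrix S.

    This is the only spectral computation such a boosting round needs,
    in place of a full semidefinite program. (Illustrative sketch; it finds
    the largest-magnitude eigenvalue, which for these weighted matrices is
    assumed to be the largest positive one.)
    """
    rng = np.random.default_rng(0)
    v = rng.standard_normal(S.shape[0])
    v /= np.linalg.norm(v)
    lam = 0.0
    for _ in range(n_iter):
        w = S @ v
        nrm = np.linalg.norm(w)
        if nrm == 0.0:                 # S annihilated v; stop gracefully
            break
        v = w / nrm
        lam_new = v @ S @ v            # Rayleigh quotient estimate
        if abs(lam_new - lam) < tol:
            lam = lam_new
            break
        lam = lam_new
    return lam, v
```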