2009
DOI: 10.1007/s10994-008-5097-z
An efficient algorithm for learning to rank from preference graphs

Abstract: In this paper, we introduce a framework for regularized least-squares (RLS) type of ranking cost functions and we propose three such cost functions. Further, we propose a kernel-based preference learning algorithm, which we call RankRLS, for minimizing these functions. It is shown that RankRLS has many computational advantages compared to the ranking algorithms that are based on minimizing other types of costs, such as the hinge cost. In particular, we present efficient algorithms for training, parameter selec…

Cited by 54 publications (57 citation statements)
References 31 publications (38 reference statements)
“…One of the earliest and most successful of these methods is the ranking support vector machine RankSVM [8], which optimizes the pairwise hinge loss. Even much more closely related is the ranking regularized least-squares method RankRLS [9,10], previously proposed by some of the present authors. The method is based on minimizing the pairwise regularized squared loss and becomes equivalent to the algorithms proposed in this article, if it is trained directly on the relation graph edges.…”
Section: Links With Existing Ranking Methods
confidence: 92%
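The distinction drawn above, between the pairwise hinge loss of RankSVM and the pairwise regularized squared loss of RankRLS, can be illustrated with a minimal sketch. The function name and the toy preference graph below are hypothetical, not taken from either paper; the sketch only shows how the two losses score the same set of preference edges.

```python
import numpy as np

def pairwise_losses(scores, prefs):
    """Compare the pairwise squared loss (RankRLS-style) with the
    pairwise hinge loss (RankSVM-style) on a toy preference graph.

    scores: predicted scores f(x_i) for each node
    prefs:  list of edges (i, j) meaning x_i is preferred over x_j
    """
    sq = hinge = 0.0
    for i, j in prefs:
        margin = scores[i] - scores[j]
        # squared loss penalizes any deviation of the margin from 1,
        # including margins that are "too large"
        sq += (1.0 - margin) ** 2
        # hinge loss penalizes only margins below 1
        hinge += max(0.0, 1.0 - margin)
    return sq, hinge

scores = np.array([2.0, 1.0, 0.0])
prefs = [(0, 1), (1, 2), (0, 2)]
sq, hinge = pairwise_losses(scores, prefs)
# the (0, 2) edge has margin 2: squared loss penalizes it, hinge does not
```

The over-penalization of large margins by the squared loss is exactly what makes it differentiable everywhere and amenable to the closed-form matrix computations that give RankRLS its computational advantages.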
“…In recent years, several algorithms for learning to rank have been proposed, which can be used for conditional ranking, by interpreting the conditioning node as a query (see e.g. [15,8,5,10,16]). The main application has been in information retrieval, where the examples are joint feature representations of queries and documents, and preferences are induced only between documents connected to the same query.…”
Section: Links With Existing Ranking Methods
confidence: 99%
“…To this end, we train the regularized least-squares (RLS) algorithm to regress the relation values [40]. Extensive empirical results have been reported for reciprocal relations in [41], as a consequence we focus in this article on symmetric relations.…”
Section: An Illustration In Document Retrieval
confidence: 99%
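The regression step mentioned in this last citation, training regularized least-squares to regress relation values, admits a compact kernel formulation. The following is a minimal sketch under assumed notation (the function name, toy data, and linear kernel are illustrative, not from the cited work): solve the dual system (K + λI)a = y and predict with f = Ka.

```python
import numpy as np

def rls_fit(K, y, lam):
    """Kernel regularized least-squares: solve (K + lam*I) a = y
    for the dual coefficients a, so that predictions are f = K @ a."""
    n = K.shape[0]
    return np.linalg.solve(K + lam * np.eye(n), y)

# toy example: a linear kernel on 1-D inputs with noisy relation values
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0.1, 1.1, 1.9, 3.2])   # relation values to regress
K = X @ X.T                          # linear kernel matrix
a = rls_fit(K, y, lam=0.1)
preds = K @ a                        # fitted relation values
```

Because the solution is a single linear solve, hyperparameters such as λ can be re-evaluated cheaply, which is the computational property the RLS line of work exploits.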