2010
DOI: 10.1007/978-3-642-15883-4_32
Conditional Ranking on Relational Data

Abstract: In domains like bioinformatics, information retrieval and social network analysis, one can find learning tasks where the goal consists of inferring a ranking of objects, conditioned on a particular target object. We present a general kernel framework for learning conditional rankings from various types of relational data, where rankings can be conditioned on unseen data objects. Conditional ranking from symmetric or reciprocal relations can in this framework be treated as two important special cases.…

Cited by 13 publications (21 citation statements)
References 18 publications
“…Several efficient machine learning algorithms have been proposed for the special case where the training set consists of a complete bipartite graph, meaning that each possible start-end vertex pair appears exactly once, and a ridge regression loss is minimized. Specifically, [13], [14], [16], [17], [6] derive closed-form solutions based on Kronecker algebraic optimization (see also [33] for the basic mathematical results underlying these studies). Further, iterative methods based on Kronecker product kernel matrix-vector multiplications have been proposed (see e.g.…”
Section: Related Work
confidence: 99%
“…Further, iterative methods based on Kronecker product kernel matrix-vector multiplications have been proposed (see e.g. [10], [14], [16]). …”
Section: Related Work
confidence: 99%
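The Kronecker product kernel matrix-vector multiplication mentioned in these citation statements rests on a standard identity: for the pairwise kernel matrix K = Kv ⊗ Ku, the product K·vec(W) equals vec(Ku·W·Kvᵀ), so the full |U||V| × |U||V| matrix never has to be formed. A minimal numpy sketch of this "vec trick" follows; the names `Ku`, `Kv` and `kron_matvec` are illustrative and not taken from the cited works.

```python
import numpy as np

def kron_matvec(Ku, Kv, w):
    """Compute (Kv ⊗ Ku) @ w without materializing the Kronecker product.

    Uses the identity (A ⊗ B) vec(X) = vec(B X Aᵀ) with column-major vec;
    Ku is m x m, Kv is n x n, and w has length m * n.
    """
    m, n = Ku.shape[0], Kv.shape[0]
    W = w.reshape(n, m).T            # unvec: w = vec(W), column-major stacking
    return (Ku @ W @ Kv.T).T.reshape(-1)

# Sanity check against the explicit Kronecker product on a tiny example.
rng = np.random.default_rng(0)
Ku = rng.standard_normal((3, 3))
Kv = rng.standard_normal((4, 4))
w = rng.standard_normal(12)

direct = np.kron(Kv, Ku) @ w         # O((mn)^2) memory
fast = kron_matvec(Ku, Kv, w)        # O(mn) memory
assert np.allclose(direct, fast)
```

The implicit matvec costs O(m²n + mn²) time and O(mn) memory, versus O(m²n²) for the explicit product, which is what makes the iterative solvers cited above practical.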
“…When computing the test performance, we consider all the edges in the test set, except those starting and ending at the same node. We train the RLS algorithm using conjugate gradient optimization with early stopping [42]; optimization is terminated once the MSE on the validation set has failed to decrease for 10 consecutive iterations. The mean predictor achieves around 145 MSE test performance on this data.…”
Section: An Illustration In Document Retrieval
confidence: 99%
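The early-stopping rule described in this statement, terminating once validation MSE has not improved for 10 consecutive iterations, can be sketched as a generic wrapper around any iterative solver. This is an illustrative sketch, not the authors' code; `step`, `predict`, and the toy gradient-descent usage below are assumed names.

```python
import numpy as np

def train_with_early_stopping(step, predict, X_val, y_val,
                              patience=10, max_iter=1000):
    """Run `step()` (one solver iteration, returns the current model) until
    validation MSE fails to decrease for `patience` consecutive iterations.
    Returns the best model seen and its validation MSE."""
    best_mse, best_model, stall = np.inf, None, 0
    for _ in range(max_iter):
        model = step()
        mse = np.mean((predict(model, X_val) - y_val) ** 2)
        if mse < best_mse:
            best_mse, best_model, stall = mse, model, 0
        else:
            stall += 1
            if stall >= patience:
                break                # early stop: no improvement for `patience` iters
    return best_model, best_mse

# Toy usage: gradient-descent least squares with a held-out validation split.
rng = np.random.default_rng(0)
X = rng.standard_normal((80, 5))
w_true = rng.standard_normal(5)
y = X @ w_true + 0.1 * rng.standard_normal(80)
X_tr, y_tr, X_val, y_val = X[:60], y[:60], X[60:], y[60:]

w = np.zeros(5)
def step():
    global w
    w = w - 0.01 * X_tr.T @ (X_tr @ w - y_tr)   # one gradient step
    return w.copy()

best_w, best_mse = train_with_early_stopping(
    step, lambda m, Xq: Xq @ m, X_val, y_val)
```

Returning the best model seen (rather than the last iterate) makes the rule robust to the small validation-MSE fluctuations that trigger the stop.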