Regularized Least-Squares for Parse Ranking (2005)
DOI: 10.1007/11552253_42

Cited by 10 publications (28 citation statements: 0 supporting, 28 mentioning, 0 contrasting)
References 6 publications
“…For a more detailed description, we refer to Tsivtsivadze et al (2005). Next, we describe the task of parse ranking.…”
Section: Ranking of Dependency Parses (mentioning, confidence: 99%)
“…However, the ranking performance of the heuristics has been found to be poor when applied to biomedical text (Pyysalo et al 2006), and hence subsequent ranking or selection methods are needed. In our previous studies, we used regularized least-squares regression for the reranking task that notably outperformed the LG heuristics (Tsivtsivadze et al 2005).…”
Section: Ranking of Dependency Parses (mentioning, confidence: 99%)
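
As a rough illustration of the reranking setup this quote describes (a minimal sketch, not the authors' implementation), the following Python code fits a regularized least-squares regressor that maps candidate-parse feature vectors to goodness scores and then ranks candidates by predicted score. The feature dimensions, training data, and regularization parameter lam are all invented for illustration.

import numpy as np

def rls_fit(X, y, lam=1.0):
    # Primal regularized least-squares (ridge) solution:
    # w = (X^T X + lam * I)^{-1} X^T y
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def rerank(candidates, w):
    # Score each candidate parse's feature vector, return indices best-first.
    scores = candidates @ w
    return np.argsort(-scores)

# Illustrative data: 5 training parses with 3 features each, labeled
# with a goodness score (e.g. an F-score against the gold parse).
X_train = np.random.rand(5, 3)
y_train = np.random.rand(5)
w = rls_fit(X_train, y_train, lam=0.1)
print(rerank(np.random.rand(4, 3), w))  # candidate indices, best first

At prediction time the learned regressor replaces the parser's built-in heuristics: candidate parses for a sentence are scored and the highest-scoring one is selected.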
“…Recently, we proposed a method for dependency parse ranking [8] that uses Regularized Least-Squares (RLS) algorithm [9] and grammatically motivated features. The method, called RLS ranker, worked notably better giving 0.42 correlation compared to 0.16 of the LG heuristics.…”
Section: Introduction (mentioning, confidence: 99%)
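
RLS is commonly used in its kernel (dual) form, where examples are compared through a kernel function rather than explicit feature vectors; the sketch below shows that form under the assumption that a kernel matrix is available. The linear kernel here is only a stand-in for a structured parse kernel, and all data are made up.

import numpy as np

def rls_dual_fit(K, y, lam=1.0):
    # Dual RLS: solve (K + lam * I) a = y for the coefficient vector a.
    n = K.shape[0]
    return np.linalg.solve(K + lam * np.eye(n), y)

def rls_dual_predict(K_test, a):
    # Prediction for a new example x is sum_i a_i * k(x_i, x),
    # i.e. the row of test-vs-train kernel values times a.
    return K_test @ a

# Illustrative stand-in: a linear kernel on random feature vectors.
X = np.random.rand(6, 4)
K = X @ X.T
a = rls_dual_fit(K, np.random.rand(6), lam=0.5)
X_new = np.random.rand(2, 4)
print(rls_dual_predict(X_new @ X.T, a))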
“…In this paper, we propose a Locality-Convolution (LC) kernel that provides a correlation of 0.46 when used in RLS algorithm. In all experiments, we applied the F-score based parse goodness function [8], and evaluated the ranking performance with Kendall's correlation coefficient τ b described in [10].…”
Section: Introduction (mentioning, confidence: 99%)
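
The evaluation step in this quote, comparing a ranker's predicted scores against F-score based goodness values with Kendall's correlation coefficient τ_b, can be reproduced with SciPy, whose kendalltau function computes the tie-adjusted τ_b variant by default. The score values below are invented for illustration.

from scipy.stats import kendalltau

# Invented example: predicted ranking scores vs. F-score based
# goodness values for four candidate parses of one sentence.
predicted = [0.9, 0.4, 0.7, 0.1]
goodness = [0.85, 0.50, 0.50, 0.20]  # ties are handled by the tau-b variant
tau_b, p_value = kendalltau(predicted, goodness)
print(f"Kendall tau_b = {tau_b:.2f} (p = {p_value:.3f})")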