Second Meeting of the North American Chapter of the Association for Computational Linguistics on Language Technologies, 2001
DOI: 10.3115/1073336.1073344
Text and knowledge mining for coreference resolution

Cited by 48 publications (43 citation statements) · References 13 publications
“…Others such as Ng and Cardie [7] and Harabagiu et al [10] also try to filter out less important or very easy positive instances to force the learning algorithm to specialize on the more difficult cases. Ng and Cardie [7] propose both negative sample selection (the reduction of the number of negative instances) and positive sample selection (the reduction of the number of positive instances), both under-sampling strategies aiming to create a better coreference resolution system.…”
Section: Related Research on Instance Selection · Citation type: mentioning · Confidence: 99%
“…Ng and Cardie [7] propose both negative sample selection (the reduction of the number of negative instances) and positive sample selection (the reduction of the number of positive instances), both under-sampling strategies aiming to create a better coreference resolution system. Given the observation that one antecedent is sufficient to resolve an anaphor, they present a corpus-based method for the selection of easy positive instances, which is inspired by the example selection algorithm introduced in [10]. The assumption is that the easiest types of coreference relationships to resolve are the ones that occur with high frequencies in the training data.…”
Section: Related Research on Instance Selection · Citation type: mentioning · Confidence: 99%
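Read together, these excerpts describe two complementary under-sampling steps over mention-pair training instances: negative sample selection keeps only the hard negatives between an anaphor and its closest gold antecedent, while positive sample selection keeps only the "easy", high-frequency positive relationships. The sketch below is a rough illustration of that idea under simplifying assumptions; the Mention fields, the string-match "relationship type", and the min_count threshold are invented for the example and do not reproduce the exact procedures of Ng and Cardie [7] or Harabagiu et al. [10].

```python
from collections import Counter
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Mention:
    text: str
    chain_id: Optional[int]   # gold coreference chain id; None for singletons

def closest_antecedent_index(j: int, mentions: List[Mention]) -> Optional[int]:
    """Index of the nearest preceding mention in the same gold chain as mentions[j]."""
    anaphor = mentions[j]
    if anaphor.chain_id is None:
        return None
    for i in range(j - 1, -1, -1):
        if mentions[i].chain_id == anaphor.chain_id:
            return i
    return None

def negative_sample_selection(mentions: List[Mention]) -> List[Tuple[Mention, Mention, bool]]:
    """Negative sample selection (under-sampling of negatives): for each anaphor,
    keep one positive pair with its closest gold antecedent and create negatives
    only from the mentions lying between the two; farther, easier negatives are dropped."""
    instances = []
    for j in range(len(mentions)):
        i = closest_antecedent_index(j, mentions)
        if i is None:
            continue
        instances.append((mentions[i], mentions[j], True))
        for k in range(i + 1, j):
            instances.append((mentions[k], mentions[j], False))
    return instances

def relation_type(antecedent: Mention, anaphor: Mention) -> str:
    """A toy 'coreference relationship type' (an assumption of this sketch):
    exact string match versus everything else."""
    return "string_match" if antecedent.text.lower() == anaphor.text.lower() else "other"

def positive_sample_selection(instances: List[Tuple[Mention, Mention, bool]],
                              min_count: int = 2) -> List[Tuple[Mention, Mention, bool]]:
    """Positive sample selection (under-sampling of positives): keep only positive
    pairs whose relationship type occurs frequently among the training positives,
    dropping rare, harder positives."""
    type_counts = Counter(relation_type(a, b) for a, b, y in instances if y)
    return [(a, b, y) for a, b, y in instances
            if not y or type_counts[relation_type(a, b)] >= min_count]

# Example: build under-sampled training instances for a toy document.
doc = [Mention("Barack Obama", 1), Mention("the president", 1),
       Mention("a reporter", None), Mention("Obama", 1)]
pairs = positive_sample_selection(negative_sample_selection(doc))
```

The design point carried over from the excerpt is that one antecedent is sufficient to resolve an anaphor, so the positive side can afford to be selective, while negatives are limited to the hardest intervening candidates.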