Proceedings of the Seventh Conference on Natural Language Learning at HLT-NAACL 2003 - 2003
DOI: 10.3115/1119176.1119207
Meta-learning orthographic and contextual models for language independent named entity recognition

Abstract: This paper presents a named entity classification system that utilises both orthographic and contextual information. The random subspace method was employed to generate and refine attribute models. Supervised and unsupervised learning techniques were used in the recombination of models to produce the final results.

Cited by 10 publications (7 citation statements)
References 11 publications (7 reference statements)
“…It is rooted in the theory of stochastic discrimination [28], and it has common points with bagging, but instead of sampling instances, it samples subspaces [29]. RSM has been successfully applied to different problems [30], [31]. For instance, in selection based on genetic algorithms, we can evolve the input subspace instead of using RSM, which is a better solution for the stability of the k-NN rule with respect to sampling (see Section III-B).…”
Section: A. Using Standard Instance Selection Methods
confidence: 98%
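The subspace sampling contrasted with bagging above can be sketched minimally. This is an illustrative sketch, not the cited implementation: the base learner (a 1-nearest-neighbour rule restricted to the sampled features) and all parameter names are assumptions chosen for brevity.

```python
import random
from collections import Counter

def train_rsm(X, y, n_models=5, subspace_frac=0.5, seed=0):
    """Random subspace method: each base model sees a random subset of
    feature indices; the instances themselves are left intact (unlike
    bagging, which resamples instances)."""
    rng = random.Random(seed)
    n_features = len(X[0])
    k = max(1, int(subspace_frac * n_features))
    models = []
    for _ in range(n_models):
        feats = rng.sample(range(n_features), k)  # sample a subspace
        models.append((feats, X, y))              # lazy 1-NN base learner
    return models

def predict_rsm(models, x):
    """Plurality vote over the base models' 1-NN predictions."""
    votes = []
    for feats, X, y in models:
        # squared Euclidean distance computed only in the chosen subspace
        dists = [sum((x[f] - xi[f]) ** 2 for f in feats) for xi in X]
        votes.append(y[dists.index(min(dists))])
    return Counter(votes).most_common(1)[0][0]
```

Because only the feature axes are resampled, every base model still trains on all instances, which is the point of contrast with bagging made in the quotation.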
“…Wu et al (2003) applied both stacking and voting to three learners. Munro et al (2003) employed both voting and bagging for combining classifiers.…”
Section: Learning Techniques
confidence: 99%
“…A majority vote of five systems (Chieu and Ng, 2003; Florian et al., 2003; Klein et al., 2003; McCallum and Li, 2003; Whitelaw and Patrick, 2003) performed best on the English development data. Another combination of five systems (Carreras et al., 2003b; Mayfield et al., 2003; McCallum and Li, 2003; Munro et al., 2003; Zhang and Johnson, 2003) obtained the best result for the German development data. We have performed a majority vote with these sets of systems on the related test sets and obtained F β=1 rates of 90.30 for English (14% error reduction compared with the best system) and 74.17 for German (6% error reduction).…”
Section: Performances
confidence: 99%
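The per-token majority vote described above can be sketched as follows. This is a minimal illustration, not the evaluation code from the quoted work; the tag values and the tie-breaking behaviour (first-seen tag wins among ties, as `Counter.most_common` preserves insertion order) are assumptions.

```python
from collections import Counter

def majority_vote(tag_sequences):
    """Combine per-token NE tags from several systems by plurality vote.

    tag_sequences: one list of tags per system, all the same length.
    Returns the combined tag sequence.
    """
    combined = []
    for token_tags in zip(*tag_sequences):
        # plurality vote over this token's tags; ties fall to the
        # first-seen tag (insertion order in Counter)
        combined.append(Counter(token_tags).most_common(1)[0][0])
    return combined
```

A sequence-level scheme would also need to repair any invalid tag transitions the vote produces (e.g. an I- tag with no preceding B-), which this sketch does not attempt.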
“…It is rooted in the theory of stochastic discrimination (Kleinberg 2000). The random subspace method has been successfully applied to different problems (Munro et al. 2003; Hall et al. 2003).…”
Section: Ensemble Construction Using an Artificial Immune System
confidence: 99%