Proceedings of the ACL-02 Conference on Empirical Methods in Natural Language Processing - EMNLP '02 2002
DOI: 10.3115/1118693.1118703
Kernel methods for relation extraction

Cited by 517 publications (681 citation statements).
References 20 publications.
“…into a vector space F, called the feature space, and searching for linear relations in the feature space. This embedding is defined implicitly, by specifying an inner product for the feature space via a symmetric and positive semidefinite kernel function: K(x, y) = ⟨Φ(x), Φ(y)⟩, where Φ(x) and Φ(y) are the embeddings of data items x and y [8].…”
Section: Kernel Methods
confidence: 99%
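The implicit-embedding idea in the excerpt above can be illustrated with a minimal sketch (the feature map Φ below is a standard textbook example, not one from the cited work; numpy assumed): the homogeneous polynomial kernel (x · y)² computes exactly the inner product of an explicit degree-2 monomial embedding, without ever constructing Φ(x).

```python
import numpy as np

def phi(x):
    # Explicit embedding into feature space F: maps a 2-d point
    # to its degree-2 monomials (x1^2, x2^2, sqrt(2)*x1*x2).
    x1, x2 = x
    return np.array([x1 * x1, x2 * x2, np.sqrt(2) * x1 * x2])

def kernel(x, y):
    # The same inner product computed implicitly:
    # K(x, y) = (x . y)^2 = <phi(x), phi(y)>.
    return np.dot(x, y) ** 2

x = np.array([1.0, 2.0])
y = np.array([3.0, 1.0])

explicit = np.dot(phi(x), phi(y))   # <phi(x), phi(y)>
implicit = kernel(x, y)             # K(x, y)
assert np.isclose(explicit, implicit)
```

The point of the kernel trick is that `kernel` costs a single dot product in the input space, while `phi` would grow combinatorially with the polynomial degree.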
“…, a function defined as a dot product of the corresponding feature vectors is necessarily a kernel function [8].…”
Section: Kernel Methods
confidence: 99%
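The claim in the excerpt above — a function defined as a dot product of feature vectors is necessarily a kernel — can be checked numerically in a small sketch (the feature map here is an illustrative assumption): the resulting Gram matrix is symmetric and positive semidefinite by construction, since G = ΦΦᵀ.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))   # five data items with arbitrary features

def phi(x):
    # Any explicit feature map works; here, the raw vector plus its squares.
    return np.concatenate([x, x * x])

# Gram matrix of pairwise dot products in feature space: G[i, j] = <phi(xi), phi(xj)>.
G = np.array([[np.dot(phi(a), phi(b)) for b in X] for a in X])

assert np.allclose(G, G.T)                      # symmetric
assert np.all(np.linalg.eigvalsh(G) >= -1e-9)   # positive semidefinite
```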
“…The question of whether to use a shallow or a deep level of text processing is determined by the experimental design and the machine learning algorithm. If shallow processing is sufficient, then there is no need to use deep processing (Zelenko et al, 2003).…”
Section: Literature Review
confidence: 99%
“…Subsequently, the generated patterns can be applied to new contexts with unknown relations to derive meaningful relations. Commonly used machine learning models include the support vector machine (SVM) (Bunescu & Mooney, 2007; Culotta & Sorensen, 2004; Zelenko et al, 2003), clustering (Agichtein & Gravano, 2000), undirected graphical models (Culotta et al, 2006), and decision trees (Nahm & Mooney, 2000).…”
Section: Literature Review
confidence: 99%
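The SVM-based setup these citations describe can be sketched as follows. The data and features are toy assumptions, and a plain polynomial kernel stands in for the structured (shallow-parse tree) kernels of the cited work; scikit-learn's support for precomputed Gram matrices is what makes arbitrary kernels usable.

```python
import numpy as np
from sklearn.svm import SVC

# Toy feature vectors for candidate entity pairs (hypothetical features,
# e.g. bag-of-words of the context between the two entities).
X_train = np.array([[1, 0, 1], [0, 1, 0], [1, 1, 1], [0, 0, 1]], dtype=float)
y_train = np.array([1, 0, 1, 0])   # 1 = relation holds, 0 = no relation

def gram(A, B):
    # Quadratic polynomial kernel (1 + a . b)^2, a stand-in for the
    # structured kernels used in kernel-based relation extraction.
    return (1.0 + A @ B.T) ** 2

# With kernel="precomputed", fit takes the n_train x n_train Gram matrix,
# and predict takes the n_test x n_train matrix of test/train kernel values.
clf = SVC(kernel="precomputed").fit(gram(X_train, X_train), y_train)
X_test = np.array([[1, 0, 1]], dtype=float)
pred = clf.predict(gram(X_test, X_train))
```

Swapping in a tree kernel only changes `gram`; the SVM machinery is unchanged, which is the practical appeal of the kernel-methods framing.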
“…They proposed a general language modeling method for quantifying the difficulty of information extraction by predicting performance on named entity recognition (locations, organizations, person names and miscellaneous named entities) and relation extraction (birth dates, death dates and invention names). Zelenko et al [24] proposed kernel methods with support vector machines (SVMs) for extracting person-affiliation and organization-location relations. Culotta et al [21] experimented on the Automatic Content Extraction (ACE) corpus using different features such as WordNet, parts of speech and entity types.…”
Section: Introduction
confidence: 99%