2014
DOI: 10.1109/tkde.2013.183
Motif-Based Hyponym Relation Extraction from Wikipedia Hyperlinks

Cited by 8 publications (4 citation statements)
References 20 publications
“…A conjunctive query on trees and relations can find sentences to satisfy specific semantic structure by relations (e.g. hyponym relations [37]) and corpora trees [41].…”
Section: Queries In Computational Linguistics
confidence: 99%
“…As shown in Fig. 8, Randgraf-4-2 has 2, 2, 2, 4, 2, and 6 ways to sample a 4-node CIS s including v in orbits 6, 9, 10, 12, 13, and 14 respectively. Each way happens with probability β…”
Section: Proof Of Theorem
confidence: 99%
“…In fact, the graphlet orbit degree signature has been successfully used for protein function prediction [2] and cancer gene identification [3] by identifying groups (or clusters) of topologically similar nodes in biological networks. In addition to biological networks, graphlet orbit degree is also used for link prediction [4] and node classification [5] in online social networks, and hyponym relation extraction from Wikipedia hyperlinks [6].…”
Section: Introduction
confidence: 99%
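The graphlet orbit degree mentioned in the statement above counts, for each node, how many induced subgraphs (graphlets) it touches in each automorphism orbit. A minimal pure-Python sketch for the three non-trivial 3-node orbits (in the standard numbering: orbit 1 = end of a 2-path, orbit 2 = middle of a 2-path, orbit 3 = triangle member); the graph and all names here are hypothetical illustrations, not the paper's implementation:

```python
from itertools import combinations

# Toy undirected graph as adjacency sets (hypothetical example).
adj = {
    0: {1, 2},
    1: {0, 2, 3},
    2: {0, 1},
    3: {1},
}

def orbit_degrees(adj):
    """Per-node counts [orbit 1, orbit 2, orbit 3] over 3-node graphlets:
    orbit 1 = end of a 2-path, orbit 2 = middle of a 2-path,
    orbit 3 = member of a triangle."""
    counts = {v: [0, 0, 0] for v in adj}
    for u, v, w in combinations(adj, 3):
        edges = (v in adj[u]) + (w in adj[u]) + (w in adj[v])
        if edges == 3:                       # triangle: all three nodes in orbit 3
            for x in (u, v, w):
                counts[x][2] += 1
        elif edges == 2:                     # 2-path: exactly one middle node
            for mid, ends in ((u, (v, w)), (v, (u, w)), (w, (u, v))):
                if all(e in adj[mid] for e in ends):
                    counts[mid][1] += 1      # middle node: orbit 2
                    for e in ends:
                        counts[e][0] += 1    # end nodes: orbit 1
                    break
    return counts

print(orbit_degrees(adj))
```

Comparing such orbit-degree vectors between nodes is what underlies the "topologically similar nodes" clustering the statement refers to; the enumeration here is brute force and only meant to make the definition concrete.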
“…ii) The accuracy of free-text taxonomies is usually lower than that of many Wikipedia-based taxonomies because it is difficult to extract knowledge completely from texts; iii) The task of taxonomy learning is still insufficiently studied a) in emerging and specific domains and b) for non-English or under-resourced languages (Wei et al., 2014; Alfarone and Davis, 2015; …).…”
Section: Introduction
confidence: 99%