Proceedings of the International Conference on Web Intelligence 2017
DOI: 10.1145/3106426.3106465

Large-scale taxonomy induction using entity and word embeddings

Abstract: Taxonomies are an important ingredient of knowledge organization, and serve as a backbone for more sophisticated knowledge representations in intelligent systems, such as formal ontologies. However, building taxonomies manually is a costly endeavor, and hence, automatic methods for taxonomy induction are a good alternative to build large-scale taxonomies. In this paper, we propose TIEmb, an approach for automatic unsupervised class subsumption axiom extraction from knowledge bases using entity and text embeddi…

Cited by 13 publications (9 citation statements) | References 22 publications
“…Other techniques have been proposed that use large knowledge graphs to build taxonomies [40]. Ristoski et al. [39] combine class labels with vector space embeddings to build the taxonomy. Their method is based on the assumption that instances of a more specific class should be positioned closer to each other on average than instances of a broader class.…”
Section: Distribution-based Hypernym Derivation
confidence: 99%
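The positioning assumption quoted above can be sketched in a few lines: compute the mean pairwise distance among instance embeddings of each class, and treat the class with the tighter cluster as the more specific one (the subclass candidate). This is only an illustration of the idea, not the paper's implementation; the class names and 2-D vectors are made up for the example.

```python
from itertools import combinations
from math import dist

# Toy 2-D "embeddings" (hypothetical values, for illustration only).
# Instances of the specific class ("Scientist") are tightly clustered;
# instances of the broader class ("Person") are more spread out.
scientist = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1)]
person = [(0.0, 0.0), (2.0, 0.0), (0.0, 2.0), (2.0, 2.0)]

def mean_pairwise_distance(vectors):
    """Average Euclidean distance over all instance pairs of a class."""
    pairs = list(combinations(vectors, 2))
    return sum(dist(a, b) for a, b in pairs) / len(pairs)

d_scientist = mean_pairwise_distance(scientist)
d_person = mean_pairwise_distance(person)

# Under the assumption, the class whose instances lie closer together
# on average is the candidate subclass in a subsumption pair.
more_specific = "Scientist" if d_scientist < d_person else "Person"
```

With these toy vectors, `more_specific` comes out as `"Scientist"`, matching the intuition that a subclass occupies a tighter region of the embedding space than its superclass.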
“…Taxonomies generated in this way are mostly built for a given text: the algorithm first identifies the key concepts of the text and then finds hierarchical relationships between these concepts. Similar word contexts are used to aid the discovery of relations between concepts [39]. Finally, the concepts are arranged into a hierarchical structure.…”
Section: Introduction
confidence: 99%
“…Recent studies such as Ristoski et al. (2017), Cimiano (2016), and Anh et al. (2017) propose new techniques for building taxonomies automatically from English text. However, the methods and techniques developed focus solely on English-language text.…”
Section: Review of Studies on Taxonomy Learning from Text
“…While such vector spaces capture general semantic relatedness, their well-known limitation is the inability to indicate the exact nature of the semantic relation that holds between words. Yet, the ability to recognize the exact semantic relation between words is crucial for many NLP applications: taxonomy induction (Fu et al., 2014; Ristoski et al., 2017), natural language inference (Tatu and Moldovan, 2005; Chen et al., 2017), text simplification (Glavaš and Štajner, 2015), and paraphrase generation (Madnani and Dorr, 2010), to name a few. This is why numerous methods have been proposed that either (1) specialize distributional vectors to better reflect a particular relation (most commonly synonymy) (Faruqui et al., 2015; Kiela et al., 2015) or (2) train supervised relation classifiers using lexico-semantic relations (i.e., labeled word pairs) from external resources such as WordNet (Fellbaum, 1998) as training data (Baroni et al., 2012; Roller et al., 2014; Glavaš and Ponzetto, 2017).…”
Section: Introduction
confidence: 99%
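The second strategy mentioned above (training a supervised relation classifier on labeled word pairs) can be sketched minimally: represent each candidate pair by the offset (difference) of its word embeddings and fit a linear classifier on pairs labeled for hypernymy. This is only an illustrative toy, not any of the cited systems; the embeddings, word list, and perceptron choice are all assumptions made for the example.

```python
# Hypothetical toy 2-D embeddings; real systems use pretrained vectors.
emb = {
    "animal": (1.0, 1.0),
    "dog":    (1.2, 0.4),
    "cat":    (1.1, 0.5),
    "color":  (-1.0, 1.0),
    "red":    (-0.8, 0.4),
    "blue":   (-0.9, 0.5),
}

# Labeled word pairs: 1 = the second word is a hypernym of the first, 0 = not.
train = [
    (("dog", "animal"), 1),
    (("cat", "animal"), 1),
    (("red", "color"), 1),
    (("animal", "dog"), 0),
    (("animal", "cat"), 0),
    (("color", "red"), 0),
]

def features(pair):
    """Offset vector from candidate hyponym to candidate hypernym, plus bias."""
    (x1, y1), (x2, y2) = emb[pair[0]], emb[pair[1]]
    return (x2 - x1, y2 - y1, 1.0)

# Plain perceptron training loop over the labeled pairs.
w = [0.0, 0.0, 0.0]
for _ in range(50):
    for pair, label in train:
        f = features(pair)
        pred = 1 if sum(wi * fi for wi, fi in zip(w, f)) > 0 else 0
        if pred != label:
            delta = label - pred  # +1 or -1
            w = [wi + delta * fi for wi, fi in zip(w, f)]

def is_hypernym(pair):
    """Predict whether pair[1] is a hypernym of pair[0]."""
    return sum(wi * fi for wi, fi in zip(w, features(pair))) > 0
```

On this toy data the classifier generalizes to an unseen pair in the same pattern, e.g. `is_hypernym(("blue", "color"))` returns `True` while the reversed pair returns `False`, which is the direction-sensitivity that plain relatedness scores lack.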