2017
DOI: 10.1007/978-3-319-68288-4_41

Towards Holistic Concept Representations: Embedding Relational Knowledge, Visual Attributes, and Distributional Word Semantics

Cited by 26 publications (19 citation statements)
References 25 publications

“…A high purity means that resources with similar type vectors (e.g., presidents who are also authors) are located close to each other in the embedding space, which is a wanted characteristic of a KGE. In our second evaluation, we performed a type prediction experiment in a manner akin to [10,17]. For each resource x ∈ S, we used the µ closest embeddings of x to predict x's type vector.…”
Section: Time Complexity (mentioning)
confidence: 99%
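
The type-prediction experiment quoted above (predicting a resource's type vector from its µ closest embeddings) can be sketched as a simple nearest-neighbour vote. This is a minimal illustration, assuming hypothetical arrays `emb` (resource embeddings) and `types` (binary type vectors); the names, distance metric, and thresholding rule are assumptions, not taken from the cited papers.

```python
# Minimal sketch: predict a resource's type vector from its mu closest
# embeddings. `emb`, `types`, and the 0.5 threshold are illustrative
# assumptions, not the exact procedure of the cited papers.
import numpy as np

def predict_type_vector(emb, types, idx, mu=5):
    """Predict the type vector of resource `idx` via its mu nearest neighbours."""
    dists = np.linalg.norm(emb - emb[idx], axis=1)  # Euclidean distances
    dists[idx] = np.inf                             # exclude the resource itself
    neighbours = np.argsort(dists)[:mu]             # mu closest embeddings
    # Average the neighbours' binary type vectors and threshold at 0.5.
    return (types[neighbours].mean(axis=0) >= 0.5).astype(int)

# Toy usage with random data: 100 resources, 16-dim embeddings, 8 types.
rng = np.random.default_rng(0)
emb = rng.normal(size=(100, 16))
types = rng.integers(0, 2, size=(100, 8))
print(predict_type_vector(emb, types, idx=0, mu=5))
```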
“…also fused text-based representations with image-based representations (Bruni et al, 2014; Lazaridou et al, 2015; Chrupala et al, 2015; Mao et al, 2016; Silberer et al, 2017; Collell et al, 2017; Zablocki et al, 2018) and representations derived from a knowledge-graph (Thoma et al, 2017). More recently, gating-based approaches have been developed for fusing traditional word embeddings with visual representations.…”
Section: Related Work (mentioning)
confidence: 99%
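
As a rough illustration of the gating-based fusion mentioned in the excerpt above, the sketch below blends a word embedding with a visual embedding through an element-wise sigmoid gate. The parameter names (`W_g`, `b_g`), the shared dimensionality, and the convex-combination form are assumptions for illustration, not the formulation of any specific cited approach.

```python
# Minimal sketch of gating-based fusion of a text embedding with a visual
# embedding. Shapes and parameters are illustrative assumptions.
import numpy as np

def gated_fusion(w_text, w_visual, W_g, b_g):
    """Blend two modality vectors with an element-wise learned gate."""
    z = np.concatenate([w_text, w_visual])          # joint input to the gate
    gate = 1.0 / (1.0 + np.exp(-(W_g @ z + b_g)))   # sigmoid gate in (0, 1)
    # Per-dimension convex combination of the two modalities.
    return gate * w_text + (1.0 - gate) * w_visual

# Toy usage with random 8-dimensional embeddings and an untrained gate.
rng = np.random.default_rng(0)
d = 8
w_text, w_visual = rng.normal(size=d), rng.normal(size=d)
W_g, b_g = rng.normal(size=(d, 2 * d)), np.zeros(d)
print(gated_fusion(w_text, w_visual, W_g, b_g))
```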
“…There are notable examples showcasing the influence of neural approaches to knowledge acquisition and representation learning on the broad area of Semantic Web technologies. These include, among others, ontology learning [40,49,65], learning structured query languages from natural language [69], ontology alignment [20,28,35,52], ontology annotation [15,58], joined relational and multi-modal knowledge representations [62], and relation prediction [1,59]. Ontologies, on the other hand, have been repeatedly utilized as background knowledge for machine learning tasks.…”
Section: Introduction (mentioning)
confidence: 99%
“…As an example, there is a myriad of hybrid approaches for learning linguistic representations by jointly incorporating corpus-based evidence and semantic resources [13,25,27,33,50]. This interplay between structured knowledge and corpus-based approaches has given way to knowledge graph embeddings, which in turn have proven useful for tasks such as hypernym discovery [21], collocation discovery and classification [22], word sense disambiguation [12,54], joined relational and multi-modal knowledge representations [62] and many others.…”
Section: Introduction (mentioning)
confidence: 99%