Proceedings of the 1st Workshop on Sense, Concept and Entity Representations and Their Applications 2017
DOI: 10.18653/v1/w17-1901
Compositional Semantics using Feature-Based Models from WordNet

Abstract: This article describes a method to build semantic representations of composite expressions in a compositional way by using WordNet relations to represent the meaning of words. The meaning of a target word is modelled as a vector in which its semantically related words are assigned weights according to both the type of the relationship and the distance to the target word. Word vectors are compositionally combined by syntactic dependencies. Each syntactic dependency triggers two complementary compositional funct…
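The vector-building and composition steps summarised in the abstract can be pictured with a short sketch. The relation weights, the distance decay, and the intersective composition function below are illustrative assumptions, not the parameters or functions used in the paper.

```python
# Illustrative sketch (not the authors' code): build a feature vector for a word
# whose dimensions are WordNet-related lemmas, weighted by relation type and
# graph distance, then combine two vectors through a syntactic dependency.
from collections import defaultdict
from nltk.corpus import wordnet as wn

# Assumed relation weights and distance decay; the paper tunes these differently.
RELATION_WEIGHTS = {"synonym": 1.0, "hypernym": 0.8, "hyponym": 0.6}
DECAY = 0.5  # weight multiplier per extra step away from the target word


def wordnet_vector(word, pos=wn.NOUN, depth=2):
    """Return {related_lemma: weight} built from WordNet relations of `word`."""
    vector = defaultdict(float)
    for synset in wn.synsets(word, pos=pos):
        # Synonyms: lemmas sharing the synset (distance 0).
        for lemma in synset.lemma_names():
            if lemma != word:
                vector[lemma] += RELATION_WEIGHTS["synonym"]
        # Hypernyms and hyponyms up to `depth` steps away, decayed by distance.
        frontier = [(synset, 0)]
        while frontier:
            node, dist = frontier.pop()
            if dist >= depth:
                continue
            for rel, related in (("hypernym", node.hypernyms()),
                                 ("hyponym", node.hyponyms())):
                for other in related:
                    weight = RELATION_WEIGHTS[rel] * (DECAY ** dist)
                    for lemma in other.lemma_names():
                        vector[lemma] += weight
                    frontier.append((other, dist + 1))
    return dict(vector)


def compose(head_vec, dep_vec):
    """One possible compositional function for a dependency: restrict the head
    by the dependent via a component-wise product over shared features."""
    return {f: head_vec[f] * dep_vec[f] for f in head_vec.keys() & dep_vec.keys()}


# e.g. an adjective-noun dependency such as "red car":
# compose(wordnet_vector("car"), wordnet_vector("red", pos=wn.ADJ))
```

The paper defines two complementary functions per dependency (one updating the head, one the dependent); the single intersective product above only illustrates the general shape of such a function.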

Cited by 5 publications (5 citation statements)
References 31 publications (51 reference statements)
“…Different works have tried to introduce sense representations in the context of compositionality (Köper & im Walde, 2017;Kober, Weeds, Wilkie, Reffin, & Weir, 2017), with different degrees of success. The main idea is to select the intended sense of a word and only introduce that specific meaning into the composition, either through context-based sense induction (Thater, Fürstenau, & Pinkal, 2011), exemplar-based representation (Reddy, Klapaftis, McCarthy, & Manandhar, 2011), or with the help of external resources, such as WordNet (Gamallo & Pereira-Fariña, 2017). An example of the first type of approach can be found in Cheng and Kartsaklis (2015), where a recurrent neural network in which word embeddings were split into multiple sense vectors was proposed.…”
Section: Compositionality (mentioning; confidence: 99%)
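The sense-selection step this excerpt describes (introducing only the intended sense into the composition) can be sketched as follows. The use of simplified Lesk and the most-frequent-sense fallback are assumptions for illustration, not the methods of the cited papers.

```python
# Minimal sketch (assumption, not from the cited papers): pick one WordNet sense
# for a word in context, then expose only that sense's features to composition.
from nltk.corpus import wordnet as wn
from nltk.wsd import lesk


def sense_vector(word, context_tokens, pos=None):
    """Disambiguate `word` in `context_tokens` with simplified Lesk and return
    a bag-of-features vector restricted to that single sense."""
    synset = lesk(context_tokens, word, pos=pos)
    if synset is None:  # fall back to the most frequent sense
        candidates = wn.synsets(word, pos=pos)
        if not candidates:
            return {}
        synset = candidates[0]
    features = set(synset.lemma_names())
    for hyper in synset.hypernyms():
        features.update(hyper.lemma_names())
    return {f: 1.0 for f in features}


tokens = "the bank approved the loan".split()
print(sense_vector("bank", tokens, pos=wn.NOUN))
```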
“…In current work, we are defining richer semantic word models by combining WordNet features with semantic spaces based on distributional contexts (Gamallo and Pereira-Fariña, 2017). This hybrid method might also help overcome scarcity.…”
Section: Discussion (mentioning; confidence: 99%)
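The hybrid idea mentioned here, combining WordNet features with distributional contexts, can be illustrated with a small sketch. The L2 normalisation and the mixing weight `alpha` are assumptions, not the combination actually used in the follow-up work.

```python
# Hedged sketch of a hybrid vector: merge a sparse WordNet feature vector with a
# distributional (corpus-based) vector over the same feature space.
import math


def l2_normalise(vec):
    """Scale a sparse {feature: weight} vector to unit length."""
    norm = math.sqrt(sum(v * v for v in vec.values())) or 1.0
    return {k: v / norm for k, v in vec.items()}


def hybrid_vector(wordnet_vec, distributional_vec, alpha=0.5):
    """Weighted sum of the two normalised vectors; WordNet features can back up
    words whose distributional contexts are too scarce to estimate reliably."""
    a, b = l2_normalise(wordnet_vec), l2_normalise(distributional_vec)
    features = set(a) | set(b)
    return {f: alpha * a.get(f, 0.0) + (1 - alpha) * b.get(f, 0.0)
            for f in features}
```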
“…The resultant ontology of the proposed approach appeared to be reliable and was also verified by software engineers. Gamallo and Pereira-Fariña [20] used the WordNet knowledge structure for OL. Different WordNet relation types, such as synsets and hypernyms, are exploited to learn the vocabulary of the ontology.…”
Section: B. Knowledge-Based OL Techniques (mentioning; confidence: 99%)
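How synsets and hypernym chains can seed an ontology's vocabulary and is-a backbone, as this survey excerpt describes, is sketched below. Restricting to the first (most frequent) sense and a single hypernym path is a simplifying assumption.

```python
# Illustrative sketch only: extract candidate concepts and is-a relations for
# ontology learning from WordNet synsets and hypernyms.
from nltk.corpus import wordnet as wn


def taxonomy_fragment(term, pos=wn.NOUN):
    """Return (concept_lemmas, is_a_pairs) from the first WordNet sense of `term`."""
    synsets = wn.synsets(term, pos=pos)
    if not synsets:
        return set(), set()
    path = synsets[0].hypernym_paths()[0]  # one root-to-sense hypernym chain
    concepts = {lemma for synset in path for lemma in synset.lemma_names()}
    # hypernym_paths() is ordered root -> sense, so each parent precedes its child
    is_a = {(child.name(), parent.name()) for parent, child in zip(path, path[1:])}
    return concepts, is_a


print(taxonomy_fragment("software"))
```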