Proceedings of the 19th International Conference on Computational Linguistics - 2002
DOI: 10.3115/1072228.1072289
Antonymy and conceptual vectors

Abstract: For meaning representations in NLP, we focus our attention on thematic aspects and conceptual vectors. The learning strategy for conceptual vectors relies on a morphosyntactic analysis of human-usage dictionary definitions linked to vector propagation. This analysis does not currently take negation phenomena into account. This work studies the antonymy aspects of negation, with the larger goal of integrating it into the thematic analysis. We present a model based on the idea of symmetry compatible with co…
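The abstract frames antonymy as a symmetry operation over conceptual vectors. As a purely illustrative sketch, not the authors' actual anti-functions (which are defined over concept components and distinguish several antonymy types, such as the 'complementary' antonymy quoted below), an antonym vector can be pictured as the reflection of a vector about a reference axis; the names `anti_by_symmetry` and `axis` are hypothetical:

```python
import numpy as np

def normalize(v):
    """Unit-normalize; conceptual vectors are compared by direction rather than magnitude."""
    n = np.linalg.norm(v)
    return v / n if n > 0 else v

def anti_by_symmetry(v, axis):
    """Hypothetical antonym operator: reflect v about a reference axis.

    A plain Householder-style reflection, used only to convey the 'symmetry'
    idea from the abstract, not the paper's actual formulation.
    """
    a = normalize(axis)
    return 2.0 * np.dot(v, a) * a - v
```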

Cited by 11 publications (17 citation statements); references 1 publication.
“…It has been shown previously [7,16-19] that a conceptual vector can be improved by the effects of lexical functions, and also that the lexical function data themselves are significantly enhanced by the use of lexical information and of the corresponding vectors. Hence, the two processes of learning of lexical functions and the construction of new vectors are mutually benefiting each other, each feeding their outputs into the inputs of the other.…”
Section: Towards a Society of Agents
confidence: 99%
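The quoted passage describes a feedback loop: vectors help select and refine lexical-function data, and the refined relations in turn reshape the vectors. A schematic sketch of that alternation, assuming plain dense vectors and using synonymy only; the function, threshold, and update rule below are illustrative, not part of the cited systems:

```python
import numpy as np

def refine(vectors, candidate_synonyms, n_rounds=3, lr=0.5, threshold=0.3):
    """Alternate between (1) filtering relation candidates with the current vectors
    and (2) nudging the vectors of retained pairs toward each other. Schematic only."""
    kept = []
    for _ in range(n_rounds):
        # Step 1: keep candidate synonym pairs that the current vectors already support.
        kept = [(a, b) for a, b in candidate_synonyms
                if np.dot(vectors[a], vectors[b])
                / (np.linalg.norm(vectors[a]) * np.linalg.norm(vectors[b])) > threshold]
        # Step 2: pull the vectors of retained pairs toward their midpoint.
        for a, b in kept:
            mid = (vectors[a] + vectors[b]) / 2.0
            vectors[a] = vectors[a] + lr * (mid - vectors[a])
            vectors[b] = vectors[b] + lr * (mid - vectors[b])
    return vectors, kept

# Toy usage with made-up vectors:
vecs, pairs = refine({"cold": np.array([1.0, 0.2]), "chilly": np.array([0.8, 0.4])},
                     [("cold", "chilly")])
```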
“…How can one introduce synonymy and antonymy into the geometrical framework outlined above, in which only the notion of dissimilarity is initially assumed to be represented by distances? To the best of our knowledge, this is an open problem [13]. Another issue concerns the semantics of the coordinates of the ambient space in which the map image lives. A recognized limitation of LSA and related approaches is that semantic dimensions cannot be clearly identified: “The typical feature of LSA is that dimensions are latent.…”
Section: Kinds and Properties of Semantic Cognitive Maps
“…As an example, we present only the 'complementary' antonymy proposed by [20]: the same method is used for the other types. Complementary antonyms are couples like event/unevent, presence/absence.…”
Section: Two Lexical Items Are in Antonymy Relation If There Is a Sym…
“…The vector approach is completely at the opposite. Offering thematic association very easily, it allows many fine-grained synonymy [7] and antonymy [20] functions to be defined and implemented, but is unable to differentiate or to validate the existence of hyperonymous relations.…”
Section: Introduction
confidence: 99%
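The passage above credits the vector approach with easy thematic association and fine-grained synonymy and antonymy functions, while noting that it cannot establish hyperonymy. A minimal sketch of the kind of thematic measure involved, assuming ordinary dense vectors rather than the paper's dictionary-learned conceptual vectors:

```python
import numpy as np

def angular_distance(u, v):
    """Thematic distance between two vectors, as an angle in [0, pi];
    small angles mean thematically close terms (candidate synonyms)."""
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

# Note: the angle alone says nothing about hyperonymy (e.g. 'animal' vs. 'dog'),
# which is exactly the limitation the quoted passage points out; an antonymy
# function would additionally test the symmetry discussed in the paper.
print(angular_distance(np.array([1.0, 0.0]), np.array([0.0, 1.0])))  # ~1.57 rad (pi/2)
```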