Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021
DOI: 10.18653/v1/2021.findings-acl.296

Representing Syntax and Composition with Geometric Transformations

Abstract: The exploitation of syntactic graphs (SyGs) as a word's context has been shown to be beneficial for distributional semantic models (DSMs), both at the level of individual word representations and in deriving phrasal representations via composition. However, notwithstanding the potential performance benefit, the syntactically-aware DSMs proposed to date have huge numbers of parameters (compared to conventional DSMs) and suffer from data sparsity. Furthermore, the encoding of the SyG links (i.e., the syntactic r…
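The abstract is cut off before the paper's own proposal, but the title indicates that syntactic links are encoded as geometric transformations applied to word vectors before composition. The following is a minimal, purely illustrative sketch of that general idea, not the paper's method: the toy vocabulary, random vectors, relation inventory, and the choice of a translation (shift) as the transformation are all assumptions made here for illustration.

```python
import numpy as np

# Toy setup: random vectors stand in for embeddings a real DSM would learn.
dim = 4
rng = np.random.default_rng(0)
word_vec = {w: rng.normal(size=dim) for w in ["dog", "barks", "loudly"]}

# Hypothetical relation inventory: each syntactic link gets its own geometric
# transformation, here a simple translation of the dependent's vector.
rel_shift = {"nsubj": rng.normal(size=dim), "advmod": rng.normal(size=dim)}

def transform(dependent: str, relation: str) -> np.ndarray:
    """Apply the relation-specific transformation to the dependent's vector."""
    return word_vec[dependent] + rel_shift[relation]

# Compose the head with its transformed dependents (additively, for illustration)
# to derive a phrase representation for "dog barks loudly".
phrase = word_vec["barks"] + transform("dog", "nsubj") + transform("loudly", "advmod")
print(phrase.round(3))
```

Because each relation here contributes only a dim-sized shift rather than a full matrix, a scheme of this shape keeps the per-relation parameter count small, which is the kind of saving the abstract contrasts with earlier syntactically-aware DSMs.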

Cited by 1 publication (1 citation statement)
References: 31 publications
“…To allow Skip-Gram to use only one vector per word, Zobnin and Elistratova (2019) propose using an indefinite inner product, which corresponds to T in (9) being a diagonal matrix of 1s and −1s. In a similar vein, Bertolini et al. (2021) propose a more radical simplification of the Dependency Matrix model, which uses matrices that are non-zero only on the diagonal and off-diagonal.…”
mentioning
confidence: 99%
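The two simplifications named in this citation statement can be made concrete with a short sketch. The NumPy snippet below is illustrative only and is not taken from either cited paper: T is a fixed diagonal matrix of +1s and −1s implementing an indefinite inner product, and the relation matrix is restricted to its main diagonal plus, on one reading of "off-diagonal", the first super-diagonal. The dimensionality, random values, and helper names are assumptions.

```python
import numpy as np

dim = 6
rng = np.random.default_rng(1)

# Indefinite inner product: score(w, c) = w @ T @ c with T a fixed diagonal
# matrix of +1s and -1s (the role attributed to T in equation (9) above).
T = np.diag(np.array([1, 1, 1, -1, -1, -1], dtype=float))

def indefinite_score(w: np.ndarray, c: np.ndarray) -> float:
    return float(w @ T @ c)

# Restricted relation matrix: parameters only on the main diagonal and the
# first super-diagonal; every other entry stays zero.
def banded_relation_matrix(diag: np.ndarray, off_diag: np.ndarray) -> np.ndarray:
    return np.diag(diag) + np.diag(off_diag, k=1)

w = rng.normal(size=dim)
c = rng.normal(size=dim)
R = banded_relation_matrix(rng.normal(size=dim), rng.normal(size=dim - 1))

print(indefinite_score(w, c))  # bilinear score under signature (+, +, +, -, -, -)
print(float(w @ R @ c))        # score through the sparsity-restricted relation matrix
```

Compared with a full dim × dim matrix per relation, this restriction keeps roughly 2·dim parameters per relation, which is the kind of reduction in parameters and data sparsity that the abstract raises as the motivation for simplifying syntactically-aware DSMs.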