2021
DOI: 10.1037/xlm0001044

The semantics-syntax interface: Learning grammatical categories and hierarchical syntactic structure through semantics.

Abstract: Language is infinitely productive because syntax defines dependencies between grammatical categories of words and constituents, so there is interchangeability of these words and constituents within syntactic structures. Previous laboratory-based studies of language learning have shown that complex language structures like hierarchical center embeddings (HCE) are very hard to learn, but these studies tend to simplify the language learning task, omitting semantics and focusing either on learning dependencies bet…
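The abstract refers to hierarchical center embeddings (HCE). As a rough illustration of what such structures look like, the toy Python sketch below generates nested noun-verb dependencies of increasing depth; the vocabulary and rules are invented for illustration and are not the stimuli used in the study.

```python
# Illustrative sketch only: a toy recursive pattern producing hierarchical
# center embeddings (HCE) of the kind studied in artificial-language-learning
# experiments. Vocabulary and rules are invented, not taken from the paper.
import random

NOUNS = ["dog", "cat", "bird"]
VERBS = ["chases", "sees", "hears"]

def center_embedded(depth):
    """Return a sentence with `depth` levels of center embedding.

    depth=1 -> "the dog runs"
    depth=2 -> "the dog the cat chases runs"  (N1 N2 V2 V1 dependency pattern)
    """
    nouns = [random.choice(NOUNS) for _ in range(depth)]
    verbs = [random.choice(VERBS) for _ in range(depth - 1)]
    # Nouns appear in order 1..n, then verbs in reverse order n-1..1,
    # so each inner noun-verb dependency is nested inside the outer one.
    words = []
    for n in nouns:
        words += ["the", n]
    for v in reversed(verbs):
        words.append(v)
    words.append("runs")
    return " ".join(words)

if __name__ == "__main__":
    for d in (1, 2, 3):
        print(d, "->", center_embedded(d))
```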

Cited by 2 publications (2 citation statements)
References 59 publications
“…Presently, it does not reflect the hierarchical nature of sentence structures. This means that, to encode sentences like “the dog that chases the cat runs”, the current model needs to duplicate all its syntactic rules for the subordinate clause ( 8 ), and the model’s linear order will not explain rules of agreement ( 9 ). Nor is the model able to simultaneously assign the same word to two different roles, in the example “the little star is beside a big star”.…”
Section: Discussion (classification: mentioning)
confidence: 99%
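The statement above contrasts a linear-order encoding with hierarchical structure. A minimal sketch of the difference, using an invented toy grammar rather than the cited model, is given below: a single recursive NP rule covers the subordinate clause in "the dog that chases the cat runs" without duplication, whereas a flat encoding must copy its rules at every level of embedding.

```python
# Hedged sketch (not the cited model): why a recursive rule avoids duplicating
# syntax for the subordinate clause in "the dog that chases the cat runs".
# With recursion, one NP rule is reused in both the main and embedded clause.
recursive_grammar = {
    "S":  [["NP", "VP"]],
    "NP": [["Det", "N"],
           ["Det", "N", "RelClause"]],       # same NP category reused at any depth
    "RelClause": [["that", "VP_trans"]],
    "VP": [["runs"]],
    "VP_trans": [["chases", "NP"]],
    "Det": [["the"]],
    "N":  [["dog"], ["cat"]],
}

# A purely linear (flat) encoding has no shared NP category, so the rules for
# the subordinate clause must be spelled out again at each level of embedding:
flat_rules = [
    ["the", "N", "runs"],                                   # main clause only
    ["the", "N", "that", "chases", "the", "N", "runs"],     # one embedding
    # ... every additional depth of embedding needs yet another copied rule
]

def expand(symbol, grammar):
    """Left-most expansion of `symbol`, always taking the first rule."""
    if symbol not in grammar:
        return [symbol]                      # terminal word
    out = []
    for child in grammar[symbol][0]:
        out += expand(child, grammar)
    return out

# "the dog runs" derived from the single recursive rule set:
print(" ".join(expand("S", recursive_grammar)))
```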
“…We hypothesize that working memory links each word to its role in the sentence, and those roles follow probabilistic orderings. Although this cannot account for true grammar ( 8 ), since it does not support hierarchical structure ( 9 ), it provides a biologically plausible way to understand some aspects of syntax. In this paper we propose a neural architecture that stores and generates syntactic sequences.…”
Section: Main Text (classification: mentioning)
confidence: 99%