Distributed Code for Semantic Relations Predicts Neural Similarity
Preprint, 2019
DOI: 10.1101/596726

Abstract: The ability to generate and process semantic relations is central to many aspects of human cognition. Theorists have long debated whether such relations are coded as atomistic links in a semantic network, or as distributed patterns over some core set of abstract relations. The form and content of the conceptual and neural representations of semantic relations remain to be empirically established. The present study combined computational modeling and neuroimaging to investigate the representation and comparison…

Cited by 1 publication (3 citation statements) · References 39 publications
“…A recent model of relational learning and reasoning, the Bayesian Analogy with Relational Transformations model (BART; Chiang et al., 2019; Lu et al., 2012, 2019b), implements both distributed role-based representations, as in Hummel and Holyoak's (1997) LISA model, and distributed representations of relational features, as in Chaffin's (1992) relational element theory. BART learns a semantic relation from exemplars of that relation: it associates a relational vector with the two feature vectors representing the paired words by learning a set of relational weights that connects them (Figure 3).…”
Section: Relational vs. Role Similarity
confidence: 99%
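The idea of learning relational weights that connect a pair of word feature vectors to a relation can be illustrated with a minimal sketch. This is not BART's actual formulation (BART uses a Bayesian approach; see Lu et al., 2012): the toy data, dimensions, and the plain logistic-regression learner below are all assumptions made purely for illustration.

```python
import numpy as np

# Hypothetical sketch: learn relational weights w that score how well a
# word pair (a, b), represented by concatenated feature vectors, instantiates
# a relation. All dimensions and data are invented for illustration.
rng = np.random.default_rng(0)
dim = 8                       # assumed feature dimension per word
n_pos, n_neg = 50, 50         # exemplars and non-exemplars of the relation

# Toy relation: b exceeds a on the first feature.
A = rng.normal(size=(n_pos + n_neg, dim))
B = A.copy()
B[:n_pos, 0] += 1.0           # positive pairs satisfy the relation
B[n_pos:, 0] -= 1.0           # negative pairs violate it
X = np.hstack([A, B])         # concatenated pair representation
y = np.concatenate([np.ones(n_pos), np.zeros(n_neg)])

# Learn relational weights by gradient descent on the logistic loss.
w = np.zeros(2 * dim)
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w -= 0.1 * X.T @ (p - y) / len(y)

# The learned weight vector acts as a distributed code for the relation:
# it scores whether a novel pair instantiates it.
a_new, b_new = np.zeros(dim), np.zeros(dim)
b_new[0] = 2.0                # b exceeds a on the relevant feature
score = 1.0 / (1.0 + np.exp(-np.concatenate([a_new, b_new]) @ w))
print(score > 0.5)
```

In this sketch the relation is "explicit" in the sense the surrounding text discusses: the weight vector `w` exists independently of any particular word pair and can be applied to new pairs.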
“…While there have been theoretical debates concerning whether individual relations are represented explicitly, i.e., independently of the concepts they connect (Doumas & Hummel, 2005; for a review, see Popov, Hristova, & Anders, 2017), empirical research has only recently begun to address these questions systematically (Asmuth & Gentner, 2016; Chiang, Peng, Lu, Holyoak, & Monti, 2019; Estes & Jones, 2006; Gagné, Spalding, & Ji, 2005; Popov et al., 2017).…”
Section: Introduction
confidence: 99%