2022
DOI: 10.1101/2022.10.10.511448
Preprint

Emergent neural dynamics and geometry for generalization in a transitive inference task

Abstract: The ability to make inferences using abstract rules and relations has long been understood to be a hallmark of human intelligence, as evidenced in logic, mathematics, and language. Intriguingly, modern work in animal cognition has established that this ability is evolutionarily widespread, indicating an ancient and possibly foundational role in natural intelligence. Despite this importance, it remains an open question how inference using abstract rules is implemented in the brain - possibly due to a lack of co…

Cited by 3 publications (4 citation statements, published 2023–2024). References 170 publications (236 reference statements).

Citation statements (ordered by relevance):
“…With respect to similarity-based relational learning models, our account could be seen as an endpoint to a series of investigations of TI behavior: expanding upon previous studies [30,53,54], we show comprehensively that the principle of norm minimization enables any model with partly conjunctive representations to generalize transitively and further gives rise to naturalistic behavior on TI. Importantly, norm minimization is implicated not only in gradient flow and ridge regression (the examples we consider), but also in a much broader range of learning models [129], including reinforcement learning [130].…”
Section: Discussion
confidence: 70%
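The norm-minimization claim quoted above can be made concrete with a small, self-contained sketch (illustrative only, not code from the cited work): a minimum-norm least-squares readout is fit to adjacent premise pairs under a partly conjunctive pair encoding, then tested on the held-out non-adjacent pairs. The 7-item list, the feature layout, and the ±1 labels are assumptions made for the example.

```python
# Illustrative sketch of norm minimization with a partly conjunctive
# pair code (assumed encoding; not taken from the cited paper).
import numpy as np
from itertools import permutations

n_items = 7  # hypothetical ordered list A > B > ... > G

def pair_features(i, j):
    # additive part: +1 for the left item, -1 for the right item
    additive = np.zeros(n_items)
    additive[i], additive[j] = 1.0, -1.0
    # conjunctive part: one unit per ordered pair, active only for (i, j)
    conjunctive = np.zeros(n_items * n_items)
    conjunctive[i * n_items + j] = 1.0
    return np.concatenate([additive, conjunctive])

# training set: adjacent premise pairs, shown in both orders
X, y = [], []
for i in range(n_items - 1):
    X.append(pair_features(i, i + 1)); y.append(+1.0)  # left item ranks higher
    X.append(pair_features(i + 1, i)); y.append(-1.0)  # left item ranks lower
X, y = np.array(X), np.array(y)

# minimum-norm least-squares readout (pseudoinverse): the zero-regularization
# limit of ridge regression, and what gradient flow from zero weights reaches
w = np.linalg.pinv(X) @ y

# transitive generalization: readout sign on all non-adjacent (untrained) pairs
correct = [np.sign(pair_features(i, j) @ w) == (1.0 if i < j else -1.0)
           for i, j in permutations(range(n_items), 2) if abs(i - j) > 1]
print(f"accuracy on non-adjacent pairs: {np.mean(correct):.2f}")
```

With this encoding the conjunctive units are silent on untrained pairs, so generalization is carried entirely by the additive weights, which the minimum-norm solution orders monotonically along the list; the readout margin also tends to grow with symbolic distance, echoing the naturalistic behavior mentioned in the quote.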
“…7b,e, green line). Our account may therefore be able to explain why previous studies [53,54] found empirically that feedforward ... neural networks generalize transitively. Surprisingly, we found that networks trained in the rich regime performed worse at TI.…”
Section: Neural Network With Adaptive Representations Deviate From Li...
confidence: 70%
“…To investigate potential brain mechanisms of transitive inference, it is also possible to train artificial neural networks on a given ordered list and observe their behavior [De Lillo et al., 2001, Kay et al., 2023, Nelli et al., 2023]. Trained networks often exhibit transitive inference over the learned lists, and their neural dynamics and representational geometry can be analyzed directly.…”
Section: Introduction
confidence: 99%
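In the spirit of the approach quoted above, the sketch below (an illustration under stated assumptions, not code from De Lillo et al., Kay et al., or Nelli et al.) trains a small tanh feedforward network on the adjacent pairs of a hypothetical 7-item list and reports choice accuracy on the non-adjacent pairs it never saw. How strongly such a network generalizes transitively depends on width, initialization scale (lazy versus rich regime), and training time, which is the contrast probed in the earlier quote.

```python
# Illustrative sketch: a small feedforward network trained on adjacent pairs
# of an ordered list, then probed on non-adjacent (transitive) pairs.
# Items, architecture, and hyperparameters are assumptions for the example.
import numpy as np

rng = np.random.default_rng(0)
n_items, n_hidden, lr, n_steps = 7, 64, 0.02, 20_000

def encode(i, j):
    # input: one-hot code of the left item concatenated with that of the right
    x = np.zeros(2 * n_items)
    x[i], x[n_items + j] = 1.0, 1.0
    return x

# training data: adjacent pairs in both orders; target +1 if the left item ranks higher
pairs = [(i, i + 1, +1.0) for i in range(n_items - 1)] + \
        [(i + 1, i, -1.0) for i in range(n_items - 1)]
X = np.array([encode(i, j) for i, j, _ in pairs])
y = np.array([t for _, _, t in pairs])

# one-hidden-layer tanh network, full-batch gradient descent on squared error
W1 = rng.normal(0.0, 0.1, (2 * n_items, n_hidden))
w2 = rng.normal(0.0, 0.1, n_hidden)
for _ in range(n_steps):
    h = np.tanh(X @ W1)                     # hidden activations
    err = h @ w2 - y                        # output error per training pair
    grad_w2 = h.T @ err / len(y)
    grad_W1 = X.T @ ((err[:, None] * w2) * (1.0 - h ** 2)) / len(y)
    w2 -= lr * grad_w2
    W1 -= lr * grad_W1

# evaluate on every non-adjacent pair (never presented during training)
correct = [np.sign(np.tanh(encode(i, j) @ W1) @ w2) == (1.0 if i < j else -1.0)
           for i in range(n_items) for j in range(n_items) if abs(i - j) > 1]
print(f"accuracy on non-adjacent pairs: {np.mean(correct):.2f}")
```

Sweeping the initialization scale of W1 and w2 (for example 0.1 versus 1.0) is one simple way to probe the lazy-versus-rich contrast raised in the quote about networks trained in the rich regime.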