2022
DOI: 10.3389/fncom.2022.836498
Symmetry-Based Representations for Artificial and Biological General Intelligence

Abstract: Biological intelligence is remarkable in its ability to produce complex behavior in many diverse situations through data efficient, generalizable, and transferable skill acquisition. It is believed that learning “good” sensory representations is important for enabling this, however there is little agreement as to what a good representation should look like. In this review article we are going to argue that symmetry transformations are a fundamental principle that can guide our search for what makes a good repr…

Cited by 31 publications (34 citation statements); References 115 publications
“…In the present context of dynamical systems, propagators correspond to a group of symmetries acting upon the set of distributions of states in the system. This perspective highlights connections to recent work in unsupervised learning seeking to extract disentangled representations from a given source of data [52, 53], which coalesced around the concept of identifying independent symmetries within a dataset [54, 55, 56]. With respect to our work, each of these symmetries would be identified with a particular generator and associated grid module, which could then be generatively composed in the EHC architectures we have outlined.…”
Section: Discussion
confidence: 78%
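The notion of independent symmetries, each identified with its own generator, can be illustrated with a toy sketch. This is a minimal illustration only, assuming two commuting translation generators; the function names and the link to "grid modules" are illustrative, not taken from the cited architectures:

```python
import numpy as np

# Toy sketch: two independent symmetry generators (x- and y-translation)
# acting on a 2-D state. In the quoted proposal, each independent symmetry
# would correspond to one generator (and associated grid module).
def translate_x(state, dx):
    return state + np.array([dx, 0.0])

def translate_y(state, dy):
    return state + np.array([0.0, dy])

s = np.zeros(2)
# Independent generators commute, so they compose in either order:
a = translate_y(translate_x(s, 1.0), 2.0)
b = translate_x(translate_y(s, 2.0), 1.0)
assert np.allclose(a, b)  # both orders reach the state [1., 2.]
```

Because the two generators act on disjoint components of the state, composing them "generatively" in any order yields the same result, which is the sense in which they are independent.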
“…In machine learning, the ability to represent a task-relevant feature invariant to others in the environment can lead to efficient performance of high-level tasks. For instance, object representations that are invariant to illumination or orientation improve object recognition performance (57). In broader cognitive operations, concept abstraction allows behavior to be more flexible by adapting to different scenarios and transferring to new tasks.…”
Section: Discussion
confidence: 99%
“…Learning invariant representations can be accomplished by decomposing inputs into separate generative factors, such that each factor’s representation lies in its own subspace that is invariant to transformations defined by the others. This process is referred to as ‘untangling’ or ‘disentangling’ and has been identified as part of the solution to many complex tasks in neuroscience and machine learning (60, 61, 62, 63, 64). For instance, disentangled representations of cue valence and response direction in our task would mean that the transformation of moving in the neural activity space from left to right motor responses should not affect the representation of cue valence, which is consistent with our data.…”
Section: Discussion
confidence: 99%
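The subspace invariance described in this quote can be sketched in a toy example. This is a hypothetical illustration, assuming a 4-dimensional disentangled code in which the first two dimensions encode cue valence and the last two encode response direction (the layout and names are assumptions for illustration):

```python
import numpy as np

# Hypothetical disentangled latent: dims 0-1 encode cue valence,
# dims 2-3 encode response direction (layout is illustrative).
z = np.array([0.9, -0.3, 0.0, 1.0])
valence = z[:2]

def flip_response(latent):
    """Apply a transformation only to the response-direction subspace
    (e.g. moving from a left to a right motor response)."""
    out = latent.copy()
    out[2:] = -out[2:]
    return out

z2 = flip_response(z)
# In a disentangled code, the valence subspace is invariant to this move:
assert np.allclose(z2[:2], valence)
```

The assertion captures the claim in the quote: transforming the response-direction factor traverses the neural activity space without changing the representation of cue valence.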
“…The distributions are referred to as the factors f_t in this work. The reason we model the factors as distributions is that the loss function of variational autoencoders [26] has been shown to encourage disentanglement of the separate factors in each distribution [3, 16-18].…”
Section: Biological Data
confidence: 99%
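As a minimal sketch of the kind of loss function referred to here, a beta-VAE-style objective (assuming a Gaussian reconstruction term and a standard-normal prior; the beta > 1 weighting of the KL term is what has been argued to encourage disentangled factors) might look like:

```python
import numpy as np

def beta_vae_loss(x, x_recon, mu, log_var, beta=4.0):
    """Per-sample beta-VAE-style objective (illustrative sketch):
    reconstruction error plus a beta-weighted KL divergence between
    the approximate posterior N(mu, exp(log_var)) and the prior N(0, I)."""
    recon = np.sum((x - x_recon) ** 2)  # Gaussian reconstruction term
    # Closed-form KL(N(mu, sigma^2) || N(0, 1)), summed over latent dims:
    kl = 0.5 * np.sum(np.exp(log_var) + mu ** 2 - 1.0 - log_var)
    return recon + beta * kl
```

With perfect reconstruction and a posterior equal to the prior (mu = 0, log_var = 0) the loss is zero; pushing the posterior away from the prior is penalized beta times more strongly than in a plain VAE, which is the pressure associated with disentangling the factors.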