2021
DOI: 10.48550/arxiv.2111.01760
Preprint
Increasing Liquid State Machine Performance with Edge-of-Chaos Dynamics Organized by Astrocyte-modulated Plasticity

Abstract: The liquid state machine (LSM) combines low training complexity and biological plausibility, which has made it an attractive machine learning framework for edge and neuromorphic computing paradigms. Originally proposed as a model of brain computation, the LSM tunes its internal weights without backpropagation of gradients, which results in lower performance compared to multi-layer neural networks. Recent findings in neuroscience suggest that astrocytes, a long-neglected non-neuronal brain cell, modulate synapt…
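The core idea the abstract describes — a fixed, randomly connected recurrent "liquid" whose internal weights are never trained by backpropagation, with learning confined to a linear readout — can be sketched in a few lines. This is a rate-based reservoir sketch under assumed sizes and scaling, not the paper's spiking, astrocyte-modulated model; the spectral-radius scaling stands in for the "edge-of-chaos" tuning that the paper attributes to astrocyte-modulated plasticity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes; the paper's reservoir is spiking and larger.
n_in, n_res, n_out, T = 3, 100, 2, 200

# Fixed random input and recurrent weights: the "liquid" is never trained.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.normal(0.0, 1.0, (n_res, n_res))
# Scale the recurrent matrix so its spectral radius sits just below 1,
# a common stand-in for operating near the edge of chaos.
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

u = rng.uniform(-1, 1, (T, n_in))       # toy input stream
x = np.zeros(n_res)
states = np.empty((T, n_res))
for t in range(T):
    x = np.tanh(W_in @ u[t] + W @ x)    # reservoir state update
    states[t] = x

# Only the linear readout is fit (ridge regression) -- no backpropagation
# through the reservoir, matching the low training complexity the
# abstract highlights.
y_target = rng.normal(size=(T, n_out))  # placeholder targets
ridge = 1e-2
W_out = np.linalg.solve(states.T @ states + ridge * np.eye(n_res),
                        states.T @ y_target).T
y_pred = states @ W_out.T
```

Because the reservoir is fixed, training reduces to one linear solve over the collected states, which is what makes the LSM attractive for edge and neuromorphic hardware.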

Year Published: 2023
Cited by 1 publication (1 citation statement)
References: 66 publications
“…To realize the full potential of this framework, described here are several topics for future research, including: incorporating cell-type specific neuromodulatory signals (Liu et al, 2021) into the learning process; exploring the addition of glial cell dynamics (Gordleeva et al, 2021; Ivanov and Michmizos, 2021); providing deeper insight into the learning capabilities of different plasticity rules in the neuroscience literature, such as the wide range of existing voltage-dependent plasticity rules, rate-based plasticity rules, and spike-timing dependent plasticity rules; and exploring the use of this framework on robotic and reinforcement learning experiments. Another direction might explore learning the neural architecture in conjunction with the plasticity parameters, since architecture is known to play a significant role in the function of neural dynamics (Gaier and Ha, 2019).…”
Section: Discussion
confidence: 99%
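The citation statement names spike-timing dependent plasticity (STDP) among the plasticity rules worth deeper study. As a point of reference, the classic pair-based STDP window can be written down directly; the amplitudes and time constants below are assumed illustrative values, not parameters from either paper.

```python
import numpy as np

# Pair-based STDP: the weight change depends on the timing difference
# between a post-synaptic and a pre-synaptic spike.
A_plus, A_minus = 0.01, 0.012     # assumed potentiation/depression amplitudes
tau_plus, tau_minus = 20.0, 20.0  # assumed time constants (ms)

def stdp_dw(dt):
    """Weight change for dt = t_post - t_pre (ms).

    Pre-before-post (dt > 0) potentiates; post-before-pre depresses,
    each decaying exponentially with the timing gap.
    """
    if dt > 0:
        return A_plus * np.exp(-dt / tau_plus)
    return -A_minus * np.exp(dt / tau_minus)
```

A pairing such as `stdp_dw(10.0)` yields a positive (potentiating) change, while `stdp_dw(-10.0)` is negative; in an astrocyte-modulated scheme, signals like these would be further gated or scaled by glial state.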