2020
DOI: 10.1038/s41598-020-63755-5

Brain experiments imply adaptation mechanisms which outperform common AI learning algorithms

Abstract: Attempting to imitate the brain's functionalities, researchers have bridged between neuroscience and artificial intelligence for decades; however, experimental neuroscience has not directly advanced the field of machine learning (ML). Here, using neuronal cultures, we demonstrate that increased training frequency accelerates the neuronal adaptation processes. This mechanism was implemented on artificial neural networks, where a local learning step-size increases for coherent consecutive learning steps, and tes…



Cited by 9 publications (5 citation statements)
References 19 publications
“…Accelerated strategy: power-law with many epochs. An accelerated BP method is based on a recent bridge between experimental neuroscience and advanced artificial-intelligence learning algorithms, in which an increased training frequency significantly accelerates neuronal adaptation processes 24 . This accelerated brain-inspired mechanism involves a time-dependent step-size, η_t, associated with each weight, such that coherent consecutive gradients of a weight, that is, gradients with the same sign, increase the conjugate η.…”
Section: Results
confidence: 99%
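The sign-coherence rule described in this excerpt can be sketched as follows. This is a minimal illustration, not the paper's exact schedule: the excerpt refers to a power-law time dependence of η_t, whereas the multiplicative factors, function name, and toy values below are assumptions chosen for clarity.

```python
import numpy as np

def coherent_step_update(eta, grad, prev_grad, up=1.1, down=0.5):
    # Per-weight adaptive step-size: when two consecutive gradients of a
    # weight share the same sign ("coherent"), its conjugate step-size
    # eta grows; on a sign flip it shrinks. The factors `up` and `down`
    # are illustrative placeholders, not values from the paper.
    coherent = np.sign(grad) == np.sign(prev_grad)
    return np.where(coherent, eta * up, eta * down)

# Toy usage on three weights
eta = np.full(3, 0.01)
g_prev = np.array([0.2, -0.1, 0.3])
g_now = np.array([0.1, 0.4, 0.2])  # signs agree at indices 0 and 2
eta = coherent_step_update(eta, g_now, g_prev)
```

The per-weight (rather than global) η is the key point: each weight accelerates or decelerates independently according to the recent coherence of its own gradient history.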
“…1a). The presented dataset of examples for the algorithm involves the following initial preprocessing steps (see Supplementary Appendix A): (a) Balanced set of examples: the small dataset consists of an equal number of random examples per label 24 . (b) Input bias: the bias of each example is subtracted and the standard deviation of its 784 pixels is normalized to unity.…”
confidence: 99%
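The two preprocessing steps quoted above can be sketched like this; the data shapes, label layout, and helper names are assumptions for illustration (784 pixels suggests MNIST-style 28×28 inputs), not the paper's code.

```python
import numpy as np

rng = np.random.default_rng(0)

def balanced_subset(images, labels, per_label):
    # (a) Balanced set: an equal number of random examples per label.
    idx = np.concatenate([
        rng.choice(np.flatnonzero(labels == c), size=per_label, replace=False)
        for c in np.unique(labels)
    ])
    return images[idx], labels[idx]

def standardize(images):
    # (b) Input bias: subtract each example's mean ("bias") and scale
    # its 784 pixels to unit standard deviation.
    x = images.reshape(len(images), -1).astype(float)
    x -= x.mean(axis=1, keepdims=True)
    x /= x.std(axis=1, keepdims=True)
    return x

# Toy MNIST-shaped stand-in (random data; the real dataset is not reproduced here)
images = rng.random((100, 28, 28))
labels = np.repeat(np.arange(10), 10)
xb, yb = balanced_subset(images, labels, per_label=2)
x = standardize(xb)
```

Interpreting "bias" as the per-example pixel mean is an assumption; the effect is that every flattened example ends up with zero mean and unit standard deviation across its 784 pixels.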
“…Another possible mechanism is the addition of a super-linear number of cross-weights to the filters. This represents a biological realization, because cross-weights arise as a byproduct of dendritic nonlinear amplification 17,29,34,35 . Nevertheless, these possible enhanced ρ mechanisms significantly increase computational complexity; they are mentioned for their potential biological relevance, their limited number of layers, and the natural emergence of many cross-weights.…”
Section: Discussion
confidence: 99%
“…Whether in its early days or now, imitating human cognitive principles has always been a founding aim of deep learning (Bezdek 1992; Sardi et al 2020). Initially, the fully connected edges designed into artificial neural networks ideally mimicked the numerous dendrites of nerve cells.…”
Section: Human Cognitive Modeling
confidence: 99%