2020
DOI: 10.1007/s00500-020-05302-y
Artificial neural networks training acceleration through network science strategies

Abstract: The development of deep learning has led to a dramatic increase in the number of applications of artificial intelligence. However, the training of deeper neural networks for stable and accurate models translates into artificial neural networks (ANNs) that become unmanageable as the number of features increases. This work extends our earlier study where we explored the acceleration effects obtained by enforcing, in turn, scale freeness, small worldness, and sparsity during the ANN training process. The efficien…
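The sparsity enforcement mentioned in the abstract can be illustrated with a minimal sketch: prune a weight vector to a fixed density and re-apply the resulting mask after every gradient step. This is an illustrative toy (the `sparsify` helper, the 20% density, and the single-layer regression setup are assumptions for demonstration, not the authors' method):

```python
import numpy as np

rng = np.random.default_rng(0)

def sparsify(w, density=0.2):
    """Keep only the largest-magnitude fraction `density` of weights; zero the rest."""
    k = max(1, int(density * w.size))
    threshold = np.sort(np.abs(w), axis=None)[-k]
    mask = (np.abs(w) >= threshold).astype(w.dtype)
    return w * mask, mask

# Toy single-layer regression trained with gradient descent,
# re-applying the sparsity mask after every update so the
# network stays sparse throughout training.
X = rng.normal(size=(64, 10))
true_w = rng.normal(size=(10, 1))
y = X @ true_w

w = rng.normal(size=(10, 1))
w, mask = sparsify(w)
for _ in range(200):
    grad = X.T @ (X @ w - y) / len(X)
    w -= 0.1 * grad
    w *= mask  # enforce sparsity during training, not just after it

print(np.count_nonzero(w))  # at most 20% of the 10 weights remain nonzero
```

The key point is that the mask is applied inside the training loop, so the optimizer never reintroduces pruned connections.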

Cited by 8 publications (4 citation statements).
References 22 publications (27 reference statements).
“…To ensure the success of the training, different parts of the data must be examined. The training efficiency of neural network models is dependent on the data and the learning algorithm adopted [47,48]. Advanced learning algorithms effectively transfer intelligence to the ANN from historical facts or data.…”
Section: Training Optimization
confidence: 99%
“…The Activation Function, another fundamental element of an ANN, is a linear or nonlinear function that maps a variable into a different dimension. In an ANN, the Activation Function operates on the input data of the artificial neuron and produces the corresponding net output [13][14][15][16][17][18][19][20].…”
Section: B. Artificial Neural Networks (Yapay Sinir Ağları)
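The activation-function behavior described in the statement above can be sketched as a single artificial neuron: a weighted sum of the inputs plus a bias, passed through a nonlinear function. The names and values here are illustrative assumptions, not taken from the cited paper:

```python
import numpy as np

def neuron_output(inputs, weights, bias, activation=np.tanh):
    """Compute a neuron's net input (weighted sum + bias),
    then map it through the activation function."""
    net = np.dot(inputs, weights) + bias
    return activation(net)

x = np.array([0.5, -1.0, 2.0])   # input data
w = np.array([0.8, 0.2, -0.5])   # connection weights
out = neuron_output(x, w, 0.1)   # tanh of the net input (-0.7 here)
```

Swapping `activation` for `lambda z: z` gives a purely linear neuron, which is the linear/nonlinear distinction the quoted passage draws.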
“…Two main, partially overlapping groups of papers can be distinguished within it. The first group discusses optimization problems and algorithms, including local, global, and multi-criteria optimization (see Franchini et al. 2020; Žilinskas and Litvinas 2020; Nesterov 2020; Posypkin et al. 2020; Shao et al. 2020; De Leone et al. 2020; Capuano et al. 2020; Lančinskas et al. 2020; Sergeyev et al. 2020; Crisci et al. 2020; Cavallaro et al. 2020; Astorino and Fuduli 2020; Candelieri et al. 2020). The second group of papers (see D'Alotto 2020; Pepelyshev and Zhigljavsky 2020; Falcone et al. 2020; Amodio et al. 2020; Gangle et al. 2020; De Leone et al. 2020; Astorino and Fuduli 2020) deals with problems and algorithms using the already mentioned recent computational framework allowing one to work with different infinities and infinitesimals numerically.…”
confidence: 99%