2020
DOI: 10.1007/s00521-020-05136-7

Sparse evolutionary deep learning with over one million artificial neurons on commodity hardware

Abstract: Artificial neural networks (ANNs) have emerged as a hot topic in the research community. Despite their success, it is challenging to train and deploy modern ANNs on commodity hardware due to ever-increasing model sizes and the unprecedented growth in data volumes. Microarray data in particular, with its very high dimensionality and small number of samples, is difficult for machine learning techniques to handle. Furthermore, specialized hardware such as graphics processing units (GPUs) is expensive…

Cited by 42 publications (45 citation statements)
References 47 publications (54 reference statements)
“…The efficiency of this approach has also been recently confirmed by independent researchers, who managed to train a million-node ANN on non-specialized laptops (Liu et al 2019).…”
Section: Introduction (mentioning)
confidence: 77%
“…A special mention goes to recent research in Liu et al (2019), where the authors managed to train a million-node ANN on non-specialized laptops, based on the SET framework that was initially introduced in Mocanu et al (2018). SET is a training procedure in which connections are pruned on the basis of their magnitude, while other connections are randomly added.…”
Section: Methods Derived From Network Science to Induce Sparse ANNs (mentioning)
confidence: 99%
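The prune-and-regrow cycle this statement summarizes is simple enough to sketch. The following is a minimal illustration of one SET topology-update step on a single layer, not the authors' actual code; the pruning fraction `zeta`, the dense mask representation, and the re-initialization scale are all assumptions made for clarity:

```python
import numpy as np

def set_topology_update(weights, mask, zeta=0.3, rng=None):
    """One SET evolution step: prune the zeta fraction of smallest-magnitude
    active connections, then regrow the same number of connections at
    randomly chosen empty positions. Dense arrays are used purely for
    clarity; zeta=0.3 is an assumed default, not the papers' setting."""
    if rng is None:
        rng = np.random.default_rng()
    active = np.flatnonzero(mask)            # flat indices of live connections
    n_prune = int(zeta * active.size)

    # Prune: remove the n_prune weakest connections by magnitude.
    weakest = active[np.argsort(np.abs(weights.flat[active]))[:n_prune]]
    mask.flat[weakest] = 0
    weights.flat[weakest] = 0.0

    # Regrow: add the same number of connections at random empty slots,
    # so the total parameter count stays constant across epochs.
    empty = np.flatnonzero(mask == 0)
    regrow = rng.choice(empty, size=n_prune, replace=False)
    mask.flat[regrow] = 1
    weights.flat[regrow] = rng.normal(0.0, 0.01, size=n_prune)
    return weights, mask
```

Because pruning and regrowth only move connections around rather than adding them, the number of active weights is constant across epochs, which is what keeps memory bounded during training.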
“…We extended the sparse framework presented in Liu et al (2020b) by implementing the theoretical contributions presented in this paper. The initial implementation was sequential and it was not able to obtain the same accuracy as Keras on some datasets such as CIFAR10.…”
Section: Large Scale Sparse Neural Network Framework (mentioning)
confidence: 99%
“…In Liu et al (2020b), the authors developed an efficient implementation of sparse multilayer perceptrons (MLPs) trained with SET. For the first time, they built sparse MLP models with over one million artificial neurons on commodity hardware, only utilising one CPU core.…”
Section: Introduction (mentioning)
confidence: 99%
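As a rough illustration of why a million-neuron sparse MLP fits on a single CPU core, storing each layer's weights in a SciPy CSR matrix makes both storage and the forward pass scale with the number of connections rather than with the dense dimensions. The layer sizes and density below are assumed for illustration and are not the paper's configuration:

```python
import numpy as np
from scipy import sparse

rng = np.random.default_rng(0)

# Illustrative sizes: a 20,000 x 1,000,000 dense weight matrix would need
# ~2e10 entries, but at the assumed density the sparse version is tiny.
n_in, n_hidden = 20_000, 1_000_000
density = 1e-5  # assumed sparsity level, for illustration only

W = sparse.random(n_hidden, n_in, density=density, format="csr",
                  random_state=0, data_rvs=rng.standard_normal)

def forward(x):
    """Sparse forward pass: cost is proportional to the number of
    nonzero connections, not to n_in * n_hidden."""
    return np.maximum(W @ x, 0.0)  # ReLU activation

x = rng.standard_normal(n_in)
h = forward(x)
print(W.nnz, "connections ->", h.shape, "hidden activations")
```

At the assumed density this layer holds on the order of 2x10^5 weights instead of the 2x10^10 a dense matrix would require, which is why such models remain trainable in commodity RAM.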
“…In recent years, a rapidly increasing number of research works have been investigating sparse training of DNNs [112][113][114][115][116].…”
Section: 2 (mentioning)
confidence: 99%