2019 IEEE International Conference on Big Data (Big Data)
DOI: 10.1109/bigdata47090.2019.9006239
Evolving Energy Efficient Convolutional Neural Networks

Cited by 9 publications (5 citation statements)
References 20 publications
“…We believe that the incredible sparsity of biological connectomes (3.2% for the C. elegans, Cook et al (2019)) will significantly decrease the amount of energy necessary to deploy deep learning models, perhaps through the use of neuromorphic hardware (Young et al (2019); Zhu et al (2020); Schuman et al (2022)). These connectomes may lead to more robust and resilient neural systems that sidestep many of the adversarial drawbacks of highly general network structures (Guo et al (2018); Schuman et al (2020)).…”
Section: Discussion and Future Work (mentioning)
confidence: 99%
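To make the sparsity figure in the statement above concrete, here is a minimal sketch (not taken from the cited work) showing how a connectome-like density of roughly 3.2% could be imposed on a dense weight matrix with a random binary mask; the layer shape and the NumPy-based masking are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dense layer: 256 inputs -> 128 outputs.
weights = rng.standard_normal((128, 256))

# Binary mask keeping ~3.2% of connections, mirroring the
# C. elegans connectome density quoted in the citing paper.
density = 0.032
mask = rng.random(weights.shape) < density

sparse_weights = weights * mask
print(f"kept {mask.mean():.1%} of {weights.size} connections")
```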
“…As such, it is important to train with those objectives in mind. In previous work, we have extended the Bayesian optimization approach [5], [9] and the fitness function used within MENNDL [50] to incorporate multiple objectives. In future work, we plan to apply this approach to the Whetstone algorithm in order to optimize networks that are not only more accurate but also more efficient.…”
Section: Discussion (mentioning)
confidence: 99%
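The multi-objective idea referenced above can be illustrated with a small sketch. The weighted scalarization below, and the choice of objectives (accuracy plus parameter count as an energy proxy), are assumptions for illustration; they do not reproduce the actual MENNDL fitness function.

```python
def multi_objective_fitness(accuracy: float, num_params: int,
                            w_acc: float = 0.7, w_eff: float = 0.3,
                            param_budget: int = 5_000_000) -> float:
    """Weighted scalarization of accuracy and an efficiency proxy.

    Illustrative only: parameter count stands in for energy cost,
    and the weights are arbitrary.
    """
    efficiency = max(0.0, 1.0 - num_params / param_budget)
    return w_acc * accuracy + w_eff * efficiency

# Example: a 92%-accurate network with 2M parameters.
print(multi_objective_fitness(0.92, 2_000_000))
```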
“…Firstly, it directly relates to the model's size in storage, making it crucial for deployment in storage-constrained devices like smartphones or embedded devices [33], [45]. Secondly, a model with fewer parameters is computationally efficient in training and inference, making it more time-efficient [46]. Additionally, lightweight models are more energy-efficient, a critical factor for battery-powered devices [47].…”
Section: Datasets and Evaluation Metrics: A. Efficiency Evaluation Metrics (mentioning)
confidence: 99%
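As a concrete illustration of the parameter-count metric discussed in the statement above, the sketch below counts the trainable parameters of a small CNN in PyTorch; the architecture is an arbitrary example, not one from the cited papers.

```python
import torch.nn as nn

# Arbitrary small CNN used only to demonstrate the metric.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.Conv2d(16, 32, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(32, 10),
)

# Parameter count is a rough proxy for storage size and,
# indirectly, for compute and energy cost at inference time.
num_params = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(f"{num_params} trainable parameters "
      f"(~{num_params * 4 / 1e6:.2f} MB at float32)")
```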