2021
DOI: 10.3233/ida-194890

Energy modeling of Hoeffding tree ensembles

Abstract: Energy consumption reduction has been an increasing trend in machine learning over the past few years due to its socio-ecological importance. In new challenging areas such as edge computing, energy consumption and predictive accuracy are key variables during algorithm design and implementation. State-of-the-art ensemble stream mining algorithms are able to create highly accurate predictions at a substantial energy cost. This paper introduces the nmin adaptation method to ensembles of Hoeffding tree algorithms, …
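
For context, the snippet below sketches how a Hoeffding tree ensemble is typically trained on a data stream, here using the river library (illustrative context, not the authors' code or experimental setup). In river, the grace_period parameter plays the role of nmin: the number of instances a leaf observes between split attempts.

```python
# Sketch: a bagging ensemble of Hoeffding trees on a data stream,
# using the `river` library. Illustrative only; not the paper's code.
from river import datasets, ensemble, metrics, tree

# `grace_period` is river's analogue of nmin: how many instances a
# leaf accumulates before it re-evaluates candidate splits.
base = tree.HoeffdingTreeClassifier(grace_period=200)
model = ensemble.BaggingClassifier(model=base, n_models=10, seed=42)

metric = metrics.Accuracy()
for x, y in datasets.Phishing():      # small built-in binary stream
    y_pred = model.predict_one(x)     # test-then-train evaluation
    if y_pred is not None:
        metric.update(y, y_pred)
    model.learn_one(x, y)

print(metric)
```

Each of the ensemble's trees pays the cost of statistics updates and periodic split checks per instance, which is where the energy overhead of ensembles over single trees comes from.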

Cited by 7 publications (11 citation statements) · References 32 publications (44 reference statements)

“…In [9], the authors presented nmin adaptation, an energy-efficient approach to real-time prediction with high accuracy, which reduces the energy consumption of Hoeffding tree ensembles by adapting the number of instances required for a node split. This method can reduce energy consumption by 21% on average with a small impact on accuracy.…”
Section: Related Work
confidence: 99%
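
To make the mechanism concrete, here is a minimal sketch of the general idea behind adapting nmin (the names and structure are illustrative assumptions, not the authors' implementation): rather than re-checking for a split every fixed nmin instances, a leaf uses the Hoeffding bound to estimate how many instances would be needed before the observed gain gap could satisfy the bound, and defers the next (expensive) split check until then.

```python
# Illustrative sketch of nmin adaptation (not the paper's exact code).
import math

def hoeffding_epsilon(value_range: float, delta: float, n: int) -> float:
    """Hoeffding bound: with prob. 1 - delta, the true mean lies within
    epsilon of the sample mean after n observations."""
    return math.sqrt((value_range ** 2) * math.log(1.0 / delta) / (2.0 * n))

def adapted_nmin(gain_gap: float, value_range: float, delta: float,
                 default_nmin: int = 200) -> int:
    """Estimate how many instances a leaf needs before the gap between
    the two best attributes' gains could exceed the Hoeffding bound.
    `gain_gap` is G(best) - G(second best) on the current sample."""
    if gain_gap <= 0.0:
        return default_nmin  # no separation yet; fall back to the default
    # Solve epsilon(n) <= gain_gap for n:
    n = (value_range ** 2) * math.log(1.0 / delta) / (2.0 * gain_gap ** 2)
    return max(1, math.ceil(n))

# Example: a small gain gap means many more instances are needed before
# a confident split, so the next split check is deferred accordingly.
print(adapted_nmin(gain_gap=0.05, value_range=1.0, delta=1e-7))
```

Deferring split checks this way saves the repeated evaluation of candidate splits, which is one of the more energy-intensive operations in a Hoeffding tree.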
“…Finally, in essence, mini-batching consists of applying a loop-exchange optimization: iterating over the ensemble members in the outermost loop and over the data instances in the innermost loop. Because mini-batching does not change the implementation of the individual learners, it can wrap around other individual learners, along with their algorithm-specific optimizations (e.g., those in [19], [9], [17], [16]). This, however, is out of this work's scope.…”
Section: Dynamic Power Management Strategies
confidence: 99%
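
A minimal sketch of that loop exchange (hypothetical learn_one interface; not the cited implementation): the baseline updates every learner once per instance, whereas the mini-batched version fixes one learner and feeds it a whole batch, so each learner's state stays cache-resident across the batch.

```python
# Illustrative sketch of mini-batching as a loop exchange
# (hypothetical `learn_one` interface; not the cited implementation).

def train_instance_at_a_time(learners, stream):
    # Baseline: inner loop over learners -> every instance touches the
    # state of every learner, repeatedly evicting it from cache.
    for x, y in stream:
        for learner in learners:
            learner.learn_one(x, y)

def train_mini_batched(learners, stream, batch_size=50):
    # Loop exchange: outer loop over learners, inner loop over the
    # batch -> each learner's state stays cache-resident for the whole
    # batch, reducing memory traffic (and thus energy).
    batch = []
    for x, y in stream:
        batch.append((x, y))
        if len(batch) == batch_size:
            for learner in learners:       # outermost: learners
                for bx, by in batch:       # innermost: instances
                    learner.learn_one(bx, by)
            batch.clear()
    # Flush any remaining instances.
    for learner in learners:
        for bx, by in batch:
            learner.learn_one(bx, by)
```

The per-instance results are identical for order-independent learners; only the memory-access pattern changes, which is why the optimization composes with learner-internal techniques such as nmin adaptation.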
“…It relies on the fact that an attribute that is optimal for splitting can frequently be chosen from a small sample. The basic idea is derived from the Hoeffding bound [25].…”
Section: Hoeffding Tree
confidence: 99%
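
For reference, the Hoeffding bound cited here states that, for a real-valued random variable with range R, after n independent observations the true mean deviates from the sample mean by more than ε with probability at most δ, where

```latex
% Hoeffding bound as used in Hoeffding tree split decisions:
% with probability 1 - \delta, the true mean of a random variable
% with range R lies within \epsilon of the mean of n observations.
\epsilon = \sqrt{\frac{R^{2}\,\ln(1/\delta)}{2n}}
```

A leaf then splits on the best attribute once the observed gain difference between the two best attributes exceeds ε, guaranteeing (with probability 1 − δ) that the same attribute would have been chosen on the full stream.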