2015
DOI: 10.1016/j.neucom.2014.11.077
Optimal resource usage in ultra-low-power sensor interfaces through context- and resource-cost-aware machine learning

Abstract: This paper introduces an approach that combines machine learning and adaptive hardware to improve the efficiency of ultra-low-power sensor interfaces. Adaptive feature extraction circuits are assisted by hardware-embedded training to dynamically activate only the most relevant features. This selection is done in a context- and power-cost-aware manner, through modification of the C4.5 algorithm. As proof-of-principle, a Voice Activity Detector illustrates the context-dependent relevance of features, demonstrating a…
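The cost-aware modification of C4.5 described in the abstract can be pictured as a split criterion in which a feature's information gain is penalized by the power cost of activating its extraction circuit. The sketch below is only illustrative: the function names, the linear penalty with weight `lam`, and the per-feature `power_cost` dictionary are assumptions, not the paper's exact formulation.

```python
# Hypothetical sketch of a power-cost-aware split criterion in the spirit of
# the paper's modified C4.5: the usual information gain of a candidate split
# is penalized by the (normalized) power cost of the feature's extraction
# circuit, so the tree prefers cheap features that are nearly as informative.
import math
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def info_gain(rows, labels, feature, threshold):
    left = [l for r, l in zip(rows, labels) if r[feature] <= threshold]
    right = [l for r, l in zip(rows, labels) if r[feature] > threshold]
    if not left or not right:
        return 0.0
    n = len(labels)
    return (entropy(labels)
            - (len(left) / n) * entropy(left)
            - (len(right) / n) * entropy(right))

def best_cost_aware_split(rows, labels, power_cost, lam=1.0):
    """Pick (feature, threshold) maximizing gain minus lam * normalized cost.

    power_cost: dict feature -> activation cost of its extraction circuit
    (assumed given per hardware feature); lam trades accuracy for power.
    """
    best, best_score = None, -float("inf")
    max_cost = max(power_cost.values())
    for f in power_cost:
        for t in sorted({r[f] for r in rows}):
            score = info_gain(rows, labels, f, t) - lam * power_cost[f] / max_cost
            if score > best_score:
                best_score, best = score, (f, t)
    return best
```

With two equally constant-or-separating features, the criterion picks the cheaper one: given labels perfectly split by a low-cost feature "f1" and an uninformative high-cost "f2", `best_cost_aware_split` returns a threshold on "f1".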

Cited by 6 publications
(4 citation statements)
References 22 publications
“…The extracted feature subset, af 5 -af 12 , is passed on to the on-chip classifier (Fig. 5) while the complete feature-set af 1 -af 16 can be passed on to an off-chip ADC for more complex information extraction, such as context-change detection and retraining the classifier as in [22].…”
Section: Decision Tree Based Classifier
confidence: 99%
“…-af 12 . The selected feature (sf i ) is then compared with a reference voltage (Vref i ) determined by a modified C4.5 machine learning algorithm [22], generating the output decision b i of each node: ,…”
Section: Decision Tree Based Classifier
confidence: 99%
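The citing work's node operation (compare a selected feature sf_i against a trained reference voltage Vref_i to produce the binary decision b_i) can be modeled in software as a chain of comparators. This is a minimal sketch under assumed conventions: the node table layout, feature indices, and class labels below are hypothetical, not taken from the cited chip.

```python
# Hypothetical software model of the on-chip decision-tree node described
# above: each node selects one analog feature sf_i and compares it to a
# trained reference voltage Vref_i, producing a comparator bit b_i that
# steers the path to the next node or to a leaf (class label).

def eval_tree(features, nodes, root=0):
    """features: dict feature_index -> measured value.
    nodes: dict node_id -> (feat_idx, vref, low_child, high_child);
    a child id absent from `nodes` is treated as a leaf class label."""
    node = root
    while node in nodes:
        feat_idx, vref, low, high = nodes[node]
        b = 1 if features[feat_idx] > vref else 0  # comparator output b_i
        node = high if b else low
    return node
```

For example, a two-node tree on (hypothetical) features 5 and 8 routes an input with both features above their references to the "voice" leaf, and an input failing the first comparison straight to "silence".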