2017 IEEE Global Conference on Signal and Information Processing (GlobalSIP)
DOI: 10.1109/globalsip.2017.8309078

Context-aware feature query to improve the prediction performance

Cited by 4 publications (16 citation statements)
References 9 publications

“…Furthermore, the devised method uses backpropagation of gradients and binary representation layers in neural networks to address the computational load as well as scalability concerns. In an earlier work [22], we introduced the idea of sensitivity analysis as a method for dynamic feature selection. However, in this paper, we extend the idea by considering feature acquisition costs, introducing improvements such as feature encoding, and conducting more detailed experiments and analysis.…”
Section: Related Work (mentioning, confidence: 99%)
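The cost-aware extension described in this statement can be sketched as a simple ranking rule: divide each unknown feature's estimated sensitivity by its acquisition cost and query the best-scoring feature next. This is a minimal sketch of that idea, not the cited papers' exact method; the function name and the toy numbers below are hypothetical.

```python
import numpy as np

def rank_features_to_query(sensitivities, costs, acquired):
    """Rank not-yet-acquired features by estimated sensitivity per unit cost.

    sensitivities : per-feature sensitivity estimates (higher = more informative)
    costs         : per-feature acquisition costs (all > 0)
    acquired      : boolean mask, True where the feature value is already known
    """
    score = sensitivities / costs        # expected benefit per unit of cost
    score[acquired] = -np.inf            # never re-query features we already have
    return np.argsort(score)[::-1]       # best acquisition candidates first

# Toy usage: feature 2 offers the best sensitivity-to-cost trade-off.
sens = np.array([0.10, 0.45, 0.30, 0.05])
cost = np.array([1.0, 5.0, 1.5, 0.5])
known = np.array([True, False, False, False])
print(rank_features_to_query(sens, cost, known))   # -> [2 3 1 0]
```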
“…Additionally, in this table, the area under the accuracy-cost curves (AUACC) of the proposed feature query method as well as the AUACC of randomly asking for the unknown features are presented. We have also included AUACC results of a cost-aware version of the method suggested in [22] in which sensitivities are normalized by feature costs (see the DPFQ column). It is worth mentioning that AUACC values are calculated as the normalized area under the accuracy versus the acquisition cost curve from a cost of zero to the cost at which the accuracy converges to the maximum accuracy.…”
Section: Performance Evaluation (mentioning, confidence: 99%)
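The AUACC definition quoted above (normalized area under the accuracy-versus-acquisition-cost curve, integrated from zero cost up to the cost at which accuracy converges to its maximum) can be approximated with a short helper. The sketch below is an assumption about that computation, using trapezoidal integration and a convergence tolerance; the function name, tolerance, and normalization choice are illustrative, not taken from the cited work.

```python
import numpy as np

def auacc(cum_costs, accuracies, tol=1e-3):
    """Normalized area under an accuracy-vs-acquisition-cost curve.

    cum_costs  : increasing cumulative acquisition costs, starting at 0
    accuracies : accuracy obtained after spending each cumulative cost
    tol        : accuracy is treated as converged once within tol of its maximum
    """
    c = np.asarray(cum_costs, dtype=float)
    a = np.asarray(accuracies, dtype=float)
    end = int(np.argmax(a >= a.max() - tol))           # first point where accuracy has converged
    c, a = c[: end + 1], a[: end + 1]
    area = np.trapz(a, c)                               # trapezoidal integration from cost 0
    return area / c[-1] if c[-1] > 0 else float(a[0])   # normalize so the score lies in [0, 1]

# Toy usage: accuracy rises as more features are acquired, then saturates.
print(auacc(cum_costs=[0, 1, 2, 4, 8], accuracies=[0.55, 0.70, 0.82, 0.90, 0.905]))
```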
“…As an alternative approach, sensitivity analysis of trained predictors is suggested to measure the importance of each feature given a context (Early et al., 2016a; Kachuee et al., 2017; …). These approaches either require an exhaustive measurement of sensitivities or rely on approximations of sensitivity.…”
Section: Introduction (mentioning, confidence: 99%)
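As a rough illustration of the per-context sensitivity analysis mentioned in this statement, the sketch below perturbs each feature of one instance and measures how much the predictor's output distribution moves. The exhaustive finite-difference variant (rather than a gradient approximation), the toy linear predictor, and all names in it are illustrative assumptions, not the formulation of any cited paper.

```python
import numpy as np

def feature_sensitivities(predict_proba, x, eps=1e-2):
    """Exhaustive per-feature sensitivity of a trained predictor around one context x.

    predict_proba : callable mapping a feature vector to class probabilities
    x             : the current feature vector (the "context")
    eps           : perturbation size for the finite-difference estimate
    """
    base = predict_proba(x)
    sens = np.zeros(len(x))
    for i in range(len(x)):               # exhaustive: one extra model evaluation per feature
        x_pert = x.copy()
        x_pert[i] += eps
        # L1 change of the output distribution per unit change of feature i
        sens[i] = np.abs(predict_proba(x_pert) - base).sum() / eps
    return sens

# Toy usage with a hypothetical two-class linear predictor.
W = np.array([[0.8, -0.1, 0.3], [-0.8, 0.1, -0.3]])
softmax = lambda z: np.exp(z - z.max()) / np.exp(z - z.max()).sum()
print(feature_sensitivities(lambda v: softmax(W @ v), np.array([0.2, 1.0, -0.5])))
```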
“…As a result, the hidden layer learns to build a smaller representation of the input. Examples of IoT use-cases that utilized AEs include human activity recognition [113], privacy preservation in sensor data analytics [114], prediction performance improvement in sensor and wearable systems [115], botnet traffic detection [116], fault diagnosis [117], etc.…”
(mentioning, confidence: 99%)
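For readers unfamiliar with the autoencoders (AEs) this statement refers to, a minimal sketch is given below: an under-complete autoencoder whose hidden bottleneck learns a smaller representation of the input, trained to reconstruct that input. The layer sizes, optimizer, and random stand-in data are assumptions for illustration only.

```python
import torch
import torch.nn as nn

# Minimal under-complete autoencoder: the bottleneck forces the hidden layer to
# learn a smaller representation of the input. Sizes and data are illustrative.
class AutoEncoder(nn.Module):
    def __init__(self, n_features=64, n_hidden=8):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_features, n_hidden), nn.ReLU())
        self.decoder = nn.Linear(n_hidden, n_features)

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = AutoEncoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

x = torch.randn(256, 64)            # stand-in for, e.g., sensor or wearable readings
for _ in range(100):                # train to reconstruct the input through the bottleneck
    optimizer.zero_grad()
    loss = loss_fn(model(x), x)
    loss.backward()
    optimizer.step()

codes = model.encoder(x)            # the learned 8-dimensional representation
```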