Energy efficiency and positional accuracy are often contradictory goals. We propose to reduce power consumption without significantly sacrificing accuracy by developing an energy-aware localization system that adapts its sampling rate to the target's mobility level. In this paper, an energy-aware adaptive localization system based on signal-strength fingerprinting is designed, implemented, and evaluated. While satisfying an application's requirements on positional accuracy, our system adapts its sampling rate to reduce its energy consumption. The contribution of this paper is threefold. (1) We develop a model to predict the positional error of a real, working positioning engine under different mobility levels of mobile targets, estimation errors from the positioning engine, processing and networking delays in the location infrastructure, and sampling rates of location information. (2) In a real test environment, our energy-saving method addresses the mobility estimation error problem by utilizing additional sensors on mobile targets, improving prediction accuracy by as much as 37.01%. (3) We implemented our energy-saving methods inside a working localization infrastructure and conducted a performance evaluation in a real office environment. Our performance results show as much as a 49.76% reduction in power consumption.
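The core idea of the abstract above, trading sampling rate against target mobility, can be illustrated with a minimal sketch. The function below is a hypothetical policy invented for illustration (the function name, parameters, and constants are not from the paper): it lengthens the interval between localization samples when the target moves slowly and shortens it when the target moves fast, clamped to a fixed range.

```python
def sampling_interval(speed_mps: float,
                      min_s: float = 1.0,
                      max_s: float = 30.0,
                      k: float = 5.0) -> float:
    """Seconds to wait between localization samples.

    Illustrative policy only: a stationary target is sampled at the
    slowest rate, and the interval scales inversely with speed,
    clamped to [min_s, max_s].
    """
    if speed_mps <= 0.0:
        return max_s  # stationary: sample rarely to save energy
    return max(min_s, min(max_s, k / speed_mps))
```

For example, a stationary target would be sampled every 30 s under these invented constants, while a target moving at 1 m/s would be sampled every 5 s. The paper's actual model also accounts for estimation error and infrastructure delays, which this sketch omits.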
According to the theory of clothing design, the genres of clothes can be recognized from a set of visually differentiable style elements, which exhibit salient features of visual appearance and reflect high-level fashion styles, making them well suited to describing clothing genres. Instead of using less-discriminative low-level features or ambiguous keywords to identify clothing genres, we propose a novel approach for automatically classifying clothing genres based on these visually differentiable style elements. A set of style elements that are crucial for recognizing specific visual styles of clothing genres was identified based on clothing design theory. In addition, the corresponding salient visual features of each style element were identified and formulated as variables that can be computationally derived with various computer vision algorithms. To evaluate the performance of our algorithm, we built a dataset containing 3250 full-body shots crawled from popular online stores. Recognition results show that our proposed algorithms achieved promising overall precision, recall, and F-scores of 88.76%, 88.53%, and 88.64% for recognizing upperwear genres, and 88.21%, 88.17%, and 88.19% for recognizing lowerwear genres, respectively. The effectiveness of each style element and its visual features for recognizing clothing genres was demonstrated through a set of experiments involving different sets of style elements or features. In summary, our experimental results demonstrate the effectiveness of the proposed method in clothing genre recognition.
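The pipeline described above, mapping computationally derived style-element variables to a genre label, can be sketched in miniature. Everything here is invented for illustration: the element names, prototype values, and nearest-prototype rule are assumptions, whereas the paper derives its features with computer vision algorithms and its own classifier.

```python
# Each genre is described by a prototype vector of style-element
# scores in [0, 1]; an observed feature vector is assigned the genre
# whose prototype is closest in squared Euclidean distance.
GENRE_PROTOTYPES = {
    "t-shirt":     {"sleeve_length": 0.3, "collar": 0.0, "button_placket": 0.0},
    "dress_shirt": {"sleeve_length": 1.0, "collar": 1.0, "button_placket": 1.0},
}

def classify(features: dict) -> str:
    """Return the genre whose style-element prototype best matches."""
    def dist(proto):
        return sum((features[k] - proto[k]) ** 2 for k in proto)
    return min(GENRE_PROTOTYPES, key=lambda g: dist(GENRE_PROTOTYPES[g]))
```

A garment with short sleeves, no collar, and no button placket would land on the "t-shirt" prototype under this toy rule. The point of the sketch is only the structure: discriminative mid-level style elements, rather than raw low-level features, drive the genre decision.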