Computation offloading has been used and studied extensively in relation to mobile devices, because their relatively limited processing power and reliance on a battery make offloading processing- or energy-hungry tasks to a remote server, cloudlet or cloud infrastructure particularly attractive. However, the mobile device tasks that are typically offloaded are not time-critical and tend to be one-off. We argue that the concept is also practical for continuous tasks running on more powerful cyber-physical systems where timeliness is a priority. As a case study, we use the process of real-time intrusion detection on a robotic vehicle. Typically, such detection would employ lightweight statistical learning techniques that can run onboard the vehicle without severely affecting its energy consumption. We show that by offloading this task to a remote server, we can utilise approaches of much greater complexity and detection strength based on deep learning. We show both mathematically and experimentally that this allows not only greater detection accuracy, but also significant energy savings, which improve the operational autonomy of the vehicle. In addition, the overall detection latency is reduced in most of our experiments. This can be very important for vehicles and other cyber-physical systems where cyber attacks can directly affect physical safety. In fact, in some cases, the reduction in detection latency thanks to offloading is not only beneficial but necessary. An example is when detection latency onboard the vehicle would be higher than the detection period, so that a detection run cannot complete before the next one is scheduled, increasingly delaying consecutive detection decisions. Offloading to a remote server is an effective and energy-efficient solution to this problem too.
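The scheduling argument at the end of the abstract can be made concrete with a small sketch. This is an illustration of the general principle only, not the paper's model: all numbers and the function name below are assumptions. If each detection run takes longer than the detection period, runs queue up, and the decision for the k-th sample falls further and further behind schedule.

```python
def decision_delay(latency_s: float, period_s: float, k: int) -> float:
    """Extra delay (seconds) of the k-th detection decision when a new run
    is scheduled every period_s but each run takes latency_s to complete.
    If latency_s <= period_s, every run finishes before the next starts
    and the backlog is zero; otherwise it grows linearly with k."""
    backlog_per_run = max(0.0, latency_s - period_s)
    return backlog_per_run * k

# Onboard (hypothetical figures): latency exceeds the period, so the
# delay of consecutive decisions grows without bound.
print(decision_delay(latency_s=1.5, period_s=1.0, k=10))  # 5.0

# Offloaded (hypothetical figures): round-trip latency below the period,
# so every decision is produced on schedule.
print(decision_delay(latency_s=0.4, period_s=1.0, k=10))  # 0.0
```

The point is that onboard detection with latency above the period is not merely slower: it is unsustainable, since the backlog diverges, whereas any offloaded configuration whose end-to-end latency fits within the period keeps decisions on time indefinitely.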
Abstract. Cloud computing has become common practice for a wide variety of user communities. Yet, the energy efficiency and end-to-end performance benefits of cloud computing are not fully understood. Here, we focus specifically on the trade-off between local power saving and increased execution time when work is offloaded from a user's PC to a cloud environment. We have set up a 14-node private cloud and have executed a variety of applications with different processing demands. We have measured the energy cost at the level of the individual user's PC, at the level of the cloud, as well as at the two combined, contrasted with the execution time for each application when running on the PC and when running on the cloud. Our results indicate that the trade-off between energy cost and performance differs considerably between applications of different types. In most cases investigated, running an application on the cloud rather than on the PC significantly reduced the total increase in energy consumption incurred by that application. This shows that research on using cloud computing as a means to reduce the overall carbon footprint of IT is warranted. Of course, the energy gains were more pronounced for energy-selfish users, who are only interested in reducing their own carbon footprint, but these savings came at the expense of performance, with execution time increases ranging from 1% to 84% for different applications.
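The energy accounting described above can be sketched as a toy model. This is not the paper's measurement methodology: the power figures, slowdown, and function name below are made-up assumptions, chosen only to show how a run can be slower yet cheaper in total energy when the PC drops to idle power during offloaded execution.

```python
def total_energy_j(p_active_w: float, p_idle_w: float, t_run_s: float,
                   offloaded: bool, p_cloud_share_w: float = 0.0) -> float:
    """Energy (joules) attributed to one application run.
    Local run: the PC draws active power for the whole execution.
    Offloaded run: the PC idles while the cloud works; the cloud's
    marginal power share for this run is charged to it as well."""
    if offloaded:
        return (p_idle_w + p_cloud_share_w) * t_run_s
    return p_active_w * t_run_s

# Hypothetical figures: 90 W active PC, 40 W idle PC, 15 W marginal
# cloud share, and a 50% execution-time penalty when offloading.
local = total_energy_j(p_active_w=90, p_idle_w=40, t_run_s=100,
                       offloaded=False)
remote = total_energy_j(p_active_w=90, p_idle_w=40, t_run_s=150,
                        offloaded=True, p_cloud_share_w=15)
print(local, remote)  # 9000 8250
```

Under these assumed numbers the offloaded run is 50% slower yet consumes less combined energy, which mirrors the abstract's finding that the trade-off depends on each application's power and slowdown profile. An energy-selfish user would additionally drop `p_cloud_share_w` from the comparison, widening the apparent saving.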