The Internet of Things (IoT) is an innovative paradigm envisioned to support the massive range of applications that are now part of our daily lives. Millions of smart devices are deployed within complex networks to provide vibrant functionalities, including communications, monitoring, and control of critical infrastructures. However, this massive growth of IoT devices and the huge volume of data traffic they generate at the network edge place additional burdens on the state-of-the-art centralized cloud computing paradigm due to bandwidth and resource scarcity. Hence, edge computing (EC) is emerging as an innovative strategy that brings data processing and storage closer to end users, leading to what is called EC-assisted IoT. Although this paradigm provides unique features and enhanced quality of service (QoS), it also introduces significant data security and privacy risks. This paper conducts a comprehensive survey of security and privacy issues in the context of EC-assisted IoT. In particular, we first present an overview of EC-assisted IoT, including definitions, applications, architecture, advantages, and challenges. Second, we define security and privacy in the context of EC-assisted IoT. Then, we extensively discuss the major classes of attacks on EC-assisted IoT and provide possible solutions and countermeasures along with the related research efforts. After that, we further classify security and privacy issues discussed in the literature according to security services and according to security objectives and functions. Finally, several open challenges and future research directions for a secure EC-assisted IoT paradigm are discussed in detail.
Fifth-generation (5G) cellular systems are likely to operate in the centimeter-wave (3-30 GHz) and millimeter-wave (30-300 GHz) frequency bands, where a vast amount of underutilized bandwidth exists worldwide. To assist in the research and development of these emerging wireless systems, a myriad of measurement studies have been conducted to characterize path loss in urban environments at these frequencies. The standard theoretical free space (FS) and Stanford University Interim (SUI) empirical path loss models were recently modified, using simple correction factors, to fit path loss models obtained from measurements performed at 28 GHz and 38 GHz. In this paper, we provide similar correction factors for models at 60 GHz and 73 GHz. By applying slope correction factors to the FS and SUI path loss models so that they closely match the close-in (CI) free space reference distance path loss models, millimeter-wave path loss can be accurately estimated (with popular models) for 5G cellular planning at 60 GHz and 73 GHz. Additionally, new millimeter-wave beam-combining path loss models are provided at 28 GHz and 73 GHz by considering the simultaneous combination of signals from multiple antenna pointing directions between the transmitter and receiver that result in the strongest received power. Such directional channel models are important for future adaptive array systems at millimeter-wave frequencies.
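For concreteness, the CI free space reference distance model referenced above takes the well-known form PL(d) = FSPL(d0) + 10·n·log10(d/d0) + X_sigma, with reference distance d0 = 1 m, path loss exponent (PLE) n, and zero-mean Gaussian shadowing X_sigma (dB). The following is a minimal Python sketch of that formula; the frequency and PLE values in the example are illustrative assumptions, not parameters reported in the paper.

```python
import numpy as np

def fspl_d0_db(f_hz, d0_m=1.0):
    """Free-space path loss (dB) at the reference distance d0 (default 1 m)."""
    c = 3e8  # speed of light, m/s
    return 20 * np.log10(4 * np.pi * d0_m * f_hz / c)

def ci_path_loss_db(f_hz, d_m, ple, sigma_db=0.0, rng=None):
    """CI model: PL(d) = FSPL(d0) + 10*n*log10(d/d0) + X_sigma, with d0 = 1 m."""
    shadowing = rng.normal(0.0, sigma_db) if rng is not None else 0.0
    return fspl_d0_db(f_hz) + 10 * ple * np.log10(d_m) + shadowing

# Illustrative only: a 73 GHz link at 100 m with an assumed LOS PLE of 2.0.
print(round(ci_path_loss_db(73e9, 100.0, ple=2.0), 1))  # ~109.7 dB
```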
Next generation wireless networks are expected to be extremely complex due to their massive heterogeneity in terms of the types of network architectures they incorporate, the types and numbers of smart IoT devices they serve, and the types of emerging applications they support. In such large-scale and heterogeneous networks (HetNets), radio resource allocation and management (RRAM) becomes one of the major challenges encountered during system design and deployment. In this context, emerging Deep Reinforcement Learning (DRL) techniques are expected to be one of the main enabling technologies for addressing RRAM in future wireless HetNets. In this paper, we conduct a systematic, in-depth, and comprehensive survey of the applications of DRL techniques in RRAM for next generation wireless networks. Towards this, we first overview the existing traditional RRAM methods and identify the limitations that motivate the use of DRL techniques in RRAM. Then, we provide a comprehensive review of the most widely used DRL algorithms for addressing RRAM problems, including value- and policy-based algorithms. The advantages, limitations, and use cases of each algorithm are provided. We then conduct a comprehensive and in-depth literature review and classify existing related works based on both the radio resources they address and the type of wireless networks they investigate. To this end, we carefully identify the types of DRL algorithms utilized in each related work, the elements of these algorithms, and the main findings of each work. Finally, we highlight important open challenges and provide insights into several future research directions in the context of DRL-based RRAM. This survey is intentionally designed to guide and stimulate more research endeavors towards building efficient and fine-grained DRL-based RRAM schemes for future wireless networks.
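To make the value-based family concrete, the sketch below shows tabular Q-learning, the classical precursor of the DQN-style value-based DRL methods such surveys review, applied to a toy channel-allocation task. The environment, reward shaping, and all parameter values are hypothetical illustrations chosen for this example and are not taken from any surveyed work.

```python
import numpy as np

# Toy RRAM setup: states index interference conditions, actions index channels.
n_states, n_actions = 4, 3
Q = np.zeros((n_states, n_actions))
alpha, gamma, eps = 0.1, 0.9, 0.1  # learning rate, discount, exploration
rng = np.random.default_rng(0)

def step(state, action):
    # Hypothetical environment: reward is higher when the chosen channel
    # "matches" the current interference state; the next state is random.
    reward = 1.0 if action == state % n_actions else -0.1
    return int(rng.integers(n_states)), reward

state = 0
for _ in range(5000):
    # epsilon-greedy action selection over the learned Q-values.
    if rng.random() < eps:
        action = int(rng.integers(n_actions))
    else:
        action = int(np.argmax(Q[state]))
    next_state, reward = step(state, action)
    # Q-learning temporal-difference update.
    Q[state, action] += alpha * (reward + gamma * Q[next_state].max() - Q[state, action])
    state = next_state

print(np.argmax(Q, axis=1))  # learned channel choice per interference state
```

Deep value-based methods (e.g., DQN) replace the Q-table with a neural network so the same update generalizes to the large state spaces found in real HetNets.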
This paper presents analytical and empirical data documenting the effects of solar radio emissions on outdoor propagation path loss in the 60 GHz band. Both line-of-sight (LOS) and non-LOS scenarios were considered. The setup used in the empirical studies emulates future fifth-generation cellular systems for both access and backhaul services, as well as for device-to-device communications. Based on the measurement data collected in sunny weather with intense solar activity, we developed large-scale propagation path loss models at 60 GHz and observed the effects of solar radio emissions on the path loss data. It is shown that solar radio emission can decrease the carrier-to-noise ratio, and that this translates into a corresponding increase in the path loss exponent (PLE) values of the large-scale propagation path loss channel models. Empirical data show that 9.0%-15.6% higher PLE values were observed in hot and sunny weather during the day (41°-42°C) compared with counterpart measurements taken at night in cool and clear weather (20°-38°C). This translates into a corresponding decrease in 60 GHz radio coverage in hot and sunny weather during the day. The empirical data are closely corroborated by the analytical estimates presented.

Index Terms: Fifth-generation (5G) cellular communications, path loss models, propagation measurements at 60 GHz, solar radio noise.

I. INTRODUCTION

Millimeter-wave (mmWave) spectrum, 30-300 GHz, will play a key role in fifth-generation (5G) cellular networks, which aim to provide multigigabit-per-second (Gb/s) data rates over wireless links. Experts have been exploring the prospects of the 28, 38, 60, and 73 GHz mmWave bands for 5G systems. The unlicensed spectrum at 60 GHz offers 10-100 times more spectrum than what is available today.
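As a rough illustration of how a PLE is extracted from measurement data of this kind, the sketch below fits the CI-model PLE with a closed-form least-squares (minimum mean square error) solution over (distance, path loss) pairs. All numbers are synthetic values made up for the example; they are not the paper's measurement data.

```python
import numpy as np

def fit_ple(d_m, pl_db, fspl_1m_db):
    """Closed-form least-squares fit of the PLE n in
    PL(d) = FSPL(1 m) + 10*n*log10(d)."""
    a = 10 * np.log10(d_m)   # model slope term
    b = pl_db - fspl_1m_db   # measured excess loss over FSPL at 1 m
    return float(a @ b / (a @ a))

# Synthetic example at 60 GHz, where FSPL(1 m) is about 68 dB.
rng = np.random.default_rng(1)
d = np.array([10.0, 30.0, 60.0, 100.0, 150.0])
pl = 68.0 + 10 * 2.3 * np.log10(d) + rng.normal(0.0, 1.0, d.size)
print(round(fit_ple(d, pl, fspl_1m_db=68.0), 2))  # recovers ~2.3
```

Under this fit, the paper's reported 9.0%-15.6% daytime increase in PLE corresponds directly to a steeper slope of measured path loss versus log-distance, and hence reduced coverage range.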