Smart, secure and energy-efficient data collection (DC) processes are key to realizing the full potential of future Internet of Things (FIoT)-based systems. Challenges in this domain have motivated research efforts toward providing cognitive solutions for IoT. One such solution, termed cognitive sensing (CS), describes the use of smart sensors to intelligently perceive inputs from the environment. CS has been proposed for use in FIoT to facilitate smart, secure and energy-efficient data collection processes. In this article, we provide a survey of the Artificial Intelligence (AI)-based techniques used over the last decade to provide cognitive sensing solutions for different FIoT applications. We present state-of-the-art approaches, potentials, and challenges of AI techniques for the identified solutions. This survey contributes to a better understanding of the AI techniques deployed for cognitive sensing in FIoT, as well as future research directions in this regard.
This study uses a probabilistic load forecasting technique to predict the load demand pattern in Ogun State for the year 2018. Energy consumption data for Ogun State for the years 2016 and 2017 were obtained from the regional headquarters of the Ibadan Electricity Distribution Company (IBEDC), Abeokuta. The results show a 2.68% probability that energy consumption in Ogun State will rise above 98,469.40 MWh. Similarly, the probability that energy consumed in the state will fall below 46,494.68 MWh within the next few months is 5.98%. The probability that energy consumption in 2018 will fall between 46,494.68 MWh and 98,469.40 MWh is 91.84%. Energy consumption in 2018 will mostly fall between 63,500 MWh and 86,000 MWh. The results also indicate that consumption in 2018 is most likely to fall between 72,500 MWh and 77,000 MWh, with a probability of 15.34%. It is unlikely to fall between 45,000 MWh and 50,000 MWh or between 95,000 MWh and 99,500 MWh, these ranges having probabilities of 0.19% and 2.99% respectively. The results of this study are useful to IBEDC for its operational planning and control activities.
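Interval probabilities of this kind can be reproduced, under a simple normality assumption, by fitting a distribution to historical monthly consumption and evaluating its CDF at the thresholds. The monthly figures below are hypothetical placeholders, not the actual IBEDC data, and the normal model is an illustrative assumption; the study's exact probabilistic method is not reproduced here.

```python
import statistics

# Hypothetical monthly consumption figures in MWh (illustrative only;
# the actual 2016-2017 IBEDC data is not reproduced here).
monthly_mwh = [62_000, 68_500, 71_200, 74_800, 73_100, 70_400,
               69_900, 72_600, 75_300, 78_100, 76_400, 80_200]

# Fit a normal distribution to the sample (mean and sample stdev).
dist = statistics.NormalDist.from_samples(monthly_mwh)

lower, upper = 46_494.68, 98_469.40
p_below = dist.cdf(lower)                     # P(X < lower)
p_above = 1.0 - dist.cdf(upper)               # P(X > upper)
p_between = dist.cdf(upper) - dist.cdf(lower) # P(lower <= X <= upper)

print(f"P(below {lower} MWh):  {p_below:.4f}")
print(f"P(above {upper} MWh):  {p_above:.4f}")
print(f"P(within the interval): {p_between:.4f}")
```

The same pattern extends to the narrower bins reported in the abstract (e.g. 72,500-77,000 MWh) by evaluating the CDF at each bin edge.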
This work uses NS3 simulation to study the effect of mobility speed on the performance of three handover algorithms in Long Term Evolution (LTE) networks. A realistic multi-cell LTE network was set up in the NS3 simulation software. Mobility models were used to vary the location of the User Equipment (UE), thereby triggering handover events across the network. Performance was measured using the Signal-to-Interference-plus-Noise Ratio (SINR) and the number of completed handovers. Results revealed that at speeds in the range 0-3 km/h the Integrative algorithm performed best, while at 4-60 km/h the A3RSRP algorithm performed best, with an average value of 95 dB. At speeds in the range 60-120 km/h, the Integrative algorithm performed slightly better than A3RSRP, and at speeds above 120 km/h the Integrative algorithm again performed best, with an SINR of 120 dB. In terms of completed handovers, the Integrative algorithm had the fewest completed handovers across the entire range of considered speeds. Thus, we establish that mobility speed has a significant effect on the performance of handover algorithms.

The Long-Term Evolution (LTE) system was designed with the aim of providing higher data rates and lower latency under various mobility conditions (Dimou et al., 2009). According to 3GPP TR 25.913, the LTE system is expected to provide mobility support for User Equipment (UE) at speeds of up to 500 km/h while maintaining uninterrupted provision of high data rates and services. Mobility at high speed has always been a challenge in wireless networks, and LTE was designed to overcome this challenge. To accomplish this, LTE must minimize delay and packet loss in voice transmission and ensure reliability in data transmission in high-speed scenarios.
In light of this, optimizing the handover procedure to achieve the required performance is considered an important issue in mobile networks (Hämäläinen, 2011). An LTE handover is a process that transfers a UE from one evolved NodeB (eNodeB) to another, or from one sector to another within the same eNodeB, due to perceived better cell coverage from the target eNodeB (Lin et al., 2011a). This is achieved by analyzing periodic or event-triggered downlink received signal strength (RSS) and carrier-to-interference ratio (CIR) measurements from the UEs. Based on the received parameters, the eNodeB then decides whether to hand the UE over to the neighboring eNodeB or keep the UE connected to itself. The decision-making process is controlled by the handover algorithm; an efficient algorithm enhances system capacity and the cost-effectiveness of service quality. The performance of the LTE handover scheme depends largely on the handover algorithm in use (Hans et al., 2014). Consequently, researchers have channeled efforts toward optimizing existing algorithms, while new ones have also been developed. Three of the numerous algorithms that have become popular in LTE networks include (i) Power Budget Handover ...