The ever-growing Internet of Things (IoT) data traffic is one of the primary research focuses of future mobile networks. 3rd Generation Partnership Project (3GPP) standards such as Long Term Evolution-Advanced (LTE-A) were designed for broadband services, whereas IoT devices mostly run narrowband applications, so standards like LTE-A may not utilize spectrum efficiently when serving IoT traffic. Aggregating IoT data at an intermediate node before transmission can address this spectral-efficiency problem. The objective of this work is to use the low-cost 3GPP fixed, inband, layer-3 Relay Node (RN) to integrate IoT traffic into the 5G network by multiplexing data packets at the RN and forwarding them to the Base Station (BS) as large multiplexed packets. With this method, frequency resource blocks can be shared among several devices. An analytical model of the scheme, developed as an r-stage Coxian process, determines the radio resource utilization and the system gain achieved. The model is validated by comparing its results with simulation results.
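The multiplexing idea described above can be illustrated with a minimal sketch: small narrowband packets are buffered at the relay and emitted as one large packet once the aggregate fills. All names, thresholds, and the record format (2-byte device ID, 2-byte length, payload) are illustrative assumptions, not details taken from the paper.

```python
from dataclasses import dataclass, field

@dataclass
class RelayMultiplexer:
    """Sketch of packet aggregation at the RN: buffer small IoT packets
    and emit one large multiplexed uplink packet. Thresholds are illustrative."""
    max_payload: int = 1500          # bytes per multiplexed packet (assumed MTU)
    buffer: list = field(default_factory=list)
    buffered_bytes: int = 0

    def enqueue(self, device_id: int, payload: bytes):
        """Buffer one narrowband packet; flush once the aggregate fills."""
        self.buffer.append((device_id, payload))
        self.buffered_bytes += len(payload)
        if self.buffered_bytes >= self.max_payload:
            return self.flush()
        return None                  # still accumulating

    def flush(self) -> bytes:
        """Concatenate (device_id, length, payload) records into one packet."""
        muxed = b"".join(
            did.to_bytes(2, "big") + len(p).to_bytes(2, "big") + p
            for did, p in self.buffer
        )
        self.buffer.clear()
        self.buffered_bytes = 0
        return muxed
```

In practice the relay would also flush on a timer so delay-sensitive packets are not held back indefinitely; the size-only trigger here keeps the sketch short.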
Artificial Intelligence (AI) and Machine Learning (ML) are envisaged to play key roles in 5G networks. Efficient radio resource management is of paramount importance for network operators. With the advent of newer technologies, infrastructure, and plans, spending significant radio resources on estimating channel conditions in mobile networks poses a challenge. Automating the prediction of channel conditions can use these resources more efficiently. To this end, we propose an ML-based technique, an Artificial Neural Network (ANN), for predicting the Signal-to-Interference-and-Noise Ratio (SINR) in order to reduce radio resource usage in mobile networks. Radio resource scheduling is generally based on estimated channel conditions, i.e., the SINR obtained with the help of Sounding Reference Signals (SRS). The proposed Nonlinear AutoRegressive with eXogenous inputs (NARX)-based ANN aims to minimize the rate at which SRS are sent and achieves a prediction accuracy of R = 0.87. This can vacate up to 4% of the spectrum, improving bandwidth efficiency and decreasing uplink power consumption.
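The core of a NARX model is predicting the next output from lagged outputs and lagged exogenous inputs. The sketch below builds that lagged design matrix and fits a simple linear least-squares predictor as a stand-in for the paper's ANN; the SINR trace, the driving signal, and the lag orders are synthetic assumptions for illustration only.

```python
import numpy as np

def build_narx_features(y, u, ny=3, nu=2):
    """Design matrix for a NARX predictor:
    predict y[t] from y[t-1..t-ny] and exogenous inputs u[t-1..t-nu]."""
    start = max(ny, nu)
    rows, targets = [], []
    for t in range(start, len(y)):
        rows.append(np.concatenate([y[t - ny:t][::-1], u[t - nu:t][::-1]]))
        targets.append(y[t])
    return np.array(rows), np.array(targets)

# Synthetic SINR trace (dB) driven by an exogenous signal -- illustrative only.
rng = np.random.default_rng(0)
u = rng.normal(size=300)
y = np.zeros(300)
for t in range(1, 300):
    y[t] = 0.8 * y[t - 1] + 0.5 * u[t - 1] + 0.05 * rng.normal()

X, Y = build_narx_features(y, u)
w, *_ = np.linalg.lstsq(X, Y, rcond=None)   # linear stand-in for the ANN
pred = X @ w
r = np.corrcoef(pred, Y)[0, 1]              # correlation, analogous to the R metric
```

A trained network of this form lets the scheduler skip some SRS transmissions and substitute predicted SINR values, which is the resource saving the abstract quantifies.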
Long-Term Evolution (LTE) employs a hard handover procedure. To reduce the interruption of data flow, downlink data is forwarded from the serving eNodeB (eNB) to the target eNB during handover. In cellular networks, unbalanced loads may lead to congestion in both the radio network and the backhaul network, resulting in poor end-to-end performance and unfairness among the users sharing the bottleneck link. This work focuses on congestion in the transport network. Handovers toward less loaded cells can help redistribute the load on the bottleneck link; such a mechanism is known as load balancing. The results show that introducing such a handover mechanism into the simulation environment improves system performance, because terminals spend more time in cells that offer better reception. Load balancing can thus further improve the performance of cellular systems experiencing congestion on a bottleneck link due to uneven load.
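A load-balancing handover decision of the kind described above can be sketched as: among neighbor cells whose signal is adequate, hand over to the least-loaded one, but only if its load is meaningfully below the serving cell's. The thresholds, field names, and use of RSRP as the signal metric are assumptions for illustration, not the paper's actual algorithm.

```python
def choose_handover_target(serving_load, neighbors,
                           load_margin=0.1, min_rsrp=-110.0):
    """Load-balancing handover sketch (illustrative thresholds):
    pick the least-loaded neighbor with adequate signal whose load is
    below the serving cell's load by at least `load_margin`; otherwise
    stay on the serving cell (return None)."""
    candidates = [n for n in neighbors
                  if n["rsrp_dbm"] >= min_rsrp
                  and n["load"] <= serving_load - load_margin]
    if not candidates:
        return None
    return min(candidates, key=lambda n: n["load"])["cell_id"]

# Example: a heavily loaded serving cell (80%) offloads to cell 2;
# a lightly loaded one keeps its users in place.
neighbors = [
    {"cell_id": 2, "rsrp_dbm": -100.0, "load": 0.4},
    {"cell_id": 3, "rsrp_dbm":  -95.0, "load": 0.7},
    {"cell_id": 4, "rsrp_dbm": -120.0, "load": 0.1},  # too weak, excluded
]
```

The load margin implements hysteresis: without it, users near a cell edge would ping-pong between cells as the measured loads fluctuate.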