In cellular wireless communication systems, channel estimation (CE) is one of the key techniques used in Orthogonal Frequency Division Multiplexing (OFDM) modulation. The most common methods are decision-directed channel estimation, pilot-assisted channel estimation (PACE), and blind channel estimation. Among them, PACE is the most widely used and offers the steadiest performance. Applying deep learning (DL) methods to CE has attracted increasing research interest over the past three years. The main objective of this paper is to assess the efficiency of DL-based CE compared with conventional PACE techniques, namely the least-squares (LS) and minimum mean-square error (MMSE) estimators. A simulation environment is used to evaluate OFDM performance under different channel models, and a DL process that learns the channel from training data is employed to obtain the estimated channel impulse response. Two channel models are used in the comparison: the Tapped Delay Line (TDL) and Clustered Delay Line (CDL) models. Performance is evaluated under different parameters, including the number of pilots (64 or 8), the number of subcarriers (64), the cyclic prefix length (16 or 0 samples), and the carrier frequency (4 GHz), through computer simulation in MATLAB. The simulation results show that the trained DL estimator outperforms the LS and MMSE estimators in estimating the channel and detecting the transmitted symbols, although the complexity of the proposed LSTM estimator exceeds that of the LS estimator. Furthermore, the DL estimator remains effective across different pilot densities and cyclic prefix lengths.
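As a reference point for the two conventional baselines, the following is a minimal NumPy sketch of per-subcarrier LS estimation and linear MMSE smoothing. It assumes unit-power pilot symbols (so that X X^H = I) and an illustrative exponential channel-correlation model rather than the paper's TDL/CDL profiles; the paper's own MATLAB setup is not reproduced here.

```python
import numpy as np

def ls_estimate(y_pilot, x_pilot):
    """Per-subcarrier least-squares estimate: h_LS[k] = Y[k] / X[k]."""
    return y_pilot / x_pilot

def mmse_estimate(h_ls, R_hh, snr):
    """Linear MMSE smoothing of the LS estimate (unit-power pilots):
    h_MMSE = R_hh (R_hh + (1/SNR) I)^{-1} h_LS."""
    n = R_hh.shape[0]
    W = R_hh @ np.linalg.inv(R_hh + (1.0 / snr) * np.eye(n))
    return W @ h_ls

# Toy usage with 64 pilot subcarriers.
n_pilots, snr = 64, 10.0
rng = np.random.default_rng(0)
idx = np.arange(n_pilots)
# Illustrative exponential correlation model (an assumption, not the paper's channel).
R_hh = np.exp(-0.5 * np.abs(idx[:, None] - idx[None, :]))
h_true = np.linalg.cholesky(R_hh) @ (
    (rng.normal(size=n_pilots) + 1j * rng.normal(size=n_pilots)) / np.sqrt(2))
x = np.exp(1j * rng.uniform(0, 2 * np.pi, n_pilots))   # unit-power pilots
noise = (rng.normal(size=n_pilots) + 1j * rng.normal(size=n_pilots)) / np.sqrt(2 * snr)
y = x * h_true + noise

h_ls = ls_estimate(y, x)
h_mmse = mmse_estimate(h_ls, R_hh, snr)
print("LS MSE:  ", np.mean(np.abs(h_ls - h_true) ** 2))
print("MMSE MSE:", np.mean(np.abs(h_mmse - h_true) ** 2))
```

Because the MMSE estimator exploits the channel correlation and noise statistics that LS ignores, it achieves a lower mean-square error at the cost of a matrix inversion, which is the complexity trade-off the abstract refers to.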
The Internet of Things (IoT) provides a promising opportunity to build powerful systems and applications. Security is the main concern in IoT applications, because the exchanged data are private while IoT devices (sensors/actuators) have limited resources. In this paper, we present a classification of IoT modes of operation based on the distribution of IoT devices, their connectivity to the internet, and the typical field of application. We find that the majority of IoT services can be classified into one of four modes: gateway, device-to-device, collaborative, and centralized. The management of public or symmetric keys is essential to providing security. In the present paper, we survey key management protocols for IoT and organize them in a mapping table that links each mode of operation to the associated key-management elements. The main goal of this mapping table is to help designers select the security technique that best balances the required security level against the constraints of the IoT system's mode.
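To make the shape of such a mapping table concrete, here is a minimal sketch: the four mode names come from the abstract, but the key-management entries are illustrative placeholders, not the protocols actually surveyed in the paper.

```python
# Hypothetical sketch of a mode-to-key-management lookup table. The mode names
# are taken from the abstract; the entries are generic placeholder families,
# not the survey's actual results.
KEY_MANAGEMENT_MAP = {
    "gateway":          ["pre-shared symmetric keys", "gateway-mediated key distribution"],
    "device-to-device": ["lightweight pairwise key agreement"],
    "collaborative":    ["group key management"],
    "centralized":      ["public-key infrastructure", "centralized key server"],
}

def candidate_schemes(mode: str) -> list[str]:
    """Return the candidate key-management families for a given IoT mode."""
    return KEY_MANAGEMENT_MAP.get(mode, [])

print(candidate_schemes("device-to-device"))
```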
In this paper, a system for LTE Cell Outage Compensation (COC) based on hybrid Genetic Algorithms (GA) and Artificial Neural Networks (ANN) is proposed. COC aims to minimize the impact of a cell outage, which reduces operator revenue and/or customer satisfaction. The proposed system adopts an optimization module that searches for optimal settings of a set of LTE operational parameters to achieve a target set of key performance indicators (KPIs). The optimization process reliably reaches sufficiently good solutions, but it requires a large number of trials. Therefore, a large set of outage scenarios is collected together with the optimal parameter settings found by the optimization module, and these pairs are used to train an ANN module that acts as an expert able to respond optimally to different situations in real time. A simulation environment is set up to evaluate different LTE measures and KPIs under various outage scenarios. Simulation results demonstrate the capability and robustness of the proposed system in minimizing the number of users experiencing outage. They also show that the proposed system achieves optimal parameter settings with minimal processing time and without degrading overall system performance, while having a significant positive impact on LTE performance.
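The offline-GA/online-ANN split described above can be sketched as follows. This is a toy reconstruction under stated assumptions: a synthetic fitness function stands in for the LTE KPI evaluation, random 4-dimensional vectors stand in for real outage-scenario descriptors, and scikit-learn's MLPRegressor plays the role of the ANN module.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def fitness(params, scenario):
    # Hypothetical stand-in for the paper's KPI evaluation: the GA only
    # sees a scalar score, not the hidden optimum used here for demonstration.
    target = 0.5 * scenario
    return -np.sum((params - target) ** 2)

def ga_optimize(scenario, dim=4, pop=40, gens=60, rng=None):
    if rng is None:
        rng = np.random.default_rng(0)
    population = rng.uniform(-1, 1, size=(pop, dim))
    for _ in range(gens):
        scores = np.array([fitness(p, scenario) for p in population])
        parents = population[np.argsort(scores)[-pop // 2:]]        # selection (top half)
        i = rng.integers(0, len(parents), size=(pop - len(parents), 2))
        mask = rng.random((pop - len(parents), dim)) < 0.5          # uniform crossover
        children = np.where(mask, parents[i[:, 0]], parents[i[:, 1]])
        children += rng.normal(0, 0.05, children.shape)             # mutation
        population = np.vstack([parents, children])
    scores = np.array([fitness(p, scenario) for p in population])
    return population[np.argmax(scores)]

# Offline phase: collect (scenario, optimal parameters) pairs from the GA ...
rng = np.random.default_rng(1)
scenarios = rng.uniform(-1, 1, size=(200, 4))
optima = np.array([ga_optimize(s, rng=rng) for s in scenarios])

# ... and train an ANN to reproduce the GA's answers instantly at run time.
ann = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000).fit(scenarios, optima)
new_outage = rng.uniform(-1, 1, size=(1, 4))
print(ann.predict(new_outage))  # near-optimal settings without a new GA run
```

The design rationale is that the expensive many-trial GA search happens offline, while the trained network answers new outage scenarios in a single forward pass, which is what enables the real-time operation claimed in the abstract.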
In this work, an aluminum oxide nanocoating was prepared using the pulsed laser deposition technique to study the coating's properties and to find the optimal conditions for achieving the highest-quality aluminum oxide nanocoating. The structural properties were studied using X-ray diffraction; the results showed that the aluminum oxide nanocoatings had an alpha-phase polycrystalline structure. The surface topography, studied using atomic force microscopy, showed an average surface roughness ranging from 1.26 nm to 7 nm. The optical properties, studied using a UV-VIS spectrometer, showed an energy gap in the range of 3.98 eV to 4.09 eV. The hardness of the aluminum oxide nanocoatings was measured using the nanoindentation technique and found to lie within the range of 10.41 GPa to 32.79 GPa. In the present work, the effects of the input parameters (pulse energy and number of pulses) on the responses (energy gap, hardness, and surface roughness) were studied. The experiments were designed based on the L9 orthogonal array using the Taguchi approach, and a multiple-response optimization of the Taguchi design was performed using the desirability function.
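As a rough illustration of this optimization step, the sketch below builds the standard Taguchi L9 orthogonal array for two three-level factors and combines responses with Derringer-Suich desirability functions into a single composite score. The factor levels and run-level responses are placeholders (only the reported ranges are taken from the abstract), and treating the band gap as larger-is-better is an assumption made for the example.

```python
import numpy as np

# Standard L9 orthogonal array (two of its four 3-level columns used here).
L9 = np.array([[0, 0], [0, 1], [0, 2], [1, 0], [1, 1],
               [1, 2], [2, 0], [2, 1], [2, 2]])

# Hypothetical factor levels; the paper's actual settings are not given here.
pulse_energy_mJ = [100, 200, 300]
num_pulses = [100, 300, 500]

def d_larger(y, low, high):
    # "Larger-is-better" desirability: 0 at low, 1 at high.
    return np.clip((y - low) / (high - low), 0.0, 1.0)

def d_smaller(y, low, high):
    # "Smaller-is-better" desirability: 1 at low, 0 at high.
    return np.clip((high - y) / (high - low), 0.0, 1.0)

# Placeholder responses for the nine runs (only the ranges match the abstract).
hardness = np.linspace(10.41, 32.79, 9)   # GPa, larger is better
roughness = np.linspace(7.0, 1.26, 9)     # nm, smaller is better
band_gap = np.linspace(3.98, 4.09, 9)     # eV, assumed larger is better

d = np.vstack([
    d_larger(hardness, 10.41, 32.79),
    d_smaller(roughness, 1.26, 7.0),
    d_larger(band_gap, 3.98, 4.09),
])
overall = d.prod(axis=0) ** (1 / 3)       # geometric-mean composite desirability
best = int(np.argmax(overall))
print(f"best run {best}: energy={pulse_energy_mJ[L9[best, 0]]} mJ, "
      f"pulses={num_pulses[L9[best, 1]]}, D={overall[best]:.2f}")
```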