<p>In this work, we investigate short-packet communications for multiple-input multiple-output underlay cognitive multihop relay Internet-of-Things (IoT) networks with multiple primary users, where IoT devices transmit and receive short packets to provide ultra-reliable low-latency communication (URLLC). For performance evaluation, closed-form expressions of the end-to-end (E2E) block error rate (BLER) for the considered systems are derived in a practical scenario under imperfect channel state information of the interference channels, from which the E2E throughput, energy efficiency (EE), latency, and reliability are also studied, along with an asymptotic analysis. Based on the analytical results, we adapt several state-of-the-art machine learning (ML)-aided estimators to predict the system performance in terms of the E2E throughput, EE, latency, and reliability for real-time configurations in IoT systems. We also obtain closed-form expressions for the optimal power-allocation and relay-location strategies that minimize the asymptotic E2E BLER under proportional tolerable-interference-power and URLLC constraints; these strategies require negligible computational complexity and offer significant power savings. Furthermore, the ML-based evaluation achieves equivalent accuracy while significantly reducing execution time compared with conventional analytical and simulation methods. Among the ML frameworks, the extreme gradient boosting model is demonstrated to be the most efficient estimator for future practical IoT applications.</p>
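The ML-aided performance estimation described in the abstract can be sketched as a regression problem: train a boosted-tree model on (system parameters, analytical performance) pairs, then predict performance for new configurations at negligible cost. The sketch below is illustrative only: it uses scikit-learn's `GradientBoostingRegressor` as a stand-in for XGBoost, and the feature set (SNR, hop count, blocklength, interference threshold) and the synthetic throughput-like target are assumptions, not the paper's actual training data, which would come from the derived closed-form expressions.

```python
# Illustrative sketch of ML-aided performance prediction (not the paper's code).
# GradientBoostingRegressor stands in for XGBoost; features and target are synthetic.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Hypothetical system parameters: transmit SNR (dB), number of hops,
# blocklength (channel uses), and tolerable interference threshold (dB).
n = 2000
X = np.column_stack([
    rng.uniform(0, 30, n),      # SNR
    rng.integers(2, 6, n),      # hops
    rng.integers(100, 500, n),  # blocklength
    rng.uniform(-10, 10, n),    # interference threshold
])

# Synthetic E2E-throughput-like target; a real training set would be generated
# from the closed-form analytical expressions instead.
y = np.log1p(X[:, 0]) / X[:, 1] + 0.001 * X[:, 2] + rng.normal(0, 0.05, n)

model = GradientBoostingRegressor(n_estimators=200, max_depth=3, random_state=0)
model.fit(X[:1600], y[:1600])
pred = model.predict(X[1600:])
rmse = float(np.sqrt(np.mean((pred - y[1600:]) ** 2)))
print(f"validation RMSE: {rmse:.3f}")
```

Once trained, a single `predict` call replaces evaluating the analytical expressions or running a Monte Carlo simulation, which is the source of the execution-time savings the abstract reports.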
This work investigates a data-driven approach to detecting the number of incoming signals for a lens antenna array (LAA). First, the energy-focusing property of an electromagnetic (EM) lens is exploited to generate an input spectrum that can be used to enumerate both multipath and independent signals. Next, we present a deep learning (DL)-assisted sharp-peak recognition method, referred to as the power-spectrum-based convolutional neural network (PSCNet). Unlike classical techniques, such as constant false alarm rate (CFAR) detection, this data-driven detector counts received signals adaptively from the LAA power spectrum without requiring any initial configuration. In addition, PSCNet outperforms other state-of-the-art subspace-based detectors, even under challenging conditions such as a low signal-to-noise ratio (SNR), a small observation size, and angular ambiguity. For the training phase, we propose a pretrained-model reusing strategy and an input pre-processing approach, referred to as power spectrum shortening (PSS), to alleviate the training burden and achieve lower complexity than fully retraining all isolated networks. The simulation results demonstrate that our proposed sharp-peak recognition algorithm not only achieves improved signal-enumeration performance but also requires fewer computational resources than other subspace-based approaches.

INDEX TERMS Signal enumeration; lens antenna array (LAA); convolutional neural network (CNN); signal power spectrum.
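The energy-focusing idea behind the LAA input spectrum can be illustrated with a toy model: the EM lens concentrates each incoming plane wave's power around a small group of antenna elements, so the per-element power spectrum shows one sharp peak per signal. The sketch below is a minimal assumption-laden illustration, not the paper's system model: it models the focusing response as sinc², maps angles to elements linearly, and replaces the learned PSCNet recognizer with naive thresholded local-maximum counting.

```python
# Toy illustration of LAA energy focusing and peak-based signal enumeration.
# The sinc^2 response, angle-to-element mapping, and threshold are assumptions.
import numpy as np

def laa_power_spectrum(angles_deg, n_elems=64, snr_db=10.0, seed=0):
    """Synthetic per-element received power for signals at the given angles."""
    rng = np.random.default_rng(seed)
    idx = np.arange(n_elems)
    spectrum = np.zeros(n_elems)
    for a in angles_deg:
        # Map the angle of arrival in [-90, 90] degrees to an element index.
        center = (a + 90.0) / 180.0 * (n_elems - 1)
        # Lens focusing response: power concentrated near the mapped element.
        spectrum += np.sinc(0.5 * (idx - center)) ** 2
    noise_power = 10 ** (-snr_db / 10.0)
    return spectrum + noise_power * rng.random(n_elems)

def count_peaks(spectrum, threshold=0.5):
    """Count local maxima above a threshold (crude stand-in for PSCNet)."""
    s = spectrum
    is_peak = (s[1:-1] > s[:-2]) & (s[1:-1] > s[2:]) & (s[1:-1] > threshold)
    return int(np.count_nonzero(is_peak))

spec = laa_power_spectrum([-40.0, 5.0, 50.0])
print("estimated number of signals:", count_peaks(spec))  # three sharp peaks
```

The fixed threshold is exactly what makes such hand-tuned detectors brittle at low SNR or with closely spaced angles; PSCNet's contribution, per the abstract, is learning the peak-recognition step so no such initial configuration is needed.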