Wireless sensor networks (WSNs) are an emerging technology used in many applications in both the civilian and military domains. Typically, these networks are deployed in remote and hostile environments, leaving them vulnerable to various kinds of security attacks, of which Sybil attacks are among the most harmful. It is therefore necessary to reconcile sensor node resource constraints with the need for high WSN security. This paper proposes an energy trust system (ETS) for WSNs to effectively detect Sybil attacks. It employs multi-level detection based on identity and position verification, followed by a trust algorithm based on the energy of each sensor node. Data aggregation is also utilized to reduce communication overhead and save energy. We analyze the performance of the proposed system in terms of security and resource consumption using theoretical and simulation-based approaches. The simulation results show that the proposed ETS is effective and robust in detecting Sybil attacks in terms of the true and false positive rates. By virtue of multi-level detection, the proposed system achieves more than 70% detection at the first level, rising to 100% detection at the second level. Furthermore, the system reduces communication overhead, memory overhead, and energy consumption by eliminating the exchange of feedback and recommendation messages among sensor nodes.
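The abstract does not give the ETS trust formula, but the idea of scoring a node by its energy behavior can be sketched as follows. This is a hypothetical illustration, assuming trust is derived from the deviation between a node's reported and expected residual energy; the function names and the 0.5 threshold are assumptions, not the authors' design.

```python
# Hypothetical sketch of energy-based trust scoring for sensor nodes.
# Assumption: trust falls as a node's reported residual energy deviates
# from the energy expected from its observed activity.

def energy_trust(reported_energy: float, expected_energy: float) -> float:
    """Return a trust value in [0, 1]; larger deviations lower trust."""
    if expected_energy <= 0:
        return 0.0
    deviation = abs(reported_energy - expected_energy) / expected_energy
    return max(0.0, 1.0 - deviation)

def is_suspect(trust: float, threshold: float = 0.5) -> bool:
    """Flag nodes whose trust falls below a detection threshold."""
    return trust < threshold
```

A Sybil identity forging multiple nodes cannot easily keep every forged identity's energy profile consistent, which is why an energy-based score can separate them from honest nodes.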
As a possible implementation of a low-power wide-area network (LPWAN), Long Range (LoRa) technology is considered a future wireless communication standard for the Internet of Things (IoT), as it offers competitive features such as a long communication range, low cost, and reduced power consumption, making it an attractive alternative to current wireless sensor networks and conventional cellular technologies. However, the limited bandwidth available for physical-layer modulation in LoRa makes it unsuitable for high-bit-rate data transfer from devices such as image sensors. In this paper, we propose a new method for mangrove forest monitoring in Malaysia, wherein we transfer image sensor data over the LoRa physical layer (PHY) in a node-to-node network model. In implementing this method, we produce a novel scheme for overcoming the bandwidth limitation of LoRa. Under this scheme, the images collected by the sensor, which require a high data rate to transfer, are encoded as hexadecimal data and then split into packets for transfer via the LoRa PHY. To assess the quality of images transferred using this scheme, we measured the packet loss rate, peak signal-to-noise ratio (PSNR), and structural similarity (SSIM) index of each image. These measurements verify the proposed scheme for image transmission and support the industrial and academic trend that promotes LoRa as a future solution for IoT infrastructure.
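The hex-encode-and-split step can be sketched directly. This is an illustrative fragment, not the authors' exact scheme: the 200-character payload size is an assumption for demonstration, not a value from the LoRa specification, and real use would add packet sequence numbers for reassembly after loss.

```python
# Illustrative sketch: encode an image's raw bytes as hexadecimal text
# and split the text into fixed-size payloads for a bandwidth-limited
# link such as the LoRa PHY. Payload size here is an assumed value.

def image_to_packets(image_bytes: bytes, payload_size: int = 200) -> list[str]:
    """Hex-encode the image and chunk it into transmit-ready payloads."""
    hex_data = image_bytes.hex()
    return [hex_data[i:i + payload_size]
            for i in range(0, len(hex_data), payload_size)]

def packets_to_image(packets: list[str]) -> bytes:
    """Reassemble received payloads (in order) back into image bytes."""
    return bytes.fromhex("".join(packets))
```

Hex encoding doubles the byte count, so in practice the trade-off is simpler framing at the cost of halved effective throughput.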
A routing protocol for wireless sensor networks (WSNs) defines how data are disseminated from the network field (source) to the base station (destination). Based on network topology, routing protocols in WSNs fall into two types: flat routing protocols and hierarchical routing protocols. Hierarchical routing protocols (HRPs) are more energy-efficient and scalable than flat routing protocols. This paper discusses how topology management and network application influence the performance of cluster-based and chain-based hierarchical networks. It reviews the basic features of sensor connectivity issues, such as power control in topology setup, sleep/idle pairing, and data transmission control, as used in five common HRPs, and examines their impact on protocol performance. A comparison of their respective performances indicates how the network application (reactive or proactive) and the topology management approach (centralized or distributed) determine network performance. Finally, the ensuing discussion shows that chain-based HRPs achieve a network lifetime three to five times longer than that of cluster-based HRPs.
Fifth-generation (5G) technology offers more capacity and higher data rates than previous generations. It provides ultra-low latency and ultra-high dependability, enabling efficient services in many industries. The use of radiofrequency electromagnetic fields (RF-EMF) above 6 GHz in 5G millimeter-wave (mm-Wave) base stations has raised public concern over the potential health risks of EMF exposure. This study aims to measure the maximum exposure emitted by a 5G mm-Wave base station, following international standards in both the assessment methodology and the exposure limits. The R&S®TSMA6 scanner, R&S®ROMES4 software, and R&S®TSME30DC downconverter were used for the measurement campaign, together with a user equipment (UE) device, GPS, and an omnidirectional antenna. The investigation is based on a code-selective method because the radiated power fluctuates over time with data traffic. Six tests were conducted based on three different time frames, antenna directions, and the UE device to investigate the RF-EMF exposure. The maximum and average exposure from the 5G mm-Wave base station were calculated and compared with the ICNIRP standard. The maximum exposure from the 29.5 GHz base station was found to be 5.71 V/m, and the highest average exposure was 2.02 V/m. The study found that both the maximum and average RF-EMF exposure produced by a single 5G mm-Wave base station are well within the allowed RF-EMF standard limit.

INDEX TERMS: 5G mm-Wave BS, massive MIMO, radiofrequency electromagnetic fields (RF-EMF), measurement.
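The comparison against the exposure limit can be illustrated numerically. This is a sketch under stated assumptions: the 61 V/m reference level is the value commonly cited from the ICNIRP general-public guidelines for frequencies above 2 GHz, and should be confirmed against the current guideline for the exact band and averaging rules before any real assessment.

```python
import math

# Sketch: convert a measured field strength (V/m) into an equivalent
# plane-wave power density and compare against an exposure limit.
# Assumption: 61 V/m is used as the ICNIRP general-public reference
# level commonly cited for frequencies above 2 GHz.

FREE_SPACE_IMPEDANCE = 377.0  # ohms (impedance of free space)

def power_density(e_field_v_per_m: float) -> float:
    """Equivalent plane-wave power density S = E^2 / Z0, in W/m^2."""
    return e_field_v_per_m ** 2 / FREE_SPACE_IMPEDANCE

def within_limit(e_field_v_per_m: float, limit_v_per_m: float = 61.0) -> bool:
    """Check a measured E-field against the assumed reference level."""
    return e_field_v_per_m <= limit_v_per_m
```

For the paper's measured maximum of 5.71 V/m, the equivalent power density is below 0.1 W/m², far under the assumed reference level, which is consistent with the abstract's conclusion.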
With the development of wireless mobile communication technology, the demand for wireless communication rate and spectrum increases year by year. Existing wireless mobile communication frequencies are approaching saturation, which demands new solutions. Terahertz (THz) communication has great potential for future mobile communications (beyond 5G) and is also an important technique for high-data-rate transmission in spatial information networks. THz communication has great application prospects in military-civilian integration and coordinated development. In China, important breakthroughs have been achieved in the key techniques of THz high-data-rate communications, which are practically keeping pace with the most advanced technological level in the world. Therefore, further intensifying efforts on the development of THz communication is of strategic importance for China in leading the development of future wireless communication techniques and the standardization process of beyond-5G systems. This paper analyzes the performance of the MIMO channel in the THz band, and a discrete mathematical method is used to propose a novel channel model. A channel capacity model is then proposed by combining path loss and molecular absorption in the THz band, based on the channel state information (CSI) at the receiver. Simulation results show that the integration of MIMO in the THz band gives a better data rate and channel capacity compared with a single channel.
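The capacity trend the abstract reports can be sketched with a simplified THz link budget: free-space spreading loss multiplied by an exponential molecular-absorption term, with Shannon capacity summed over independent MIMO spatial streams. All parameter values below (absorption coefficient, bandwidth, powers) are illustrative assumptions, not figures from the paper, and the model ignores antenna gains and correlated streams.

```python
import math

# Simplified THz link sketch (assumed parameters, not the paper's model):
# path gain = 1 / (spreading loss * exp(K * d)), where K is a molecular
# absorption coefficient; capacity is Shannon capacity per spatial
# stream, with transmit power split equally across streams.

C_LIGHT = 3e8  # speed of light, m/s

def thz_path_gain(freq_hz: float, dist_m: float, k_abs: float) -> float:
    """Inverse of free-space spreading loss times absorption loss."""
    spreading = (4 * math.pi * freq_hz * dist_m / C_LIGHT) ** 2
    absorption = math.exp(k_abs * dist_m)
    return 1.0 / (spreading * absorption)

def mimo_capacity(bw_hz, tx_power_w, noise_w, gain, streams):
    """Total Shannon capacity (bit/s) over equal-power spatial streams."""
    snr_per_stream = (tx_power_w / streams) * gain / noise_w
    return streams * bw_hz * math.log2(1 + snr_per_stream)
```

Even with the power split, multiple streams win at moderate-to-high SNR because capacity grows linearly in stream count but only logarithmically in per-stream SNR, which matches the abstract's MIMO-versus-single-channel result.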
To satisfy the demand for higher data rates while maintaining quality of service, a dense long-term evolution (LTE) cell environment is required. This poses a major challenge to the network when performing handover (HO), as cell selection has an important influence on achieving seamless handover. Even when a handover completes successfully, it may target the wrong cell if the selected cell is not optimal in terms of signal quality and bandwidth. This can cause significant interference with other cells, handover failure (HOF), or handover ping-pong (HOPP), consequently degrading cell throughput. To address this issue, we propose a multiple-criteria decision-making method that uses an integrated fuzzy Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) over the S-criterion, the availability of resource blocks (RBs), and the uplink signal-to-interference-plus-noise ratio. The conventional cell selection in LTE is based on the S-criterion alone, which is inadequate because it relies only on downlink signal quality. A novel method called fuzzy multiple-criteria cell selection (FMCCS) is proposed in this paper; FMCCS considers RB utilization and the user equipment's uplink condition in addition to the S-criterion. System analysis demonstrates that FMCCS significantly reduces handover ping-pong and handover failure. This improvement stems from a highly reliable cell-selection technique that increases cell throughput through successful handovers. The simulation results show that FMCCS outperforms the conventional and cell selection scheme (CSS) methods.
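The ranking step underlying FMCCS can be illustrated with the classical (crisp) TOPSIS procedure; the fuzzy variant in the paper additionally handles criteria as fuzzy numbers. In this sketch, each candidate cell is scored on benefit criteria (e.g. downlink signal quality, free RBs, uplink SINR), and the weights are illustrative assumptions.

```python
import math

# Classical TOPSIS sketch (the crisp core of the fuzzy method above):
# 1) vector-normalize each criterion column and apply weights,
# 2) find the ideal (best) and anti-ideal (worst) points,
# 3) rank alternatives by relative closeness to the ideal point.

def topsis_rank(matrix, weights):
    """Return alternative indices sorted best-first; all criteria are
    treated as benefit criteria (higher is better)."""
    n_alt, n_crit = len(matrix), len(matrix[0])
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) or 1.0
             for j in range(n_crit)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n_crit)]
         for i in range(n_alt)]
    ideal = [max(col) for col in zip(*v)]   # best value per criterion
    anti = [min(col) for col in zip(*v)]    # worst value per criterion
    scores = []
    for row in v:
        d_pos = math.dist(row, ideal)       # distance to ideal point
        d_neg = math.dist(row, anti)        # distance to anti-ideal point
        scores.append(d_neg / (d_pos + d_neg) if d_pos + d_neg else 0.0)
    return sorted(range(n_alt), key=lambda i: scores[i], reverse=True)
```

A cell that dominates on every criterion coincides with the ideal point and ranks first; weighting lets the operator trade downlink quality against RB availability and uplink conditions.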