Modern intelligent energy grids enable energy supply and consumption to be managed efficiently while avoiding a variety of security risks. System disturbances can be caused by both natural and human-made events, and operators should understand the different kinds and causes of disturbances in energy systems in order to make informed decisions and respond accordingly. This study addresses this problem by proposing a machine learning-based attack detection model for energy systems, trained on data and logs gathered from phasor measurement units (PMUs). Property- and specification-based feature engineering is used to create features, and the data are fed to various machine learning methods, of which random forest is selected as the base classifier for AdaBoost. The model is tested on open-source simulated energy system data containing 37 energy system event case studies. Finally, the proposed model is compared with other designs using various assessment metrics. The simulation results show that the model achieves a detection rate of 93.6% and an accuracy of 93.91%, which exceed those of existing methods.
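The ensemble described above (random forest as the base classifier of AdaBoost) can be sketched with scikit-learn. This is a minimal illustration on synthetic stand-in data, since the PMU dataset and the paper's hyperparameters are not reproduced here:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic stand-in for PMU-derived event features (illustrative only)
X, y = make_classification(n_samples=600, n_features=20, n_informative=10,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# AdaBoost boosting a shallow random forest as its base classifier
base = RandomForestClassifier(n_estimators=10, max_depth=3, random_state=0)
clf = AdaBoostClassifier(base, n_estimators=20, random_state=0)
clf.fit(X_tr, y_tr)

acc = accuracy_score(y_te, clf.predict(X_te))
```

The base-estimator depth and boosting rounds here are placeholders; in practice they would be tuned against the 37 event cases.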
The evolution of the global wireless market is accompanied by increasing demands in terms of data rate, number of users, lower latency, better coverage, spectral efficiency, and quality of service. To meet these needs, 5G has recently been introduced as an effective solution that targets, via the large-scale deployment of antennas, a wide variety of sectors such as energy, health, media, industry, transport, and especially wireless cellular networks, which are among the most important pillars of modern societies. Multiple-Input, Multiple-Output (MIMO) systems, which have been extended to "Massive MIMO" by increasing the number of radiating elements involved in the transmission and reception of the radio link, are a very promising solution for improving the spectral efficiency of wireless communication systems (WCSs). Motivated by these developments, the present paper investigates the increased capacity of MIMO systems to improve transmission in 5G WCSs. It focuses on evaluating the development level and technical contribution of MIMO systems and millimeter-wave (mmWave) bands in 5G wireless cellular networks (WCNs) and gives important recommendations.
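The capacity gain of MIMO over single-antenna links follows the standard Shannon formula C = log2 det(I + (SNR/Nt) · H·Hᴴ), which grows roughly linearly with min(Nt, Nr). A small sketch (the Rayleigh-fading channel model and SNR value are illustrative assumptions, not from the paper):

```python
import numpy as np

def mimo_capacity_bits(H: np.ndarray, snr_linear: float) -> float:
    """Shannon capacity (bits/s/Hz) of a MIMO channel H (Nr x Nt)
    with equal power allocation across transmit antennas."""
    nr, nt = H.shape
    gram = H @ H.conj().T
    return float(np.real(np.log2(np.linalg.det(
        np.eye(nr) + (snr_linear / nt) * gram))))

# Capacity of i.i.d. Rayleigh channels for growing antenna counts
rng = np.random.default_rng(0)
caps = []
for n in (1, 2, 4, 8):
    H = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)
    caps.append(mimo_capacity_bits(H, snr_linear=100.0))
```

For an ideal 2x2 identity channel at linear SNR 3, this gives exactly 2·log2(1 + 3/2) bits/s/Hz, matching the per-stream Shannon limit.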
This article presents a numerical analysis of an ultrathin concentric hexagonal ring resonator (CHRR) metamaterial absorber (MMA) for ultrawideband visible and infrared optical window applications. The proposed MMA exhibits absorption above 90% from 380 to 2500 nm and an average absorbance of 96.64% over the entire operational bandwidth, with a compact unit cell size of 66 × 66 nm². The designed MMA shows a maximum absorption of 99% at 618 nm, and its absorption bandwidth covers the entire visible and infrared optical windows. Nickel has been used for the top and bottom layers of the MMA, while aluminium nitride (AlN) serves as the substrate. The designed hexagonal MMA shows polarization-independent behavior due to the symmetry of the design, and a stable absorption level is also achieved for oblique incidence angles up to 70°. The absorption of the hexagonal ring resonator MMA has been analyzed through design evolution, parametric studies, and investigations of various materials. The metamaterial properties, surface current distribution, magnetic field, and electric field have also been analyzed to explain the absorption behavior. The proposed MMA has promising prospects in numerous applications such as infrared detection, solar cells, gas detection sensors, and imaging.
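Absorbance figures like those above are conventionally computed from simulated scattering parameters as A = 1 − |S11|² − |S21|²; because the continuous nickel ground plane blocks transmission, S21 ≈ 0. A minimal sketch (the S11 value below is illustrative, not taken from the paper's solver output):

```python
def absorbance(s11: complex, s21: complex = 0.0) -> float:
    """A = 1 - |S11|^2 - |S21|^2. With a continuous metal ground plane,
    S21 ~ 0 and the expression reduces to A = 1 - |S11|^2."""
    return 1.0 - abs(s11) ** 2 - abs(s21) ** 2

# A reflection coefficient of |S11| = 0.1 corresponds to 99% absorption,
# the peak value reported for the CHRR design.
peak = absorbance(0.1)
```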
Advances in PV technology have given rise to the increasing integration of PV-based distributed generation (PVDG) systems into distribution systems to mitigate dependence on a single power source and alleviate the global warming caused by traditional power plants. However, high power output from intermittent PVDG can create reverse power flow, which can increase system power losses and distort the voltage profile. Therefore, the appropriate placement and sizing of a PVDG coupled with an energy storage system (ESS), which stores power during off-peak hours and injects it during peak hours, are necessary. Within this context, a new methodology based on an optimal power flow management strategy for the optimal allocation and sizing of PVDG systems coupled with battery energy storage (PVDG-BES) systems is proposed in this paper. The problem is formulated as an optimization problem in which total real power losses are the objective function. Thereafter, a new optimization technique combining a genetic algorithm with various chaotic maps is used to find the optimal PVDG-BES placement and size. To test the robustness and applicability of the proposed methodology, various benchmark functions and the IEEE 14-bus distribution network under fixed and intermittent load profiles are used. The simulation results show that obtaining the optimal size and placement of the PVDG-BES system based on an optimal energy management strategy (EMS) yields better performance in terms of power loss reduction and voltage profile improvement. In fact, the total system losses are reduced by 20.14% when the EMS is applied, compared with the case before integrating PVDG-BES.
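A common way to combine a genetic algorithm with a chaotic map, as described above, is to replace the uniform random numbers used during population initialization with iterates of a chaotic sequence such as the logistic map. A minimal sketch (the map parameter r = 4 and the seed are illustrative choices, and the paper evaluates several different maps):

```python
import numpy as np

def logistic_map_population(n_ind: int, n_var: int,
                            lower: float, upper: float,
                            r: float = 4.0, seed: float = 0.7) -> np.ndarray:
    """Initialize a GA population using the logistic map
    x_{k+1} = r * x_k * (1 - x_k) instead of uniform random numbers,
    which can improve coverage of the search space."""
    pop = np.empty((n_ind, n_var))
    x = seed
    for i in range(n_ind):
        for j in range(n_var):
            x = r * x * (1.0 - x)            # chaotic iterate in (0, 1)
            pop[i, j] = lower + x * (upper - lower)
    return pop

# e.g. candidate (bus index, size) pairs scaled into their bounds
pop = logistic_map_population(5, 3, -2.0, 2.0)
```

In the actual methodology each decision variable (placement bus, PVDG-BES size) would be scaled into its own bounds rather than a shared interval.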
The economic emission dispatch problem (EEDP) is a nonconvex and nonsmooth multiobjective optimization problem in the power system field. Generally, fuel cost and total emissions of harmful gases are the objective functions, and the decision variables are the output powers of thermal generating units (TGUs). To make the EEDP more practical, valve point loading effects (VPLEs), prohibited operating zones (POZs), and power balance constraints should be included in the problem constraints. To solve this complex and constrained EEDP, a new multiobjective optimization technique combining the differential evolution (DE) algorithm and chaos theory, called ChMODE, is proposed in this study. In this technique, a nondomination sorting principle and a crowding distance calculation are employed to extract an accurate Pareto front. To avoid being trapped in local optima and to enhance the conventional DE algorithm, two different chaotic maps are used in its initialization, crossover, and mutation phases instead of random numbers. To overcome difficulties caused by the equality constraint describing the power balance, a slack TGU is defined to compensate for the gap between the total generation and the sum of the system load and total power losses; the optimal power outputs of all thermal units except the slack unit are then determined by the suggested optimization technique. To assess the effectiveness and applicability of the proposed method, six-unit and ten-unit test systems are used, and the obtained results are compared with other recently developed optimization techniques tested for the same purpose. The superior performance of ChMODE is also evaluated using various metrics such as inverted generational distance (IGD), hypervolume (HV), spacing metric (SM), and average satisfactory degree (ASD).
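The crowding distance mentioned above is the standard NSGA-II diversity measure: within one non-dominated front, each solution's distance is the normalized sum, over objectives, of the gap between its two neighbors, with boundary solutions kept by assigning them infinite distance. A generic sketch (not the paper's implementation):

```python
import numpy as np

def crowding_distance(objs: np.ndarray) -> np.ndarray:
    """Crowding distance of each solution in one non-dominated front.
    objs: (n_solutions, n_objectives), e.g. columns = (fuel cost, emissions)."""
    n, m = objs.shape
    dist = np.zeros(n)
    for k in range(m):
        order = np.argsort(objs[:, k])
        dist[order[0]] = dist[order[-1]] = np.inf   # keep boundary points
        span = objs[order[-1], k] - objs[order[0], k]
        if span == 0:
            continue
        for i in range(1, n - 1):
            dist[order[i]] += (objs[order[i + 1], k]
                               - objs[order[i - 1], k]) / span
    return dist

# Three points on a cost/emissions trade-off: boundaries are protected,
# the interior point gets a finite distance.
d = crowding_distance(np.array([[0.0, 2.0], [1.0, 1.0], [2.0, 0.0]]))
```

Solutions with larger crowding distance are preferred during selection, which spreads the population along the Pareto front.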
Over the past few years, the Bitcoin-based financial trading system (BFTS) has created new challenges for the power system due to the high energy consumption of mining devices. This is a compelling incentive for cyber-attackers who intend to inflict significant damage on a power system: falsifying the consumption data of mining devices disrupts optimal energy management within the grid. Hence, this paper introduces a new cyber-attack, named miner-misuse, against power systems equipped with transaction technology. To counter this attack, the article also presents an online coefficient anomaly detection approach based on the reinforcement learning (RL) concept. Because full knowledge of the system is not available, the Observable Markov Decision Process (OMDP) idea is applied within the RL mechanism to block the miner attack. The proposed method performs optimally and promptly when its parameters are properly set during the learning procedure; a hybrid mechanism of optimization and learning not only helps find the best, most far-sighted solution but also reduces convergence time. To this end, the paper proposes an Intelligent Priority Selection (IPS) algorithm merged with the suggested RL method to detect miner attacks more promptly and accurately. Additionally, to demonstrate the proposed detection approach's effectiveness, a mathematical model of the energy consumption of mining devices based on the hashing rate within the BFTS is provided. The uncertain fluctuation in the energy needed by miners makes energy management unpredictable and must be addressed; the unscented transformation (UT) method is therefore used to model the uncertain parameters of the system precisely.
Overall, the F-score and attack success probability inferred from the results show that the proposed anomaly detection method can identify miner attacks in near real time, faster than other approaches.
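The unscented transformation used for the uncertain miner load can be sketched generically: a Gaussian uncertainty is represented by 2n+1 sigma points, each point is propagated through the nonlinear function, and the weighted outputs give the transformed mean and covariance. This is the textbook UT with a simple kappa scaling, not the paper's specific parameterization:

```python
import numpy as np

def unscented_transform(mean, cov, f, kappa: float = 0.0):
    """Propagate a Gaussian (mean, cov) through a nonlinear function f
    using the standard 2n+1 sigma points of the unscented transformation."""
    mean = np.atleast_1d(np.asarray(mean, dtype=float))
    cov = np.atleast_2d(np.asarray(cov, dtype=float))
    n = mean.size
    S = np.linalg.cholesky((n + kappa) * cov)   # matrix square root
    pts = [mean] + [mean + S[:, i] for i in range(n)] \
                 + [mean - S[:, i] for i in range(n)]
    w = np.full(2 * n + 1, 1.0 / (2.0 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    ys = np.array([np.atleast_1d(f(p)) for p in pts], dtype=float)
    y_mean = w @ ys
    diff = ys - y_mean
    y_cov = (w[:, None] * diff).T @ diff
    return y_mean, y_cov

# Sanity check: a linear map 2x preserves Gaussian statistics exactly
m, c = unscented_transform([1.0], [[4.0]], lambda x: 2 * x)
```

In the paper's setting, f would be the hashing-rate-based energy consumption model of the mining devices.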
In recent years, the occurrence of cascading failures and blackouts arising from cyber intrusions into the underlying configuration of power systems has increasingly highlighted the need for effective power management that can handle this issue properly. Moreover, the growing use of renewable energy resources demonstrates their comparative usefulness in various areas of the grid, especially during cascading failures. This paper aims first to identify and ultimately protect the vulnerable areas of these systems by developing a hybrid-structure microgrid resilient to malicious cyber-attacks. First, a well-defined model of system vulnerability indices is presented to indicate which generation units the lines or buses are directly related to. Indeed, we want to determine what percentage of the grid equipment, such as lines, buses, and generators, is vulnerable to the outage of lines or generators caused by cyber-attacks; this can support timely decisions that reduce the vulnerability indices as effectively as possible. Employing various renewable resources in efficient areas of the grid can remarkably improve vulnerability mitigation. In this regard, this paper proposes a hybrid-energy framework of AC/DC microgrids made up of photovoltaic units, wind turbines, tidal turbines, and hydrogen-based fuel cells, all operating in grid-connected mode via the main grid, with the aim of reducing the percentage of the system that is vulnerable. To demonstrate the proposed framework's effectiveness and ease of use, a false data injection (FDI) cyber-attack is modeled and applied to the studied system to corrupt information (for instance, the settings of protective devices), leading to cascading failures or large-scale blackouts.
Another key factor with a profound impact on accurate vulnerability analysis is the set of uncertain parameters, which are modeled by the unscented transform (UT) in this study. The results show that the proposed hybrid energy framework effectively reduces the vulnerability percentage of the system against the modeled cyber-attacks.
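A crude proxy for the vulnerability percentage discussed above is the fraction of buses that lose every path to a generator when a line is tripped, which reduces to graph reachability. A minimal sketch (this toy index and topology are illustrative assumptions, not the paper's index definition):

```python
from collections import deque

def vulnerable_percentage(n_bus, lines, gen_buses, attacked_line):
    """Percentage of buses with no remaining path to any generator
    after one line is tripped (a toy proxy for a vulnerability index)."""
    adj = {b: set() for b in range(n_bus)}
    for u, v in lines:
        if (u, v) == attacked_line or (v, u) == attacked_line:
            continue                      # the attacked line is out of service
        adj[u].add(v)
        adj[v].add(u)
    reached, dq = set(), deque(gen_buses)
    while dq:                             # BFS from all generator buses
        b = dq.popleft()
        if b in reached:
            continue
        reached.add(b)
        dq.extend(adj[b] - reached)
    return 100.0 * (n_bus - len(reached)) / n_bus

# Radial 4-bus feeder with a generator at bus 0: tripping line (1, 2)
# islands buses 2 and 3, i.e. half of the system.
p_radial = vulnerable_percentage(4, [(0, 1), (1, 2), (2, 3)], [0], (1, 2))
# Closing the ring with line (0, 3) removes that vulnerability entirely.
p_ring = vulnerable_percentage(4, [(0, 1), (1, 2), (2, 3), (0, 3)], [0], (1, 2))
```

The ring example illustrates the paper's broader point: adding generation or paths in the right places drives the vulnerable percentage down.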
Technological development in biomedical procedures has provided a better understanding of how to evaluate and handle critical scenarios and diseases. A sustainable model design is required for post-medical procedures to maintain the consistency of medical treatment. In this article, a telerobotic-based stroke rehabilitation optimization and recommendation framework is proposed and evaluated. Selecting optimal features for training deep neural networks can reduce training time and improve model performance. To achieve this, we use the Whale Optimization Algorithm (WOA), chosen for its high convergence accuracy, stability, strong global search ability, and fast convergence speed, to streamline the dependency matrix of each attribute associated with post-stroke rehabilitation. A deep neural network is then trained and validated on the selected features. The proposed framework is designed to provide decision support through recommendations of activities and task flow; these recommendations are independent and highly feasible for the evaluated scenarios. The proposed model achieved a precision of 99.6%, a recall of 99.5%, an F1-score of 99.7%, and an accuracy of 99.9%, outperforming other considered optimization algorithms such as the antlion and gravitational search algorithms, and providing a more efficient recommendation model than trivial SVM-based models and techniques.
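The WOA mentioned above alternates between three update rules per whale and iteration: encircling the current best solution, exploring around a random whale, and a logarithmic spiral toward the best. A generic single-iteration sketch of these standard equations (not the paper's tuned implementation, and the fitness evaluation that would pick the new best is omitted):

```python
import numpy as np

def woa_step(pop: np.ndarray, best: np.ndarray,
             t: int, t_max: int, rng: np.random.Generator) -> np.ndarray:
    """One Whale Optimization Algorithm iteration over a population
    pop (n_whales x n_vars) toward the current best position."""
    a = 2.0 * (1.0 - t / t_max)               # decreases linearly from 2 to 0
    new = np.empty_like(pop)
    for i, x in enumerate(pop):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        A, C = 2 * a * r1 - a, 2 * r2
        if rng.random() < 0.5:
            # encircle the best (|A| < 1) or explore around a random whale
            target = best if np.all(np.abs(A) < 1) else pop[rng.integers(len(pop))]
            new[i] = target - A * np.abs(C * target - x)
        else:
            # logarithmic spiral update toward the best position
            ell = rng.uniform(-1.0, 1.0)
            D = np.abs(best - x)
            new[i] = D * np.exp(ell) * np.cos(2.0 * np.pi * ell) + best
    return new

rng = np.random.default_rng(0)
pop = rng.standard_normal((10, 4))            # e.g. 4 candidate feature weights
new_pop = woa_step(pop, best=pop[0].copy(), t=1, t_max=50, rng=rng)
```

For feature selection, the continuous positions would typically be thresholded into a binary include/exclude mask before evaluating the network.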