Fault current limiters are essential devices for protecting the power system and its equipment against high fault currents, which are increasing due to the integration of new power sources. This paper proposes a novel design for a Hybrid Superconducting Fault Current Limiter (Hybrid SFCL), composed essentially of thyristors in series with a superconducting element. This branch is connected in parallel with an air-core reactor, which improves current limitation and ensures the safe operation of the superconducting element. Another advantage of this topology is the use of the voltage drop across the superconductor as an input to the controller. This voltage is used to detect the fault, which avoids the need for a current sensor and consequently reduces manufacturing costs. In this work, the PSCAD/EMTDC software was employed to model the Hybrid SFCL and the 2G superconducting tape, the latter through a coupled thermal-electrical analysis. The results show that the fault current is efficiently limited and that the developed control strategy performs well. Furthermore, the proposed system guarantees a fast recovery time, on the order of 500 ms, which is a clear advantage over the conventional resistive SFCL.
A full-rank lattice in Euclidean space is a discrete set formed by all integer linear combinations of a basis. Given a probability distribution on R^n, two operations can be induced by considering the quotient of the space by such a lattice: wrapping and quantization. For a lattice Λ and a fundamental domain D which tiles R^n through Λ, the wrapped distribution over the quotient is obtained by summing the density over each coset, while the quantized distribution over the lattice is defined by integrating over each translate of the fundamental domain. These operations define wrapped and quantized random variables over D and Λ, respectively, which sum to the original random variable. We investigate information-theoretic properties of this decomposition, such as entropy, mutual information and the Fisher information matrix, and show that it naturally generalizes to the more abstract context of locally compact topological groups.
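As a concrete illustration (not taken from the paper), the wrapping/quantization decomposition can be sketched in one dimension with the lattice Λ = cZ and fundamental domain D = [0, c), using a standard Gaussian density; the function names and numerical parameters below are illustrative assumptions:

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    # density of a Gaussian N(mu, sigma^2)
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def wrapped_pdf(d, c=1.0, K=50):
    # wrapped density on D = [0, c): sum the density over the coset d + cZ
    # (truncated to |k| <= K, which is ample for a standard Gaussian)
    return sum(normal_pdf(d + k * c) for k in range(-K, K + 1))

def quantized_pmf(k, c=1.0, n_steps=1000):
    # quantized probability of the lattice point k*c:
    # integrate the density over the translated cell [k*c, (k+1)*c)
    lo = k * c
    h = c / n_steps
    return sum(normal_pdf(lo + (i + 0.5) * h) for i in range(n_steps)) * h

def decompose(x, c=1.0):
    # X = Q(X) + W(X): nearest-below lattice point plus offset in D = [0, c)
    q = math.floor(x / c) * c
    return q, x - q
```

Here the quantized probabilities sum to one over Λ, the wrapped density integrates to one over D, and `decompose` exhibits the stated identity that the two induced random variables sum to the original one.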
Choosing a suitable loss function is essential when learning by empirical risk minimisation. In many practical cases, the datasets used for training a classifier may contain incorrect labels, which motivates the use of loss functions that are inherently robust to label noise. In this paper, we study the Fisher-Rao loss function, which emerges from the Fisher-Rao distance on the statistical manifold of discrete distributions. We derive an upper bound for the performance degradation in the presence of label noise, and analyse the learning speed of this loss. Compared with other commonly used losses, we argue that the Fisher-Rao loss provides a natural trade-off between robustness and training dynamics. Numerical experiments with synthetic and MNIST datasets illustrate this behaviour.
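For context (a sketch, not the paper's exact definition), the Fisher-Rao distance between two discrete distributions p and q on the probability simplex is the geodesic distance 2·arccos(Σ √(p_i q_i)), i.e. twice the arccosine of the Bhattacharyya coefficient; assuming the loss takes this form with a one-hot label, it reduces to 2·arccos(√p_y):

```python
import math

def fisher_rao_distance(p, q):
    # geodesic distance on the simplex under the Fisher information metric:
    # 2 * arccos of the Bhattacharyya coefficient
    bc = sum(math.sqrt(pi * qi) for pi, qi in zip(p, q))
    return 2.0 * math.acos(min(1.0, bc))  # clamp guards against rounding above 1

def fisher_rao_loss(probs, label):
    # with a one-hot target, the distance depends only on the probability
    # assigned to the true class: 2 * arccos(sqrt(p_y))
    return 2.0 * math.acos(min(1.0, math.sqrt(probs[label])))
```

Unlike cross-entropy, this loss is bounded (by π), which is one intuition for its robustness to label noise: a single mislabelled example cannot contribute an unbounded penalty.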