2012
DOI: 10.3390/e15010032
Function Based Fault Detection for Uncertain Multivariate Nonlinear Non-Gaussian Stochastic Systems Using Entropy Optimization Principle

Abstract: In this paper, the fault detection in uncertain multivariate nonlinear non-Gaussian stochastic systems is further investigated. Entropy is introduced to characterize the stochastic behavior of the detection errors, and the entropy optimization principle is established for the fault detection problem. The principle is to maximize the entropies of the stochastic detection errors in the presence of faults and to minimize the entropies of the detection errors in the presence of disturbances. In order to calculate …

Cited by 7 publications (8 citation statements). References 38 publications.
“…Also similarly to E in (20), E_A introduced in Equation (21) is a normalized measure, and its maximum possible value E_A = 1 is obtained if AP markers for a sample y(k) appear for all detection sensitivities. The minimum possible value of E_A for a sample y(k) is zero if no markers appear [similarly to E in Equation (20)].…”
Section: Approximate Individual Sample Learning Entropy (AISLE) (mentioning, confidence: 99%)
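The normalization described in the quote above can be illustrated with a minimal sketch. This is only an illustrative reading of the quoted statement, not the cited paper's implementation: it assumes, hypothetically, that the AP markers for one sample y(k) are stored as a boolean matrix over detection sensitivities and monitored weights, so the measure is simply the fraction of positions where a marker appears.

```python
import numpy as np

def approximate_sample_learning_entropy(markers: np.ndarray) -> float:
    """Normalized AP-marker count for one sample y(k).

    markers: boolean array of shape (n_sensitivities, n_weights),
    True where an adaptation-plot (AP) marker appears for a given
    detection sensitivity. Returns a value in [0, 1]: 1.0 when
    markers appear everywhere (the quoted maximum), 0.0 when no
    markers appear (the quoted minimum).
    """
    return float(np.count_nonzero(markers)) / markers.size

# All markers present -> maximum value 1.0
assert approximate_sample_learning_entropy(np.ones((4, 3), dtype=bool)) == 1.0
# No markers present -> minimum value 0.0
assert approximate_sample_learning_entropy(np.zeros((4, 3), dtype=bool)) == 0.0
```

The boolean-matrix layout is an assumption made for compactness; any marker container would do as long as the measure remains the ratio of appearing markers to possible markers.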
“…For every individual sample, the measure E_A in Equation (21) approximates E in (20), because larger values of E_A correspond to a steeper slope of H (Figure 5). Thus E_A can be called the…”
Section: Approximate Individual Sample Learning Entropy (AISLE) (mentioning, confidence: 99%)
“…The Sample Entropy (SampEn) and the Approximate Entropy (ApEn) are typical and highly relevant examples [3], [4]. These approaches are closely related to the multi-scale evaluation of fractal measures; further case studies utilizing SampEn, ApEn, and Multiscale Entropy (MSE) can be found in [5]–[7]. Further to this, a probabilistic entropy approach to concept shift (sometimes concept drift) detection in sensory data is reported in [8].…”
Section: Introduction (mentioning, confidence: 99%)
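For context on the SampEn mentioned in the quote above, here is a minimal, unoptimized sketch under the common textbook definition (embedding dimension m, tolerance r, Chebyshev distance, self-matches excluded); parameter names and defaults are illustrative, not taken from the cited works.

```python
import math
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample Entropy: -ln(A/B), where B counts matching template
    pairs of length m and A those of length m + 1, with matches
    judged by Chebyshev distance <= r and self-matches excluded."""
    x = np.asarray(x, dtype=float)
    n = len(x)

    def count_matches(length):
        # Use n - m templates for both lengths so the counts A and B
        # are taken over comparable template sets.
        templates = [x[i:i + length] for i in range(n - m)]
        count = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if np.max(np.abs(templates[i] - templates[j])) <= r:
                    count += 1
        return count

    b = count_matches(m)
    a = count_matches(m + 1)
    if a == 0 or b == 0:
        return math.inf  # no matches: SampEn is undefined/unbounded
    return -math.log(a / b)

# A constant series is perfectly regular: A = B, so SampEn = 0
assert sample_entropy([1.0] * 10) == 0.0
```

The O(n^2) pairwise loop is deliberate for clarity; production implementations typically vectorize or sort templates first.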