2017 IEEE 56th Annual Conference on Decision and Control (CDC)
DOI: 10.1109/cdc.2017.8264529
Particle-filter-enabled real-time sensor fault detection without a model of faults

Abstract: We are experiencing an explosion in the number of sensors measuring our activities and the world around us. These sensors are spread throughout the built environment and can help us perform state estimation and control of related systems, but they are often built and/or maintained by third parties or system users. By outsourcing system measurement to third parties, the controller must accept their measurements without being able to directly verify the sensors' correct operation. Instead, …

Cited by 3 publications (6 citation statements)
References 19 publications
“…However, for our current purposes this is not easily implementable because in our filtering context, we are trying to make a decision as to whether to accept or reject a measurement as we receive it. Taking a "soft" view and considering a range of state-space values based on our current degree of belief in whether we should accept or reject H0, while potentially giving us a view of a broader range of possibilities (and separate measurement hypotheses) that we could revisit in light of future data, leads to a blow-up as the number of separate sensors increases [29]. Instead, for expedience, in what follows we adopt the Neyman-Pearson null hypothesis significance test and select a hard significance level α, rejecting or accepting the measurement based on whether our estimated p-value (17) is smaller or larger than α.…”
Section: E. Monte-Carlo Fisherian Significance Tests
confidence: 99%
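The hard accept/reject rule described in this statement can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a hypothetical scalar sensor model y = x + Gaussian noise, estimates the p-value of an incoming measurement by Monte Carlo against the particle filter's predictive distribution, and rejects the measurement when the estimate falls below a hard significance level α. All names and the two-sided tail definition are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_p_value(particles, weights, y_obs, meas_std, n_draws=2000):
    # Monte-Carlo estimate of the p-value of a scalar measurement under
    # the filter's predictive distribution p(y): sample states according
    # to the particle weights, then simulate measurements y = x + noise
    # (hypothetical additive-Gaussian sensor model).
    idx = rng.choice(len(particles), size=n_draws, p=weights)
    y_sim = particles[idx] + rng.normal(0.0, meas_std, size=n_draws)
    # Two-sided tail probability relative to the predictive mean:
    # fraction of simulated measurements at least as far out as y_obs.
    center = np.average(particles, weights=weights)
    return np.mean(np.abs(y_sim - center) >= np.abs(y_obs - center))

def accept_measurement(particles, weights, y_obs, meas_std, alpha=0.05):
    # Hard Neyman-Pearson-style decision: reject the measurement as
    # faulty when its estimated p-value falls below alpha.
    return mc_p_value(particles, weights, y_obs, meas_std) >= alpha
```

With particles drawn from a standard normal, a measurement near the predictive mean is accepted while a far-out measurement (many predictive standard deviations away) is rejected; the "soft" alternative the authors mention would instead carry both hypotheses forward, which is what blows up as sensors multiply.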
“…In the interest of readability, we will not exhaustively define all PDFs of interest in this paper. See [29] for a lengthier discussion of a precursor to the Fisher-type hypothesis-testing particle filter discussed in the present work.…”
Section: A. Probabilistic Outlier-Rejecting Particle Filter
confidence: 99%
“…The federation scheme is very often used in practice, but it does not account for possible correlation between the parameters under estimation, and correcting long-term U-type errors requires the development of a special methodology. It is of interest to use the particle filter to combat anomalies, as in [13]. Applying this algorithm in redundant systems also requires additional research.…”
Section: Introduction
confidence: 99%
“…For the mathematical description of such errors, heavy-tailed distributions are used instead of the Gaussian distribution [31]. The filtering problem for heavy-tailed distributions can be solved approximately using so-called particle filters, which are based on the Monte Carlo method [13,32]. However, the complexity of such algorithms increases exponentially with the dimension of the state space, which requires large computational resources.…”
Section: Introduction
confidence: 99%
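The heavy-tailed measurement model this statement refers to can be sketched as a single bootstrap-filter reweighting step. This is an illustrative assumption, not code from either cited work: measurement noise is modeled as Student-t (whose heavy tails keep outliers from collapsing the particle weights the way a Gaussian likelihood would), and the degrees of freedom `nu`, scale, and function names are hypothetical.

```python
import math
import numpy as np

def t_logpdf(r, nu=3.0, scale=1.0):
    # Log-density of a Student-t distribution with nu degrees of
    # freedom, used here as a heavy-tailed measurement-noise model.
    z = np.asarray(r, dtype=float) / scale
    c = (math.lgamma((nu + 1.0) / 2.0) - math.lgamma(nu / 2.0)
         - 0.5 * math.log(nu * math.pi) - math.log(scale))
    return c - (nu + 1.0) / 2.0 * np.log1p(z * z / nu)

def pf_update(particles, weights, y, nu=3.0, scale=1.0):
    # One particle-filter measurement update: reweight each particle by
    # the Student-t likelihood of its residual, in log space for
    # numerical stability, then renormalize.
    logw = np.log(weights) + t_logpdf(y - particles, nu, scale)
    logw -= logw.max()          # guard against underflow
    w = np.exp(logw)
    return w / w.sum()
```

Because the Student-t tails decay polynomially, an outlying measurement deflates all weights relatively evenly rather than concentrating nearly all mass on the single nearest particle; the exponential growth of cost with state dimension mentioned in the quote is the usual caveat for such particle methods.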