2021 American Control Conference (ACC)
DOI: 10.23919/acc50511.2021.9483171

Differentially Private Outlier Detection in Multivariate Gaussian Signals

Abstract: The detection of outliers in data, while preserving the privacy of individual agents who contributed to the data set, is an increasingly important task when monitoring and controlling large-scale systems. In this paper, we use an algorithm based on the sparse vector technique to perform differentially private outlier detection in multivariate Gaussian signals. Specifically, we derive analytical expressions to quantify the trade-off between detection accuracy and privacy. We validate our analytical results through…
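
The abstract's central tool is the sparse vector technique (SVT). As a rough orientation only, the sketch below shows the standard AboveThreshold form of SVT in Python: each per-sample outlier score is compared, with Laplace noise, against a noisy threshold, and the mechanism halts at the first flagged sample so the total privacy cost stays bounded by epsilon. This is the textbook mechanism, not the paper's algorithm; the function name, score inputs, and sensitivity are illustrative assumptions.

```python
import numpy as np

def above_threshold(scores, threshold, epsilon, sensitivity=1.0):
    """Standard AboveThreshold instance of the sparse vector technique.

    Compares each outlier score (assumed to have the given sensitivity)
    against a Laplace-noised threshold and halts at the first noisy score
    that exceeds it; the whole run satisfies epsilon-differential privacy.
    """
    rng = np.random.default_rng()
    # Half the budget goes to the threshold noise, half to the query noise.
    noisy_threshold = threshold + rng.laplace(scale=2.0 * sensitivity / epsilon)
    for i, s in enumerate(scores):
        noisy_score = s + rng.laplace(scale=4.0 * sensitivity / epsilon)
        if noisy_score >= noisy_threshold:
            return i  # index of the first sample flagged as an outlier
    return None  # no score exceeded the noisy threshold
```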

Cited by 1 publication (2 citation statements)
References 19 publications

Citation statements (ordered by relevance):
“…It provides each individual agent with the guarantee that the output of the considered query will not be significantly altered by whether or not they contribute their data, or what value they contribute. Differential privacy can be achieved through input perturbation, output perturbation [11], [17] or the sparse vector technique [18], [19]. In this paper, we consider input perturbation, which has the advantage that each individual agent can perturb its data before sending it to a data aggregator, thereby eliminating the need to trust the aggregator.…”
Section: Introduction (mentioning, confidence: 99%)
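
The quoted passage motivates input perturbation: each agent randomizes its own data locally, so the aggregator never has to be trusted with raw values. A minimal sketch of that architecture, assuming the classical Gaussian mechanism with the standard calibration sigma = sensitivity * sqrt(2 ln(1.25/delta)) / epsilon (valid for epsilon < 1); the paper's exact noise design is not reproduced, and all names and parameter values here are illustrative.

```python
import numpy as np

def perturb_locally(x, sensitivity, epsilon, delta):
    """Gaussian-mechanism input perturbation (illustrative sketch).

    Each agent adds noise to its own d-dimensional measurement x before
    transmission, giving (epsilon, delta)-differential privacy for its
    report; the aggregator only ever sees the noisy value.
    """
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return x + np.random.normal(0.0, sigma, size=np.shape(x))

# Hypothetical usage: ten agents perturb before sending; the aggregator
# works only with the noisy reports.
agents = [np.random.randn(3) for _ in range(10)]
reports = [perturb_locally(x, sensitivity=1.0, epsilon=0.5, delta=1e-5)
           for x in agents]
aggregate = np.mean(reports, axis=0)
```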
“…scalar quantities are considered, we consider multivariate signals provided by individual agents who may be correlated. Unlike [19] where the sparse vector technique is used, we consider scenarios in which individual agents do not necessarily trust the data aggregator, and therefore we design an input perturbation architecture to guarantee differential privacy for the agents' data. Using the squared Mahalanobis distance, we derive analytical formulas for the trade-offs between the accuracy of detection and privacy level.…”
Section: Introduction (mentioning, confidence: 99%)
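
The second statement names the squared Mahalanobis distance as the detection statistic. For a sample drawn from a d-dimensional Gaussian, that statistic is chi-square distributed with d degrees of freedom, which is one standard way to tie a detection threshold to a false-alarm rate; the sketch below shows this generic construction, not the paper's derivation. Note that with input perturbation the observed sample's covariance inflates (e.g., to Sigma + sigma^2 I for isotropic noise), which is the kind of effect the quoted accuracy-privacy trade-off quantifies.

```python
import numpy as np
from scipy.stats import chi2

def mahalanobis_sq(x, mean, cov):
    """Squared Mahalanobis distance of x from a Gaussian N(mean, cov)."""
    diff = x - mean
    return float(diff @ np.linalg.solve(cov, diff))

# For x ~ N(mean, cov) in d dimensions, the squared distance follows a
# chi-square law with d degrees of freedom, so a threshold can be read
# off a chi-square quantile (here a 1% false-alarm rate, illustrative).
d = 3
threshold = chi2.ppf(0.99, df=d)

mean, cov = np.zeros(d), np.eye(d)
x = np.array([3.0, -2.5, 1.0])          # hypothetical observed signal
is_outlier = mahalanobis_sq(x, mean, cov) > threshold
```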