# Privacy Against Brute-Force Inference Attacks

2019 IEEE International Symposium on Information Theory (ISIT)

**Abstract:** Privacy-preserving data release is about disclosing information about useful data while retaining the privacy of sensitive data. Assuming that the sensitive data is threatened by a brute-force adversary, we define Guessing Leakage as a measure of privacy, based on the concept of guessing. After investigating the properties of this measure, we derive the optimal utility-privacy trade-off via a linear program with any f-information adopted as the utility measure, and show that the optimal utility is a concave an…
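The abstract does not reproduce the paper's exact formula, but the measure it names builds on the standard guessing function: the expected number of guesses a brute-force adversary needs when guessing values in descending order of probability, with and without the released output Y. The sketch below only illustrates that underlying machinery; the function names and the `{y: {x: p(x, y)}}` joint-pmf encoding are illustrative assumptions, not the paper's notation.

```python
def expected_guesses(pmf):
    """Expected number of guesses under the optimal guessing strategy:
    try candidates in descending order of probability.  `pmf` is any
    iterable of probabilities summing to 1."""
    probs = sorted(pmf, reverse=True)
    return sum((i + 1) * p for i, p in enumerate(probs))

def conditional_expected_guesses(joint):
    """Expected guesses given the output Y, i.e. E_Y[G(X | Y = y)].
    `joint` maps each y to a dict {x: p(x, y)} (a hypothetical encoding
    chosen for this sketch)."""
    total = 0.0
    for y, row in joint.items():
        p_y = sum(row.values())                    # marginal p(y)
        cond = [p / p_y for p in row.values()]     # conditional pmf of X given y
        total += p_y * expected_guesses(cond)
    return total
```

For a uniform secret over four values, `expected_guesses` returns 2.5; if Y reveals the secret completely, the conditional count drops to 1 guess. The paper's guessing leakage compares such guess counts with and without Y, but its precise functional form is not stated in the abstract above.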

(13 citation statements, 42 reference statements)


“…1) Considering the case when the adversary is inferring an attribute U of X, we propose the non-stochastic brute-force guessing leakage as the ratio of the worst-case number of guesses for the adversary in the presence of the output Y and in the absence of it. This definition is consistent with the stochastic brute-force guessing leakage [17] with the exception of avoiding distributions or statistics.…”

confidence: 68%
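The non-stochastic variant quoted above compares worst-case guess counts using only supports, with no probabilities involved. A minimal sketch under that reading follows; the function name and the encoding of the joint support as a set of `(u, y)` pairs are assumptions, and the ratio is taken presence-over-absence as a literal reading of the quoted sentence (the cited definition may orient it, or apply a logarithm, differently).

```python
def nonstochastic_guessing_leakage(joint_support):
    """Worst-case guess-count ratio from a joint support (no statistics).
    `joint_support` is a set of (u, y) pairs that can occur together."""
    u_values = {u for u, _ in joint_support}
    y_values = {y for _, y in joint_support}
    # Absent Y: in the worst case the adversary must try every feasible u.
    guesses_without_y = len(u_values)
    # Present Y = y: only u values consistent with y need to be tried;
    # the worst case is taken over all observable y.
    guesses_with_y = max(
        len({u for u, yy in joint_support if yy == y}) for y in y_values
    )
    return guesses_with_y / guesses_without_y
```

For example, with support `{(0,'a'), (1,'b'), (2,'b')}` the adversary needs at most 3 guesses without Y but at most 2 with it, giving a ratio of 2/3.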


“…Finally, hypothesis testing has been used to propose more general definitions of privacy, such as identifiability [129, 130]. Non-binary hypothesis testing and guessing have also recently been used to define worst-case, robust measures of information leakage [81, 131, 132].…”

confidence: 99%

“…Privacy is an important concern for the adoption of many IoT services, and there is a growing demand from consumers to keep their personal information private. Privacy has been widely studied in the literature [1][2][3][4][5][6][7][8][9][10], and a vast number of privacy measures have been introduced, including differential privacy [1], mutual information (MI) [2][3][4][5][6][7][8], total variation distance [11], maximal leakage [12,13], and guessing leakage [14], to name a few.…”

confidence: 99%

“…The user's goal is to prevent the secret from being accurately detected by the adversary while the useful data is revealed to the adversary for utility. Unlike the existing works [2,3,11,12,14,15], which typically consider a one-shot data release problem, we consider a discrete-time system and assume that the user can choose from among a finite number of data release mechanisms at each time step. These might correspond to different types of sensor readings.…”

confidence: 99%