2015 IEEE 31st International Conference on Data Engineering
DOI: 10.1109/icde.2015.7113274
Privacy-aware dynamic feature selection

Abstract: Big data will enable the development of novel services that enhance a company's market advantage, competition, or productivity. At the same time, the utilization of such a service could disclose sensitive data in the process, which raises significant privacy concerns. To protect individuals, various policies, such as the Code of Fair Information Practices, as well as recent laws require organizations to capture only the minimal amount of data necessary to support a service. While this is a notable goal, choosi…


Cited by 13 publications (12 citation statements); References 34 publications.
“…Third, there are relatively recent works, which are precursors to our work, that propose the idea of privacy-aware feature selection [26][27][7][39][38][34]. Our work contrasts with these works in either one or both of the following ways.…”
Section: Related Work
confidence: 99%
“…Finally, there are methods that determine a feature order at test time, using the expected quality of the subsequent prediction to decide which feature to acquire next. For example, Pattuk et al [33] formulate a privacy-aware dynamic feature selection algorithm for classification that sequentially chooses features for a test instance, according to which will most increase the expected confidence of the next prediction, as long as including that feature does not violate a privacy constraint. This work is most responsive to the test time situation.…”
Section: Cost And # And Order At Test Time
confidence: 99%
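The acquisition loop described in the citation above — sequentially choosing the feature that most increases expected prediction confidence, subject to a privacy constraint — can be sketched as follows. This is a hypothetical illustration, not Pattuk et al.'s actual algorithm: the function names, the scalar per-feature privacy costs, and the pluggable `confidence_fn` are all our assumptions.

```python
def expected_confidence(acquired, instance, confidence_fn):
    """Classifier confidence given only the acquired subset of features."""
    return confidence_fn({f: instance[f] for f in acquired})

def greedy_acquire(instance, costs, budget, confidence_fn):
    """Greedily acquire features at test time.

    At each step, pick the unacquired feature whose inclusion yields the
    largest gain in expected confidence, skipping any feature whose
    privacy cost would push total spend past the budget.
    """
    acquired, spent = set(), 0.0
    remaining = set(instance)
    while remaining:
        base = expected_confidence(acquired, instance, confidence_fn)
        best, best_gain = None, 0.0
        for f in remaining:
            if spent + costs[f] > budget:  # privacy constraint
                continue
            gain = expected_confidence(acquired | {f}, instance, confidence_fn) - base
            if gain > best_gain:
                best, best_gain = f, gain
        if best is None:  # no affordable feature improves confidence
            break
        acquired.add(best)
        spent += costs[best]
        remaining.remove(best)
    return acquired
```

A toy additive confidence model makes the trade-off visible: with features `a` (gain 0.5, cost 2.0), `b` (0.3, 1.0), and `c` (0.4, 1.0) under a budget of 2.0, the greedy loop takes `a` first and then cannot afford anything else.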
“…The Utility Function used to determine the value of a feature also varies. Examples for feature "quality" include subsequent prediction accuracy (e.g., [21,41,49,51]) and subsequent prediction uncertainty (e.g., [33]). Finally, the Domain column shows where each relevant approach was applied.…”
Section: Cost And # And Order At Test Time
confidence: 99%
“…For the numerical attributes, we cut them into different categories and consider a binary attribute for each category. For instance, we partition the age value into five non-overlapping intervals: [0, 25], (25, 35], (35, 45], (45, 55], and (55, ∞], and then each of the five intervals becomes a binary attribute. Similarly, the education attribute is divided into 4 intervals and the hour/week attribute is divided into 5 intervals.…”
Section: Entity Disambiguation (ED)
confidence: 99%
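The binarization described in the quote above can be sketched as follows. The age cut points come from the quoted passage; the function name and the convention that only the first interval is closed on both ends are our assumptions.

```python
import math

# Cut points for age as given in the quoted passage:
# [0, 25], (25, 35], (35, 45], (45, 55], (55, ∞]
AGE_BINS = [(0, 25), (25, 35), (35, 45), (45, 55), (55, math.inf)]

def binarize(value, bins):
    """Map a numeric value to one 0/1 indicator per interval.

    Intervals are half-open (lo, hi], except the first, which is
    closed on both ends, so every non-negative value falls in
    exactly one bin.
    """
    out = []
    for i, (lo, hi) in enumerate(bins):
        if i == 0:
            out.append(1 if lo <= value <= hi else 0)
        else:
            out.append(1 if lo < value <= hi else 0)
    return out
```

With this convention, an age of 25 lands in the first bin while 30 lands in the second, and each numeric attribute becomes a small vector of binary attributes, as the quoted experiment requires.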