2022
DOI: 10.48550/arxiv.2210.02450
Preprint

Learning from aggregated data with a maximum entropy model

Abstract: Aggregating a dataset, then injecting some noise, is a simple and common way to release differentially private data. However, aggregated data, even without noise, is not an appropriate input for machine learning classifiers. In this work, we show how a new model, similar to a logistic regression, may be learned from aggregated data only, by approximating the unobserved feature distribution with a maximum entropy hypothesis. The resulting model is a Markov Random Field (MRF), and we detail how to apply, modify an…
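The abstract only outlines the approach, so the sketch below illustrates the general idea of maximum-entropy estimation from aggregated statistics rather than the paper's actual algorithm: it fits an exponential-family (maximum-entropy) joint distribution over binary features and a binary label using only first-order aggregate moments, and the resulting conditional p(y | x) has the form of a logistic regression, much like the MRF described above. The synthetic data, the chosen moment constraints, and all variable names are assumptions made purely for illustration.

import itertools
import numpy as np

rng = np.random.default_rng(0)

# Simulate individual-level data; only its aggregates are shown to the learner.
d = 3                                    # number of binary features (small enough to enumerate)
n = 5000
X = rng.integers(0, 2, size=(n, d))
true_w = np.array([1.5, -2.0, 0.5])
true_b = -0.3
p_y = 1.0 / (1.0 + np.exp(-(X @ true_w + true_b)))
y = rng.binomial(1, p_y)

# Aggregated statistics: the only input given to the estimator.
obs_moments = np.concatenate([
    X.mean(axis=0),                      # E[x_j]
    [y.mean()],                          # E[y]
    (X * y[:, None]).mean(axis=0),       # E[y * x_j]
])

# Maximum-entropy model p(x, y) proportional to exp(theta . phi(x, y)),
# with phi(x, y) = (x_1..x_d, y, y*x_1..y*x_d). Its conditional p(y | x)
# is a logistic regression with bias theta[d] and weights theta[d+1:].
states = np.array(list(itertools.product([0, 1], repeat=d + 1)))
Xs, ys = states[:, :d], states[:, d]
phi = np.hstack([Xs, ys[:, None], Xs * ys[:, None]])

def model_moments(theta):
    """Exact expected features under the max-ent distribution (by enumeration)."""
    logits = phi @ theta
    p = np.exp(logits - logits.max())
    p /= p.sum()
    return phi.T @ p

# Fit by moment matching: the gradient of the dual objective is
# (observed moments - model moments).
theta = np.zeros(2 * d + 1)
for _ in range(5000):
    theta += 0.5 * (obs_moments - model_moments(theta))

# The max-ent joint is only an approximation of the unobserved feature
# distribution, so the estimates track the true parameters approximately.
print("estimated bias   :", theta[d])
print("estimated weights:", theta[d + 1:])
print("true bias/weights:", true_b, true_w)

The feature dimension is kept tiny so that all 2^(d+1) joint states can be enumerated exactly; handling realistic dimensions and the differential-privacy aspects discussed in the abstract would require the machinery developed in the paper itself.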

Cited by 0 publications
References 15 publications