2020
DOI: 10.48550/arxiv.2001.00315
Preprint

Complexity and Efficient Algorithms for Data Inconsistency Evaluating and Repairing

Abstract: Evaluating and repairing data inconsistency are major concerns in data quality management. As a basic computing task, optimal subset repair is not only applied for cost estimation during the process of database repairing, but is also used directly to derive an evaluation of database inconsistency. Computing an optimal subset repair means finding a minimum set of tuples in an inconsistent database whose removal leaves a consistent subset. Tight bounds on the complexity and efficient algorithms are still unknown…
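The optimal subset repair described in the abstract can be illustrated with a minimal brute-force sketch. The relation, the single functional dependency zip → city, and all function names below are illustrative assumptions, not from the paper; the paper's contribution is the complexity analysis and efficient algorithms, whereas this exhaustive search is exponential and only clarifies the problem definition.

```python
from itertools import combinations

# Toy relation of (name, zip, city) tuples; assumed constraint: zip -> city.
tuples = [
    ("alice", "10001", "NYC"),
    ("bob",   "10001", "Boston"),    # conflicts with alice: same zip, different city
    ("carol", "02139", "Cambridge"),
]

def violations(db):
    """Pairs of tuples violating the functional dependency zip -> city."""
    return [(s, t) for i, s in enumerate(db) for t in db[i + 1:]
            if s[1] == t[1] and s[2] != t[2]]

def optimal_subset_repair(db):
    """Smallest tuple set whose removal leaves a consistent subset.

    Brute force over subset sizes, for illustration only; the paper
    studies the complexity of this problem and faster algorithms.
    """
    for k in range(len(db) + 1):
        for removed in combinations(db, k):
            rest = [t for t in db if t not in removed]
            if not violations(rest):
                return set(removed)

repair = optimal_subset_repair(tuples)
print(repair)  # removing one of the two conflicting tuples suffices
```

The size of this repair (here, one tuple out of three) is exactly the quantity the abstract says is used to evaluate how inconsistent the database is.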

Year Published: 2023
Cited by 1 publication (1 citation statement)
References 36 publications
“…In addition, data inconsistency is usually compounded by redundancy. For example, multiple tables have the same data but different inputs [153].…”
Section: Data Inconsistency
Confidence: 99%