2013
DOI: 10.5120/10175-5041
A Comparative Analysis of Data Cleaning Approaches to Dirty Data

Abstract: Data cleansing (or data scrubbing) is the process of detecting and correcting errors and inconsistencies in a data warehouse. Poor-quality data, i.e. dirty data, present in a data mart can thus be avoided using various data cleaning strategies, leading to more accurate and therefore more reliable decision making. Quality data can only be produced by cleaning and pre-processing the data prior to loading it into the data warehouse. As not all the algorithms address the problems related t…
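The abstract describes cleaning data before it is loaded into the warehouse. As a minimal illustrative sketch (not taken from the paper itself), two common steps are normalising inconsistent string values and removing duplicate records; the field names and records below are hypothetical:

```python
def normalize(record):
    """Trim whitespace and lower-case string fields for comparison."""
    return {k: v.strip().lower() if isinstance(v, str) else v
            for k, v in record.items()}

def deduplicate(records):
    """Keep the first occurrence of each record after normalisation."""
    seen, clean = set(), []
    for rec in records:
        key = tuple(sorted(normalize(rec).items()))
        if key not in seen:
            seen.add(key)
            clean.append(rec)
    return clean

dirty = [
    {"name": "Alice", "city": "Pune"},
    {"name": " alice ", "city": "PUNE"},  # duplicate after normalisation
    {"name": "Bob", "city": "Delhi"},
]
print(len(deduplicate(dirty)))  # 2
```

Real pipelines would typically use approximate (fuzzy) matching rather than exact keys, since dirty data rarely duplicates exactly.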

Cited by 2 publications (1 citation statement)
References 10 publications
“…Detecting anomalies of objects that differ from the main part of the data in the subject areas during data mining [5], there are several approaches to detecting outliers depending on the type of task: parsing, data transformation, methods of enforcing integrity constraints, duplicate detection methods, and others [6]. Almost fully similar works were overviewed by [7] in 2018.…”
Section: Introduction
Confidence: 99%
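The citation statement above lists outlier detection among the data cleaning approaches surveyed. As a hedged sketch of one standard technique (Tukey's interquartile-range fences, not a method attributed to the cited paper), with illustrative data and the conventional 1.5 multiplier:

```python
import statistics

def iqr_outliers(values, k=1.5):
    """Return values outside [Q1 - k*IQR, Q3 + k*IQR]."""
    q1, _, q3 = statistics.quantiles(values, n=4)
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [v for v in values if v < lo or v > hi]

data = [10, 12, 11, 13, 12, 11, 120]  # 120 is an injected outlier
print(iqr_outliers(data))  # [120]
```

Integrity-constraint enforcement and duplicate detection, the other approaches named in the quote, target structural rather than statistical anomalies and would use different machinery.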