2008 Advanced Software Engineering and Its Applications
DOI: 10.1109/asea.2008.46

An Efficient Method of Data Quality Evaluation Using Metadata Registry

Abstract: This paper proposes MDRDP (Metadata Registry based Data Profiling) to minimize the time and human resources needed to analyze and extract the metadata that serve as the standard criteria for data profiling. MDRDP is built on the MDR (Metadata Registry), an international standard for standardizing and managing metadata to support information sharing across various fields. With MDRDP, data quality can be evaluated against authorized metadata using a data profiling methodology. The MDR guarantees the quality of the metadata, so that the results …
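
To make the idea concrete, here is a minimal sketch of profiling a data column against metadata held in a registry. This is not the paper's implementation: the DataElement class, the profile function, and the quality score are hypothetical illustrations, loosely modeled on ISO/IEC 11179 Metadata Registry concepts.

```python
# Minimal sketch of metadata-driven data profiling (illustrative, not MDRDP itself).
# A registry entry standardizes a data element's type, length, and permissible
# values; profiling counts how many observed values violate that definition.

from dataclasses import dataclass
from typing import Optional

@dataclass
class DataElement:
    """Hypothetical MDR entry: the authorized definition of one data element."""
    name: str
    data_type: type
    max_length: Optional[int] = None
    permissible_values: Optional[frozenset] = None

def profile(column, element):
    """Return a quality score in [0, 1]: the fraction of values that
    satisfy the metadata registered for this element."""
    violations = 0
    for value in column:
        if not isinstance(value, element.data_type):
            violations += 1
        elif element.max_length is not None and len(str(value)) > element.max_length:
            violations += 1
        elif (element.permissible_values is not None
              and value not in element.permissible_values):
            violations += 1
    return 1.0 - violations / len(column) if column else 1.0

# Example: evaluate a 'country_code' column against its registry definition.
country_code = DataElement("country_code", str, max_length=2,
                           permissible_values=frozenset({"KR", "US", "DE"}))
print(profile(["KR", "US", "XX", 42], country_code))  # 0.5: two of four values violate
```

Because the criteria come from a shared registry rather than from a hand-written rule set per dataset, the same profiling routine can be reused wherever the registered data elements appear, which is the source of the time and labor savings the abstract claims.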

Cited by 4 publications (1 citation statement)
References 2 publications

“…This should have a high impact on the way applications are built, as developers are greatly interested in programming systems that are reliable, even in the presence of poor quality data. The problem is that although the analysis and improvement of data quality have gathered plenty of attention (e.g., to carry out data cleaning operations) from practitioners and researchers [6, 24-28], and despite the well-known impact of poor quality data in critical data-centric systems [29], understanding how well an application is prepared to handle the inevitable appearance of poor data has been largely overlooked. For this purpose, the identification of representative data quality problems and how they should be integrated in software verification activities (e.g., software testing) is essential.…”
Section: Background and Related Work (citation type: mentioning)
Confidence: 99%
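
The quoted passage argues for folding representative data quality problems into software testing. As a hypothetical illustration of that idea (not taken from the cited work), the sketch below injects missing, malformed, and out-of-range values into a toy input-parsing function; parse_age and the chosen bad values are assumptions for the example.

```python
# Hypothetical sketch: a unit test that injects representative poor-quality
# data to check how robustly application code handles it.

import unittest

def parse_age(raw):
    """Toy application code under test: convert a raw field to a valid age."""
    age = int(raw)  # raises ValueError/TypeError on missing or malformed input
    if not 0 <= age <= 150:
        raise ValueError(f"age out of range: {age}")
    return age

class PoorDataRobustnessTest(unittest.TestCase):
    def test_rejects_representative_bad_values(self):
        # Representative problems: missing value, empty string,
        # non-numeric text, negative number, implausible magnitude.
        for bad in (None, "", "abc", "-3", "999"):
            with self.assertRaises((ValueError, TypeError)):
                parse_age(bad)

if __name__ == "__main__":
    unittest.main()
```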