“…It involves defining data standards and implementing data quality controls, including data collection, cleansing, integration and maintenance (Anil and Satish, 2019). Organizations need to designate data stewards who are responsible for ensuring data quality, resolving data-related issues and promoting data governance practices within the organization (Caballero et al., 2022).…”
Section: The Strategic Imperatives of Data Assetization (RQ3)
“…According to Caballero et al. (2022), organizations have to implement effective data quality management measures to ensure data reliability and trustworthiness. It involves defining data standards and implementing data quality controls, including data collection, cleansing, integration and maintenance (Anil and Satish, 2019).…”
Section: The Preparation of the Resource
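The snippets above describe data quality controls only in the abstract (defined standards plus automated checks, overseen by a data steward). A minimal sketch of what such controls might look like in practice is given below; the record shape, field names, and rules are hypothetical illustrations, not drawn from the cited papers.

```python
# Illustrative sketch of basic data quality controls: completeness and
# validity checks applied against defined data standards. All field names
# and rules here are assumptions for demonstration purposes.
from dataclasses import dataclass


@dataclass
class CustomerRecord:
    customer_id: str
    email: str
    country: str  # assumed standard: ISO 3166-1 alpha-2 code


def check_completeness(record: CustomerRecord) -> list:
    """Flag empty mandatory fields (a basic completeness control)."""
    return [f for f in ("customer_id", "email", "country")
            if not getattr(record, f).strip()]


def check_validity(record: CustomerRecord) -> list:
    """Apply simple format rules standing in for defined data standards."""
    issues = []
    if "@" not in record.email:
        issues.append("email: missing '@'")
    if len(record.country) != 2 or not record.country.isupper():
        issues.append("country: expected ISO 3166-1 alpha-2 code")
    return issues


def audit(records: list) -> dict:
    """A data steward's view: map record index -> list of detected issues."""
    report = {}
    for i, rec in enumerate(records):
        issues = check_completeness(rec) + check_validity(rec)
        if issues:
            report[i] = issues
    return report
```

In a real deployment these rules would be derived from the organization's documented data standards rather than hard-coded, and the audit report would feed the issue-resolution workflow owned by the data steward.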
Purpose
The purpose of this paper is to explore the concept of data assets and how companies can assetize their data. Using the literature review methodology, the paper first summarizes the conceptual controversies over data assets in the existing literature. Subsequently, the paper defines the concept of data assets. Finally, keywords from the existing research literature are presented visually and a foundational framework for achieving data assetization is proposed.
Design/methodology/approach
This paper uses a systematic literature review approach to discuss the conceptual evolution and strategic imperatives of data assets. To establish a robust research methodology, this paper takes into account two main aspects. First, it conducts a comprehensive review of the existing literature on digital technology and data assets, which enables the derivation of an evolutionary path of data assets and the development of a clear and concise definition of the concept. Second, the paper uses CiteSpace, a software tool widely used for literature reviews, to examine the research framework of enterprise data assetization.
Findings
The paper offers pivotal insights into the realm of data assets. It highlights the changing perceptions of data assets with digital progression and addresses debates on data asset categorization, value attributes and ownership. The study introduces a definitive concept of data assets as electronically recorded data resources with real or potential value under legal parameters. Moreover, it delineates strategic imperatives for harnessing data assets, presenting a practical framework that charts the stages of “resource readiness, capacity building, and data application”, guiding businesses in optimizing their data throughout its lifecycle.
Originality/value
This paper comprehensively explores the issue of data assets, clarifying contested concepts and categorizations and addressing gaps in the existing literature. The paper introduces a clear conceptualization of data assets, bridging the gap between academia and practice. In addition, the study proposes a strategic framework for data assetization. This study not only helps to promote a unified understanding among academics and professionals but also helps businesses to understand the process of data assetization.
“…Evaluating data quality properties requires identifying, validating, and grouping the business rules (Caballero et al., 2022). For the interest of this investigation, we considered parts 100 to 150 of the ISO 8000-100 series as the primary source of business rules.…”
Section: General Overview of ISO 8000-100 Series
“…Following this process, business rules were inferred (see Table 5). These business rules have been grouped for the various characteristics and properties (see Table 6) following the BR4DQ methodology (Caballero et al., 2022).…”
Section: Business Rules for Master Data Inferred from ISO 8000-100 Se...
Master data has emerged as one of the most potent instruments for guaranteeing adequate levels of data quality. The main contribution of this paper is a data quality model to guide repeatable and homogeneous evaluations of the level of data quality of master data repositories. This data quality model follows several international open standards: ISO/IEC 25012, ISO/IEC 25024, and ISO 8000-1000, enabling compliance certification. A case study applying the data quality model to an organizational master data repository has been carried out to demonstrate its applicability.
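The snippets above describe grouping inferred business rules by data quality characteristic and property (Tables 5 and 6 of the cited work, following BR4DQ). A minimal sketch of that grouping step is shown below; the rule identifiers, texts, and characteristic/property labels are invented for illustration and are not taken from the ISO 8000-100 series or from Caballero et al. (2022).

```python
# Hypothetical sketch: grouping business rules first by data quality
# characteristic, then by property, in the spirit of the BR4DQ step
# described above. All rule content here is illustrative.
from collections import defaultdict

# Each rule: (rule_id, characteristic, property, statement)
business_rules = [
    ("BR-01", "Completeness", "record completeness",
     "Every master data record shall carry a non-empty identifier."),
    ("BR-02", "Accuracy", "syntactic accuracy",
     "Postal codes shall match the format of the record's country."),
    ("BR-03", "Completeness", "value completeness",
     "Mandatory attributes shall not be null."),
]


def group_rules(rules):
    """Build a nested mapping: characteristic -> property -> [(id, text)]."""
    grouped = defaultdict(lambda: defaultdict(list))
    for rule_id, characteristic, prop, statement in rules:
        grouped[characteristic][prop].append((rule_id, statement))
    return grouped
```

The nested grouping mirrors the two-level structure (characteristics subdivided into measurable properties) that the quoted snippet attributes to the resulting tables.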
“…Loshin et al. [24] asserted that, when resource requirements surpass the capabilities of the current data technology environment, big data is subject to cost-effective solutions to handle current and future business difficulties. Moreover, big data is a comprehensive term that includes the dataset itself, along with technical, medical, commercial, and spatial value issues [23, 25, 26]. Big data is useful primarily because it allows significant business value to be extracted from data.…”
For decision-making support and evidence-based healthcare, high-quality data are crucial, particularly where the relevant knowledge is lacking. For public health practitioners and researchers, the reporting of COVID-19 data needs to be accurate and easily available. Each nation has a system in place for reporting COVID-19 data, although these systems’ efficacy has not been thoroughly evaluated. However, the current COVID-19 pandemic has shown widespread flaws in data quality. We propose a data quality model (canonical data model, four adequacy levels, and Benford’s law) to assess the quality of COVID-19 data reporting carried out by the World Health Organization (WHO) in the six Central African Economic and Monetary Community (CEMAC) region countries between March 6, 2020, and June 22, 2022, and suggest potential solutions. These levels of data quality sufficiency can be interpreted as indicators of dependability and of the sufficiency of big dataset inspection. This model effectively identified the quality of the entry data for big dataset analytics. The future development of this model requires scholars and institutions from all sectors to deepen their understanding of its core concepts, improve integration with other data processing technologies, and broaden the scope of its applications.
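The Benford's-law component of such a model can be sketched concretely: compare the first-digit distribution of reported daily counts against Benford's expected frequencies. The chi-square statistic and the idea of flagging large deviations are standard Benford-analysis practice; the cited model's exact procedure, thresholds, and adequacy levels may differ.

```python
# Minimal sketch of a Benford's-law first-digit check on reported counts.
# Benford's law: P(first digit = d) = log10(1 + 1/d) for d in 1..9.
import math

BENFORD = {d: math.log10(1 + 1 / d) for d in range(1, 10)}


def first_digit(n) -> int:
    """Leading digit of a positive count."""
    return int(str(abs(int(n)))[0])


def benford_chi_square(counts) -> float:
    """Chi-square distance between observed first-digit frequencies and
    Benford's expected frequencies; larger values suggest lower-quality
    (or manipulated) reporting. Zero counts are excluded."""
    observed = {d: 0 for d in range(1, 10)}
    values = [c for c in counts if c >= 1]
    for c in values:
        observed[first_digit(c)] += 1
    n = len(values)
    return sum((observed[d] - n * BENFORD[d]) ** 2 / (n * BENFORD[d])
               for d in range(1, 10))
```

In practice the statistic would be compared against a chi-square critical value (8 degrees of freedom) over a sufficiently long reporting window, since first-digit tests are unreliable on small samples.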