2020
DOI: 10.1016/j.patter.2020.100016

The End-to-End Provenance Project

Abstract: Data provenance is a machine-readable summary of the collection and computational history of a dataset. Data provenance adds value to a dataset, helps reproduce computational analyses, and helps validate scientific conclusions. The End-to-End Provenance Project is a community of professionals who have developed software tools to collect and use data provenance.
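As an illustration of what a "machine-readable summary" of a dataset's computational history can look like, the sketch below builds a tiny provenance record with the W3C PROV data model using the Python prov package. This is an illustration only, not the project's own tooling (which centers on R packages such as rdt), and the entity, activity, and agent names are hypothetical.

from prov.model import ProvDocument

# Build a small provenance document using the W3C PROV data model.
doc = ProvDocument()
doc.add_namespace('ex', 'http://example.org/')

# Hypothetical entities, activity, and agent for one analysis step.
raw = doc.entity('ex:raw-data')
clean = doc.entity('ex:cleaned-data')
run = doc.activity('ex:cleaning-script-run')
analyst = doc.agent('ex:analyst')

# Record how the cleaned dataset came to be.
doc.used(run, raw)
doc.wasGeneratedBy(clean, run)
doc.wasAssociatedWith(run, analyst)
doc.wasDerivedFrom(clean, raw)

# Serialize to PROV-JSON, a machine-readable interchange format.
print(doc.serialize())

Provenance-collection tools of the kind the project describes produce records of this general sort automatically as a script executes, rather than requiring the analyst to construct them by hand.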

Cited by 4 publications (4 citation statements)
References 7 publications (8 reference statements)
“…Recent end-to-end provenance projects have developed a set of tools, such as R packages, that allow organizations to establish data provenance through enhanced data traceability [24].…”
Section: Demanding Data Traceability (mentioning)
confidence: 99%
“…Language-specific solutions, such as noWorkflow [57] for Python scripts or rdt for R scripts, among others, have been developed and offer richer support such as visualisation [20]. PROV-O serves as a fundamental building block for capturing provenance-related data.…”
Section: F.J. Ekaputra et al. / Semantic-Enabled Architecture for Auditable Privacy-Preserving Data Analysis (mentioning)
confidence: 99%
“…Works focusing on provenance per se, such as (Alterovitz et al., 2018; Ellison et al., 2020), and the various workflow provenance systems, such as (Khan et al., 2019; Papadimitriou et al., 2021; Yakutovich et al., 2021), are primarily concerned with very detailed documentation of each computation on one or more datasets. The W3C PROV model (Gil et al., 2013; Lebo et al., 2013; Moreau et al., 2013) was developed initially to support interoperability across the transformation logs of workflow systems.…”
Section: Related Work (mentioning)
confidence: 99%