2017
DOI: 10.1177/1094342017704893
The future of scientific workflows

Abstract: Today's computational, experimental, and observational sciences rely on computations that involve many related tasks. The success of a scientific mission often hinges on the computer automation of these workflows. In April 2015, the US Department of Energy (DOE) invited a diverse group of domain and computer scientists from national laboratories supported by the Office of Science, the National Nuclear Security Administration, from industry, and from academia to review the workflow requirements of DOE's science…

Cited by 133 publications (103 citation statements)
References 80 publications (92 reference statements)
“…Different techniques have been proposed and implemented to improve the performance of systems that deal with the Big Data challenge [8,28,44]. For example, besides the transfer learning used in the present paper, recent studies propose other promising learning methods, such as representation learning, deep learning, distributed and parallel learning, active learning, and kernel-based learning, to deal with Big Data [28].…”
Section: Discussion
confidence: 99%
“…For example, besides the transfer learning used in the present paper, recent studies propose other promising learning methods, such as representation learning, deep learning, distributed and parallel learning, active learning, and kernel-based learning, to deal with Big Data [28]. An example of dealing with big data in workflow management systems is [44], which scales workflows to High Performance Computing (HPC) supercomputing systems. Another solution lies in workflow representations: workflows can be encoded for indexing, which further reduces processing time and scales similarity search to the sizes of current repositories [17].…”
Section: Discussion
confidence: 99%
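The workflow-encoding idea in the excerpt above can be made concrete with a minimal sketch, assuming a workflow is reduced to the multiset of its task types and compared by cosine similarity. The vocabulary, encode, and cosine names below, and the toy repository, are illustrative assumptions, not the encoding actually used in [17].

```python
from collections import Counter
from math import sqrt

# Hypothetical illustration: a workflow is reduced to the multiset of its
# task types, so two workflows can be compared without walking their graphs.

def encode(task_types, vocabulary):
    """Encode a workflow (list of task-type labels) as a count vector."""
    counts = Counter(task_types)
    return [counts.get(t, 0) for t in vocabulary]

def cosine(u, v):
    """Cosine similarity between two count vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

# Toy repository of workflows described by their task types (assumed labels).
vocabulary = ["stage_in", "simulate", "analyze", "visualize", "stage_out"]
repo = {
    "climate_run": ["stage_in", "simulate", "simulate", "analyze", "stage_out"],
    "imaging":     ["stage_in", "analyze", "visualize", "stage_out"],
}
query = ["stage_in", "simulate", "analyze", "stage_out"]

q_vec = encode(query, vocabulary)
ranked = sorted(repo,
                key=lambda name: cosine(q_vec, encode(repo[name], vocabulary)),
                reverse=True)
print(ranked)  # most similar workflow first
```

Such a fixed-length encoding is what makes indexing pay off: the repository can be searched with vector operations instead of graph comparisons.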
“…More recently, Mattoso et al. [17] surveyed the use of steering in the context of HPC scientific workflows, highlighting a tighter integration between the user and the underlying workflow execution system. Workflow management systems [18], [19], [20], [21] are also relevant for helping users with complex scientific experiments. Deelman et al. [19] developed a taxonomy of e-Science systems so that scientists can assess the suitability of workflow systems for their experiments.…”
Section: Related Work
confidence: 99%
“…Middleware like CORBA can also benefit from workflows for doing its work [19]. For workflow management, however, it is necessary to consider that the volume of data has become large, so the management system must be able to handle Big Data well [4], [6]. On the one hand, such a system should be responsive to the large volume of data; on the other hand, it should have low cost.…”
Section: Workflow Management System
confidence: 99%
“…On the other hand, if the process is automated by workflows, data sources are selected during the workflow run and the parameters are set by the user. Workflows may use resource scheduling on High Performance Computing (HPC) systems [4] (for example, on local computer clusters) or time scheduling that includes remote resources (Grid computing [5] or Cloud [6]). Data may also be staged, i.e., held at certain locations until it is consumed, while computational tasks run on HPC clusters.…”
Section: Introduction
confidence: 99%
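As a rough illustration of the workflow structure this last excerpt describes (runtime-selected data sources, user-set parameters, dependency-ordered tasks), the sketch below models a workflow as a DAG executed in topological order with Python's standard graphlib. The stage_in, simulate, and analyze tasks are hypothetical placeholders for what a real workflow management system would dispatch to HPC, Grid, or Cloud resources.

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Hypothetical sketch of a workflow as a DAG of tasks. Each task is just a
# Python callable here; a real system would schedule it on remote resources.

def stage_in(params):
    # Data source selected at run time from user-set parameters.
    return {"data": f"records from {params['source']}"}

def simulate(inputs):
    return {"result": f"simulated({inputs['data']})"}

def analyze(inputs):
    return {"summary": f"analysis of {inputs['result']}"}

# DAG: analyze depends on simulate, which depends on stage_in.
dependencies = {"simulate": {"stage_in"}, "analyze": {"simulate"}}
tasks = {"stage_in": stage_in, "simulate": simulate, "analyze": analyze}

def run(params):
    """Execute tasks in dependency order, passing outputs downstream."""
    outputs = {}
    for name in TopologicalSorter(dependencies).static_order():
        upstream = dependencies.get(name, set())
        # Source tasks read the user parameters; other tasks read the merged
        # outputs of their upstream tasks (the "staged" data of the excerpt).
        inputs = params if not upstream else {
            k: v for dep in upstream for k, v in outputs[dep].items()
        }
        outputs[name] = tasks[name](inputs)
    return outputs["analyze"]

print(run({"source": "local_cluster"}))
```

The same structure works whether the tasks run locally or are submitted to a cluster scheduler; only the body of run() would change.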