2020
DOI: 10.48550/arxiv.2005.01194
Preprint

An empirical comparison of deep-neural-network architectures for next activity prediction using context-enriched process event logs

Abstract: Researchers have proposed a variety of predictive business process monitoring (PBPM) techniques aiming to predict future process behaviour during process execution. In particular, techniques for next activity prediction promise great potential for improving operational business processes. To gain more accurate predictions, a plethora of these techniques rely on deep neural networks (DNNs) and consider information about the context in which the process is running. However, an in-depth comparison of such…

Cited by 5 publications (6 citation statements)
References 43 publications
“…Böhmer et al. [3] proposed combining local and global techniques using sequential prediction rules for next event prediction. For further relevant work on PBPM, interested readers may refer to the survey studies [28] and [39].…”
Section: Predictive Process Monitoring (mentioning)
confidence: 99%
“…Most of the DNN architectures proposed for next activity and timestamp prediction in PBPM [17] use "vanilla" LSTM cells [9]. LSTMs belong to the class of recurrent neural networks [11] and are designed to handle temporal dependencies in sequential prediction problems [3].…”
Section: Long Short-Term Memory Cells (mentioning)
confidence: 99%
“…To predict next activities, we use a "vanilla", i.e. basic, long short-term memory network (LSTM) [7], because most PBPM techniques for predicting next activities rely on this DNN architecture [24]. LSTMs belong to the class of recurrent neural networks (RNNs) [8] and are designed to handle temporal dependencies in sequential prediction problems [1].…”
Section: Long Short-Term Memory Neural Network (mentioning)
confidence: 99%
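The two statements above describe the same standard set-up: a recurrent network over integer-encoded activity prefixes, trained to classify the next activity. The following is a minimal, hypothetical sketch of such a vanilla LSTM classifier in PyTorch; the layer sizes, the embedding-based input encoding, and the class name are illustrative assumptions, not the architecture of any cited paper.

```python
# Hypothetical sketch: a "vanilla" LSTM that classifies the next activity
# from a prefix of integer-encoded activity labels. Dimensions and the
# input encoding are illustrative assumptions, not a cited paper's setup.
import torch
import torch.nn as nn

class NextActivityLSTM(nn.Module):
    def __init__(self, num_activities: int, embed_dim: int = 32, hidden_dim: int = 64):
        super().__init__()
        # Index 0 is reserved for padding shorter prefixes.
        self.embedding = nn.Embedding(num_activities + 1, embed_dim, padding_idx=0)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, num_activities + 1)

    def forward(self, prefixes: torch.Tensor) -> torch.Tensor:
        # prefixes: (batch, prefix_length) integer-encoded activity prefixes
        x = self.embedding(prefixes)
        _, (h_n, _) = self.lstm(x)   # final hidden state summarises the prefix
        return self.out(h_n[-1])     # logits over possible next activities

# Toy usage: 8 distinct activities, a batch of two padded prefixes.
model = NextActivityLSTM(num_activities=8)
prefixes = torch.tensor([[1, 4, 2, 0], [3, 3, 5, 6]])
logits = model(prefixes)             # shape: (2, 9)
predicted_next = logits.argmax(dim=-1)
```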
“…Many of these techniques are geared towards the next activity prediction task. For that, most recent techniques rely on LSTMs [24], such as Weinzierl et al. [25]. To predict more than just the next activities with a single predictive model, Tax et al. [20] suggest a multi-task LSTM-based DNN architecture.…”
Section: Related Work (mentioning)
confidence: 99%
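The quote above mentions a multi-task LSTM-based architecture that predicts more than the next activity with a single model. As a rough illustration of that idea (not the exact architecture of Tax et al. [20]), the hypothetical sketch below shares one LSTM encoder between two heads: a classifier for the next activity and a regressor for the time until the next event. All names, sizes, and the loss combination are assumptions.

```python
# Hypothetical multi-task sketch: one shared LSTM encoder, two heads
# (next-activity classification and time-until-next-event regression).
# Layer sizes, names, and the loss weighting are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiTaskLSTM(nn.Module):
    def __init__(self, num_activities: int, embed_dim: int = 32, hidden_dim: int = 64):
        super().__init__()
        self.embedding = nn.Embedding(num_activities + 1, embed_dim, padding_idx=0)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.activity_head = nn.Linear(hidden_dim, num_activities + 1)  # classification
        self.time_head = nn.Linear(hidden_dim, 1)                       # regression

    def forward(self, prefixes: torch.Tensor):
        x = self.embedding(prefixes)
        _, (h_n, _) = self.lstm(x)
        h = h_n[-1]
        return self.activity_head(h), self.time_head(h).squeeze(-1)

# Joint training combines both objectives, e.g. cross-entropy plus MAE.
model = MultiTaskLSTM(num_activities=8)
prefixes = torch.tensor([[1, 4, 2], [3, 3, 5]])
next_activity = torch.tensor([5, 6])
time_to_next = torch.tensor([0.5, 2.0])
act_logits, time_pred = model(prefixes)
loss = F.cross_entropy(act_logits, next_activity) + F.l1_loss(time_pred, time_to_next)
loss.backward()
```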