2020
DOI: 10.1007/978-981-15-3380-8_4

Exploiting CBOW and LSTM Models to Generate Trace Representation for Process Mining

Cited by 3 publications (3 citation statements)
References 7 publications
“…This vast research area, called predictive business process monitoring, has attracted several literature reviews (e.g., Neu et al (2021); Harane and Rathi (2020)). ML can also be used to optimise existing processes (Fernandes et al (2019)) or to obtain a compact representation of traces (Bui et al (2019, 2020)). Recently, there has been interest in the interpretability of RNN models, specifically in a process mining context (Hanga et al (2020)).…”
Section: Machine Learning for Process Monitoring and Mining (mentioning)
confidence: 99%
“…De Koninck et al [4] transferred the idea of Word2vec [8] and Doc2vec [7] to process data. An LSTM and CBOW-based approach was introduced by Bui et al [2]. A supervised representation learning approach based on conditional random fields for event abstraction was introduced by Tax et al [14].…”
Section: Related Work (mentioning)
confidence: 99%
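The CBOW-based trace representation mentioned in this statement can be illustrated with a short, hedged sketch. The example below is not the code of De Koninck et al or Bui et al: it simply treats each trace as a "sentence" of activity labels, trains a gensim Word2Vec model in CBOW mode to learn activity embeddings, and averages the activity vectors to obtain one vector per trace. The toy event log and all parameter values are assumptions.

```python
# Minimal sketch: CBOW activity embeddings and averaged trace vectors.
# Hypothetical toy log and parameters; not the authors' implementation.
import numpy as np
from gensim.models import Word2Vec

# Toy event log: each trace is a sequence of activity labels.
traces = [
    ["register", "check_credit", "approve", "notify"],
    ["register", "check_credit", "reject", "notify"],
    ["register", "approve", "notify"],
]

# CBOW model over activities (sg=0 selects CBOW rather than skip-gram).
model = Word2Vec(
    sentences=traces,
    vector_size=16,  # embedding dimension (assumed value)
    window=2,        # activities before/after the current activity
    min_count=1,
    sg=0,            # CBOW
    epochs=50,
)

def trace_vector(trace):
    """Represent a trace as the mean of its activity embeddings."""
    return np.mean([model.wv[a] for a in trace], axis=0)

print(trace_vector(traces[0]).shape)  # -> (16,)
```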
“…In our parameter optimization strategy, we first optimize the vector size. We vary the vector size of the hidden and the embedding layer (2, 3, 4, 8, 16, 32, 64, 128, 256) and the number of epochs (10, 25, 50). Next, we optimize the window size of the embedding, which determines how many activities before and after the current activity are considered.…”
Section: Real-life Event Logs: Trace Clustering (mentioning)
confidence: 99%
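To make the quoted two-stage search concrete, here is a hedged sketch: vector size and number of epochs are varied first, then the embedding window. The toy log, the candidate window sizes, and the use of KMeans plus silhouette score as the clustering-quality criterion are all assumptions introduced to keep the example self-contained; they are not the citing paper's exact setup.

```python
# Hedged sketch of a two-stage parameter grid search for trace embeddings.
# Toy data and the clustering criterion are assumptions, not the paper's setup.
import itertools
import numpy as np
from gensim.models import Word2Vec
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

traces = [  # hypothetical event log: traces as activity-label sequences
    ["register", "check_credit", "approve", "notify"],
    ["register", "check_credit", "reject", "notify"],
    ["register", "approve", "notify"],
    ["register", "check_credit", "check_credit", "reject"],
]

def embed(vector_size, window, epochs):
    """Train a CBOW model and return one averaged vector per trace."""
    model = Word2Vec(traces, vector_size=vector_size, window=window,
                     min_count=1, sg=0, epochs=epochs)
    return np.array([np.mean([model.wv[a] for a in t], axis=0) for t in traces])

def clustering_score(vectors):
    """Assumed quality criterion: silhouette score of a 2-cluster KMeans."""
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)
    return silhouette_score(vectors, labels)

# Stage 1: optimize vector size and number of epochs (window fixed at 2).
best_size, best_epochs, best = None, None, -np.inf
for size, epochs in itertools.product(
        [2, 3, 4, 8, 16, 32, 64, 128, 256], [10, 25, 50]):
    score = clustering_score(embed(size, window=2, epochs=epochs))
    if score > best:
        best_size, best_epochs, best = size, epochs, score

# Stage 2: with the best size/epochs fixed, optimize the window size.
best_window = max([1, 2, 3],  # candidate windows are an assumption
                  key=lambda w: clustering_score(embed(best_size, w, best_epochs)))

print(best_size, best_epochs, best_window)
```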