Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining 2022
DOI: 10.1145/3534678.3539138
Seq2Event: Learning the Language of Soccer Using Transformer-based Match Event Prediction

Abstract: Soccer is a sport characterised by open and dynamic play, with player actions and roles aligned according to team strategies simultaneously and at multiple temporal scales with high spatial freedom. This complexity presents an analytics challenge, which to date has largely been solved by decomposing the game according to specific criteria to analyse specific problems. We propose a more holistic approach, utilising Transformer or RNN components in the novel Seq2Event model, in which the next match event is pred…
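As a rough illustration of the next-event-prediction framing described in the abstract, the sketch below treats integer-coded match events as tokens and predicts the next event type with a GRU. The model, vocabulary size, and dimensions are hypothetical assumptions for illustration, not the authors' Seq2Event implementation.

```python
# Hypothetical sketch: soccer events as a token sequence, next event predicted by an RNN.
import torch
import torch.nn as nn

class NextEventRNN(nn.Module):
    def __init__(self, n_event_types: int, emb_dim: int = 64, hidden_dim: int = 128):
        super().__init__()
        self.embed = nn.Embedding(n_event_types, emb_dim)   # event type -> vector
        self.rnn = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, n_event_types)    # logits over next event type

    def forward(self, event_ids: torch.Tensor) -> torch.Tensor:
        # event_ids: (batch, seq_len) integer-coded match events
        x = self.embed(event_ids)
        _, h = self.rnn(x)              # h: (1, batch, hidden_dim), last hidden state
        return self.head(h.squeeze(0))  # (batch, n_event_types)

model = NextEventRNN(n_event_types=33)                 # 33 event types is an assumed value
logits = model(torch.randint(0, 33, (2, 40)))          # two sequences of 40 events
print(logits.shape)                                    # torch.Size([2, 33])
```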

Cited by 11 publications (23 citation statements) | References 16 publications
“…The feasible solution from the NTPP framework [23] is to encode the history of events (y_1, y_2, ..., y_{i−1}), where y_i = [t_i, m_i, z_i], into a fixed-size vector h_i with an LSTM [10], GRU [3], or Transformer encoder [31]. A previous study [25] found the Transformer encoder to be slightly less effective, but significantly more efficient, than the LSTM. Therefore, in this study, we applied the Transformer encoder to encode the event history.…”
Section: Define Football Event Data as NMSTPP (mentioning)
confidence: 99%
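The statement above describes compressing the event history (y_1, ..., y_{i−1}), with y_i = [t_i, m_i, z_i], into a fixed-size vector h_i using a Transformer encoder. A minimal PyTorch sketch of that idea follows; the embedding sizes, layer counts, and mean-pooling choice are assumptions for illustration and not the NMSTPP implementation.

```python
# Hypothetical sketch: embed (time, action, zone) per event and pool a Transformer
# encoder's output into one fixed-size history vector h_i.
import torch
import torch.nn as nn

class HistoryEncoder(nn.Module):
    def __init__(self, n_actions: int, n_zones: int, d_model: int = 64):
        super().__init__()
        self.time_proj = nn.Linear(1, d_model)            # continuous inter-event time
        self.action_emb = nn.Embedding(n_actions, d_model)
        self.zone_emb = nn.Embedding(n_zones, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, t, m, z):
        # t: (batch, seq, 1) float; m, z: (batch, seq) long
        x = self.time_proj(t) + self.action_emb(m) + self.zone_emb(z)
        out = self.encoder(x)          # (batch, seq, d_model)
        return out.mean(dim=1)         # pooled fixed-size history vector h_i

enc = HistoryEncoder(n_actions=5, n_zones=20)
h = enc(torch.rand(8, 40, 1), torch.randint(0, 5, (8, 40)), torch.randint(0, 20, (8, 40)))
print(h.shape)  # torch.Size([8, 64])
```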
“…Stage 2: History encoding. In this stage, a dense layer is first applied to the inter-event time t_i and other continuous features, and an embedding layer is applied to the zone z_i and the action m_i respectively, allowing the model to better capture information in the features [25]. Afterward, with the positional encoding and Transformer encoder from the Transformer model [31] (more details in Appendix D), a fixed-size encoded history vector with size (31) can be retrieved.…”
Section: NMSTPP Model Architecture (mentioning)
confidence: 99%
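The second statement mentions the positional encoding and Transformer encoder of [31] applied after the dense and embedding layers. Below is a hedged sketch of the standard sinusoidal positional encoding from the original Transformer paper; the module name, maximum length, and dimensions are illustrative assumptions rather than the cited paper's code.

```python
# Sketch of sinusoidal positional encoding added to embedded event features
# before they enter the Transformer encoder (assumed sizes).
import math
import torch
import torch.nn as nn

class PositionalEncoding(nn.Module):
    def __init__(self, d_model: int, max_len: int = 500):
        super().__init__()
        pe = torch.zeros(max_len, d_model)
        position = torch.arange(max_len).unsqueeze(1).float()
        div_term = torch.exp(torch.arange(0, d_model, 2).float() * (-math.log(10000.0) / d_model))
        pe[:, 0::2] = torch.sin(position * div_term)
        pe[:, 1::2] = torch.cos(position * div_term)
        self.register_buffer("pe", pe)   # (max_len, d_model); fixed, not learned

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model); add the encoding for each position
        return x + self.pe[: x.size(1)]

x = torch.rand(8, 40, 64)
print(PositionalEncoding(64)(x).shape)  # torch.Size([8, 40, 64])
```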