2022
DOI: 10.1140/epjc/s10052-022-10469-9

Leveraging universality of jet taggers through transfer learning

Abstract: A significant challenge in the tagging of boosted objects via machine-learning technology is the prohibitive computational cost associated with training sophisticated models. Nevertheless, the universality of QCD suggests that a large amount of the information learnt in the training is common to different physical signals and experimental setups. In this article, we explore the use of transfer learning techniques to develop fast and data-efficient jet taggers that leverage such universality. We consider the gr…
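The abstract's core idea, reusing what a tagger learned on abundant QCD-rich data when training for a new signal, can be illustrated with a minimal transfer-learning sketch. Everything below is an assumption for illustration only: a plain PyTorch MLP stands in for whatever architecture the paper actually uses, and the checkpoint path is hypothetical.

```python
import torch
import torch.nn as nn

# Hypothetical pretrained backbone: any feature extractor trained on a
# large QCD-rich dataset. A plain MLP keeps the sketch self-contained;
# it is not the paper's actual model.
class Backbone(nn.Module):
    def __init__(self, in_dim=64, feat_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 256), nn.ReLU(),
            nn.Linear(256, feat_dim), nn.ReLU(),
        )

    def forward(self, x):
        return self.net(x)

backbone = Backbone()
# backbone.load_state_dict(torch.load("pretrained_qcd.pt"))  # hypothetical checkpoint
for p in backbone.parameters():
    p.requires_grad = False  # freeze the (putatively universal) QCD features

head = nn.Linear(128, 2)  # new task-specific classifier: signal vs background
model = nn.Sequential(backbone, head)

optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# One fine-tuning step on a toy batch standing in for the small target dataset.
x = torch.randn(32, 64)          # 32 jets, 64 input features each
y = torch.randint(0, 2, (32,))   # signal/background labels
loss = loss_fn(model(x), y)
loss.backward()                  # gradients flow only into the new head
optimizer.step()
```

Because the backbone is frozen, only the small head is trained, which is what makes this style of transfer fast and data-efficient whenever the pretrained features really are common across signals.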

Cited by 8 publications (1 citation statement). References: 82 publications.
“…Increasingly, there are also efforts within the natural sciences to train and exploit general-purpose foundation models [32][33][34][35]. Domain adaptation has been investigated previously in high-energy physics in a jet-tagging context [20,36] but to our knowledge not in hierarchical configurations. In parallel to the present effort on supervised backbones, investigations are ongoing on the potential of self-supervised backbones in HEP through masked particle modelling, which extends the masked language modelling approach from NLP to the HEP domain [37].…”
Section: Related Work
Confidence: 99%
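For readers unfamiliar with the masked particle modelling the quote mentions, its analogy with masked language modelling can be sketched as follows. The encoder, feature layout, and masking rate are illustrative assumptions, not the cited work's setup.

```python
import torch
import torch.nn as nn

# Illustrative sketch of masked particle modelling: hide a random subset
# of particles in a jet and train a model to reconstruct their features,
# just as masked language modelling hides and predicts tokens.
n_particles, feat_dim = 30, 4          # e.g. (pt, eta, phi, E) per particle
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=feat_dim, nhead=2, batch_first=True),
    num_layers=2,
)

jet = torch.randn(1, n_particles, feat_dim)   # one toy jet
mask = torch.rand(1, n_particles) < 0.15      # mask ~15% of the particles
masked_jet = jet.clone()
masked_jet[mask] = 0.0                        # zero out the masked particles

pred = encoder(masked_jet)                    # reconstruct per-particle features
loss = nn.functional.mse_loss(pred[mask], jet[mask])  # score only masked slots
loss.backward()
```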