2022
DOI: 10.21468/scipostphys.12.6.188

Symmetries, safety, and self-supervision

Abstract: Collider searches face the challenge of defining a representation of high-dimensional data such that physical symmetries are manifest, the discriminating features are retained, and the choice of representation is new-physics agnostic. We introduce JetCLR to solve the mapping from low-level data to optimized observables through self-supervised contrastive learning. As an example, we construct a data representation for top and QCD jets using a permutation-invariant transformer-encoder network and visualize …
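For orientation, the following is a minimal, hypothetical sketch of a permutation-invariant transformer encoder for jet constituents in PyTorch, of the general kind described in the abstract. The layer sizes, the (pT, y, phi) input features, and the mean pooling are illustrative assumptions, not the configuration used in the paper.

```python
# Minimal sketch (not the authors' code) of a permutation-invariant
# transformer encoder for jet constituents. Omitting positional encodings
# keeps the encoder permutation-equivariant; the final pooling over
# constituents makes the jet representation permutation-invariant.
import torch
import torch.nn as nn

class JetEncoder(nn.Module):
    def __init__(self, n_features=3, d_model=64, n_heads=4, n_layers=2, d_repr=128):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)     # per-constituent embedding
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, dim_feedforward=4 * d_model,
            batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, d_repr)          # projection to representation space

    def forward(self, constituents, padding_mask=None):
        # constituents: (batch, n_constituents, n_features); padding_mask: True = padded
        x = self.embed(constituents)
        x = self.encoder(x, src_key_padding_mask=padding_mask)
        if padding_mask is not None:
            x = x.masked_fill(padding_mask.unsqueeze(-1), 0.0)
            x = x.sum(dim=1) / (~padding_mask).sum(dim=1, keepdim=True)
        else:
            x = x.mean(dim=1)                           # permutation-invariant pooling
        return self.head(x)

# Usage on a random batch of 32 jets with up to 50 constituents each:
jets = torch.randn(32, 50, 3)
reps = JetEncoder()(jets)
print(reps.shape)  # torch.Size([32, 128])
```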

Cited by 24 publications (17 citation statements)
References 67 publications (82 reference statements)
“…Here, we apply power counting to study energy flow polynomials (EFPs) [28], which are an overcomplete linear basis for infrared-and-collinear safe jet substructure. EFPs have been used in a variety of machine learning tasks [29][30][31][32][33][34][35][36], so it is a natural context to study how best to represent jet information. By exploiting power counting, we show how to simplify the EFP basis for analysis tasks involving quark and gluon jets.…”
Section: JHEP09(2022)021 (mentioning, confidence: 99%)
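The statement above refers to energy flow polynomials. As a rough illustration, the snippet below computes the simplest non-trivial EFP, the single-edge graph sum over pairs of energy fractions weighted by their angular distance, directly in NumPy. The toy jet and the choice beta=1 are illustrative assumptions; the full overcomplete basis is generated from all multigraphs (e.g. via the energyflow package) and is not reproduced here.

```python
# Minimal sketch of the single-edge energy flow polynomial
#   EFP = sum_{i,j} z_i z_j theta_ij^beta,
# with z_i = pT_i / sum_k pT_k and theta_ij the rapidity-azimuth distance.
import numpy as np

def efp_single_edge(pt, y, phi, beta=1.0):
    """pt, y, phi: 1D arrays of constituent pT, rapidity, azimuth."""
    z = pt / pt.sum()                                            # energy fractions
    dphi = np.angle(np.exp(1j * (phi[:, None] - phi[None, :])))  # wrap to (-pi, pi]
    theta = np.sqrt((y[:, None] - y[None, :])**2 + dphi**2)
    return np.einsum('i,j,ij->', z, z, theta**beta)

# Example: a toy three-constituent jet
pt  = np.array([100.0, 50.0, 10.0])
y   = np.array([0.00, 0.10, -0.05])
phi = np.array([0.00, 0.05,  0.20])
print(efp_single_edge(pt, y, phi))
```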
“…Now the network learns how to process high-dimensional correlations in the data, and thus the representations learned by these networks can be very useful for downstream tasks. We introduced the self-supervised JetCLR method in [53] and demonstrated its ability to construct highly expressive representations for classification tasks. In [54] this same technique was used to construct representations for CWoLa-based anomaly detection.…”
Section: Introduction (mentioning, confidence: 99%)
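As background to the contrastive-learning statement above, here is a minimal sketch of a SimCLR-style NT-Xent objective, the kind of loss such self-supervised representation methods build on. The temperature and the random stand-in representations are assumptions; this is not a verbatim reproduction of the JetCLR loss.

```python
# Minimal sketch of an NT-Xent contrastive loss: two augmented views of each
# jet are embedded, positives (the two views of the same jet) are pulled
# together, and all other jets in the batch act as negatives.
import torch
import torch.nn.functional as F

def ntxent_loss(z1, z2, temperature=0.1):
    """z1, z2: (batch, dim) representations of two augmented views."""
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)    # 2N unit vectors
    sim = z @ z.t() / temperature                         # cosine similarities
    n = z1.shape[0]
    mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(mask, float('-inf'))            # drop self-similarity
    # positives: view i is paired with view i +/- n
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)

# Usage with random stand-in representations for a batch of 32 jets:
z1, z2 = torch.randn(32, 128), torch.randn(32, 128)
print(ntxent_loss(z1, z2))
```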
“…In addition to these works, other self-supervised / representation learning techniques have been applied in particle physics [55,56] and in other scientific disciplines such as astrophysics [57][58][59][60]. In [53,54] the augmentations corresponded to transformations of the event to which the underlying physics should be invariant, such as rotations or translations, but also soft-collinear parton splittings.…”
Section: Introduction (mentioning, confidence: 99%)
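The rotation and translation augmentations mentioned above can be sketched as follows. The shift range, the rotation center, and the toy jet are illustrative assumptions, and the soft-collinear splitting augmentation used in [53,54] is not reproduced here.

```python
# Minimal sketch of symmetry augmentations in the rapidity-azimuth plane:
# a common random translation of all constituents and a random rotation
# about a reference point.
import numpy as np

rng = np.random.default_rng(0)

def translate(y, phi, max_shift=1.0):
    """Shift all constituents by a common random offset in (y, phi)."""
    dy, dphi = rng.uniform(-max_shift, max_shift, size=2)
    return y + dy, np.angle(np.exp(1j * (phi + dphi)))   # keep phi in (-pi, pi]

def rotate(y, phi, y0=0.0, phi0=0.0):
    """Rotate constituents by a random angle about (y0, phi0)."""
    alpha = rng.uniform(0.0, 2 * np.pi)
    dy, dphi = y - y0, phi - phi0
    return (y0 + np.cos(alpha) * dy - np.sin(alpha) * dphi,
            phi0 + np.sin(alpha) * dy + np.cos(alpha) * dphi)

# Example on a toy jet:
y   = np.array([0.00, 0.10, -0.05])
phi = np.array([0.00, 0.05,  0.20])
print(rotate(*translate(y, phi)))
```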
“…Embedding into a lower-dimensional space can also be seen as an alternate way of building a simpler space to perform physics measurements, searches, and classification, such as in [48]. We bypass the need to do latent variable modeling by directly building the space through embedding with optimal transport distances on the original space, yielding a new handle on how to organize and classify data.…”
Section: Introduction (mentioning, confidence: 99%)
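As a rough illustration of an optimal-transport distance between jets, the snippet below computes a balanced OT cost over normalized pT weights with the POT library. This is a simplified stand-in (the Energy Mover's Distance additionally penalizes the total-pT imbalance between the jets) and not the embedding construction of the cited work; the toy jets are made up for the example.

```python
# Minimal sketch of a balanced optimal-transport distance between two jets
# in the rapidity-azimuth plane.
import numpy as np
import ot  # pip install POT

def jet_ot_distance(pt1, yphi1, pt2, yphi2):
    """pt*: constituent transverse momenta; yphi*: (n, 2) arrays of (y, phi)."""
    a = pt1 / pt1.sum()                              # normalized pT weights
    b = pt2 / pt2.sum()
    M = ot.dist(yphi1, yphi2, metric='euclidean')    # pairwise ground distances
    return ot.emd2(a, b, M)                          # exact OT cost

# Toy example with two three-constituent jets:
pt1, yphi1 = np.array([100., 50., 10.]), np.array([[0.0, 0.0], [0.1, 0.05], [-0.05, 0.2]])
pt2, yphi2 = np.array([80., 60., 20.]), np.array([[0.02, -0.01], [0.12, 0.06], [0.0, 0.15]])
print(jet_ot_distance(pt1, yphi1, pt2, yphi2))
```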