2011
DOI: 10.1146/annurev-nucl-102010-130059
Computing for the Large Hadron Collider

Abstract: Following the first full year of Large Hadron Collider (LHC) data taking, the Worldwide LHC Computing Grid (WLCG) computing environment built to support LHC data processing and analysis has been validated. In this review, I discuss the rationale for the design of a distributed system and describe how this environment was constructed and deployed through the use of grid computing technologies. I discuss the experience with large-scale testing and operation with real accelerator data, which shows that expectatio…

Cited by 135 publications (82 citation statements)
References: 1 publication
“…The scale increases as more data get recorded, and more simulated events are needed to describe the data with proper statistics. Issues of scale are typically addressed via high-throughput computing concepts involving a large number of computing jobs and many computing sites, e.g., via the Worldwide LHC Computing Grid (WLCG) [2]. The user is presented with the task of executing the analysis under these conditions.…”
Section: Motivation
Citation type: mentioning (confidence: 99%)
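The quoted passage points at the basic high-throughput pattern used on the WLCG: split the work into many independent jobs and merge their outputs at the end. The sketch below only illustrates that fan-out/merge idea; it is not the WLCG or any experiment's submission interface. The job function, event counts, and selection cut are hypothetical, and a local process pool stands in for distributed grid sites.

```python
# Illustrative sketch of the fan-out/merge pattern behind high-throughput
# computing: many independent jobs whose results are combined at the end.
# A local process pool stands in for grid sites; the "analysis" is a dummy
# selection on pseudo-events (all numbers here are hypothetical).
from concurrent.futures import ProcessPoolExecutor
import random

def analysis_job(seed: int, n_events: int = 10_000) -> int:
    """One independent job: generate pseudo-events and count those passing a cut."""
    rng = random.Random(seed)
    return sum(1 for _ in range(n_events) if rng.random() > 0.9)

def run_campaign(n_jobs: int = 100) -> int:
    """Fan the work out over many jobs, then merge the per-job counts."""
    with ProcessPoolExecutor() as pool:
        per_job_counts = pool.map(analysis_job, range(n_jobs))
    return sum(per_job_counts)

if __name__ == "__main__":
    print("Selected events across all jobs:", run_campaign())
```

In a real WLCG workflow the shape is the same, but each job runs as a separate process at one of many sites and the merge step happens over the jobs' output files.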
“…All these efforts, driven without exception by the needs of scientific experiments with particle physics at the forefront of the developments, have led to the creation of the LHC Computing Grid (LCG), which later morphed into the World-wide LHC Computing Grid for the LHC experiments (Bird, 2011). It is a global computing infrastructure whose mission is to provide computing resources to store, distribute and analyse the data generated by the Large Hadron Collider (LHC), making the data equally available to all partners, regardless of their physical location.…”
Section: Computing
Citation type: mentioning (confidence: 99%)
“…Towards the end of the data-taking year, the whole workflow was repeated in a so-called reprocessing campaign over all of the recorded data, using the improved calibration and alignment measurements together with possible improvements to the software applications. Regarding the location of data processing, LHCb initially followed the MONARC model [6], and the data-processing workflows were executed at CERN (T0) and at T1 sites. From this starting point several improvements were made:…”
Section: PoS(ISGC2015)005
Citation type: mentioning (confidence: 99%)
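The quote sketches the MONARC-style placement LHCb started from: prompt processing at the Tier-0 (CERN) and reprocessing shared among Tier-1 sites. The toy routine below only illustrates that routing idea; the Tier-1 site list and the modulo assignment of runs to sites are assumptions made for the example, not LHCb's actual workload management.

```python
# Toy illustration of MONARC-style task placement as described in the quote:
# prompt reconstruction at the Tier-0 (CERN), reprocessing spread over Tier-1s.
# The site list and the assignment-by-run rule are hypothetical examples.
TIER0 = "CERN"
TIER1_SITES = ["CNAF", "GRIDKA", "IN2P3", "NIKHEF", "PIC", "RAL"]

def assign_site(task: str, run_number: int) -> str:
    """Route a processing task to a site under the simple tiered rule above."""
    if task == "prompt":
        return TIER0  # first-pass processing stays at the Tier-0
    if task == "reprocessing":
        # share reprocessing of recorded runs over the Tier-1 sites
        return TIER1_SITES[run_number % len(TIER1_SITES)]
    raise ValueError(f"unknown task type: {task}")

if __name__ == "__main__":
    print(assign_site("prompt", 12345))
    for run in range(12346, 12350):
        print(run, assign_site("reprocessing", run))
```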