Results are presented from searches for the standard model Higgs boson in proton-proton collisions at √s = 7 and 8 TeV in the Compact Muon Solenoid experiment at the LHC, using data samples corresponding to integrated luminosities of up to 5.1 fb⁻¹ at 7 TeV and 5.3 fb⁻¹ at 8 TeV. The search is performed in five decay modes: γγ, ZZ, W⁺W⁻, τ⁺τ⁻, and bb̄. An excess of events is observed above the expected background, with a local significance of 5.0 standard deviations, at a mass near 125 GeV, signalling the production of a new particle. The expected significance for a standard model Higgs boson of that mass is 5.8 standard deviations. The excess is most significant in the two decay modes with the best mass resolution, γγ and ZZ; a fit to these signals gives a mass of 125.3 ± 0.4 (stat.) ± 0.5 (syst.) GeV. The decay to two photons indicates that the new particle is a boson with spin different from one.
Using the ATLAS detector, observations have been made of a centrality-dependent dijet asymmetry in collisions of lead ions at the Large Hadron Collider. In a sample of lead-lead events with a per-nucleon center-of-mass energy of 2.76 TeV, selected with a minimum-bias trigger, jets are reconstructed in fine-grained, longitudinally segmented electromagnetic and hadronic calorimeters. The transverse energies of dijets in opposite hemispheres are observed to become systematically more unbalanced with increasing event centrality, leading to a large number of events which contain highly asymmetric dijets. This is the first observation of an enhancement of events with such large dijet asymmetries, not observed in proton-proton collisions, which may point to an interpretation in terms of strong jet energy loss in a hot, dense medium.
Previously published and as yet unpublished QCD results obtained with the ALEPH detector at LEP1 are presented. The unprecedented statistics allow detailed studies of both perturbative and non-perturbative aspects of strong interactions to be carried out using hadronic Z and tau decays. The studies presented include precise determinations of the strong coupling constant, tests of its flavour independence, tests of the SU(3) gauge structure of QCD, studies of coherence effects, and measurements of single-particle inclusive distributions and two-particle correlations for many identified baryons and mesons.
ROOT is an object-oriented C++ framework conceived in the high-energy physics (HEP) community, designed for storing and analyzing petabytes of data in an efficient way. Any instance of a C++ class can be stored into a ROOT file in a machine-independent compressed binary format. In ROOT the TTree object container is optimized for statistical data analysis over very large data sets by using vertical data storage techniques. These containers can span a large number of files on local disks, the web, or a number of different shared file systems. In order to analyze this data, the user can choose from a wide set of mathematical and statistical functions, including linear algebra classes, numerical algorithms such as integration and minimization, and various methods for performing regression analysis (fitting). In particular, the RooFit package allows the user to perform complex data modeling and fitting, while the RooStats library provides abstractions and implementations for advanced statistical tools. Multivariate classification methods based on machine learning techniques are available via the TMVA package. A central piece in these analysis tools is the set of histogram classes, which provide binning of one- and multi-dimensional data. Results can be saved in high-quality graphical formats like PostScript and PDF or in bitmap formats like JPG or GIF. The result can also be stored into ROOT macros that allow a full recreation and rework of the graphics. Users typically create their analysis macros step by step, making use of the interactive C++ interpreter CINT, while running over small data samples. Once the development is finished, they can run these macros at full compiled speed over large data sets, using on-the-fly compilation, or by creating a stand-alone batch program. Finally, if processing farms are available, the user can reduce the execution time of intrinsically parallel tasks (e.g. data mining in HEP) by using PROOF, which will take care of optimally distributing the work over the available resources in a transparent way. Antcheva et al., Computer Physics Communications 180 (2009) 2499-2512.