Managing the Extreme Dataflow of LHC Experiments
Published Online: 15 OCT 2009
Copyright © 2003 by Wiley-VCH Verlag GmbH & Co. KGaA. All rights reserved.
Encyclopedia of Applied Physics
How to Cite
Neufeld, N. and Liko, D. 2009. Managing the Extreme Dataflow of LHC Experiments. Encyclopedia of Applied Physics. 797–822.
The Large Hadron Collider (LHC) will extend our knowledge of particles and their interactions to unprecedented energies. Physicists hope to obtain answers to questions ranging from the origin of mass to the nature of dark matter. The high interaction rates and the large number of detector channels result in an enormous amount of data being produced every second. New developments in the area of data acquisition have been necessary to transport and filter these data with complex trigger systems. The selected data will then be recorded, resulting in datasets of several petabytes. The resource requirements for analyzing these data also pose new challenges for offline analysis. A global computing infrastructure, the grid, has been developed to interconnect computing centers all over the world, providing the resources needed to analyze the data. This article reviews these new computing challenges for the LHC, covering both the online and the offline aspects.
- high-energy physics;
- Large Hadron Collider;
- data acquisition;
- high-level trigger;
- grid computing;
- data analysis