Standard Article

Managing the Extreme Dataflow of LHC Experiments

  1. Niko Neufeld (1)
  2. Dietrich Liko (2)

Published Online: 15 OCT 2009

DOI: 10.1002/3527600434.eap698

Encyclopedia of Applied Physics

Encyclopedia of Applied Physics

How to Cite

Neufeld, N. and Liko, D. 2009. Managing the Extreme Dataflow of LHC Experiments. Encyclopedia of Applied Physics. 797–822.

Author Information

  1. European Organization for Nuclear Research, CERN, Switzerland

  2. Austrian Academy of Sciences, Institute for High Energy Physics, Vienna, Austria


The Large Hadron Collider (LHC) will extend our knowledge of particles and their interactions to unprecedented energies. Physicists hope to obtain answers to questions ranging from the origin of mass to the nature of dark matter. The high interaction rates and the large number of detector channels result in an enormous amount of data being produced every second. New developments in the area of data acquisition have been necessary to transport and filter these data with complex trigger systems. The selected data will then be recorded, resulting in datasets of several petabytes. The resources required to analyze these data also pose new challenges for offline analysis. A global computing infrastructure, the Grid, has been developed to interconnect computing centers all over the world, providing these resources and enabling analysis of the data. This article reviews these new computing challenges for the LHC, covering both the online and the offline aspects.


Keywords

  • high-energy physics;
  • Large Hadron Collider;
  • data acquisition;
  • high-level trigger;
  • grid computing;
  • data analysis