Guide to the Large Hadron Collider


The Grid


The LHC will generate huge amounts of data, with nearly 150 million sensors picking up information from millions of particle collisions every second at the centre of each of the four main detectors.

This will produce around half a gigabyte of data every second, or around 15 petabytes (15 million GB) every year, equivalent to filling a standard 100GB hard drive roughly every three and a half minutes.
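The figures above can be sanity-checked with a few lines of arithmetic. This is a back-of-the-envelope sketch using only the numbers quoted in the article, not official CERN specifications:

```python
# Rough check of the LHC data-rate figures quoted in the article.
rate_gb_per_s = 0.5                  # ~half a gigabyte per second
seconds_per_year = 365 * 24 * 3600   # ~31.5 million seconds

yearly_gb = rate_gb_per_s * seconds_per_year
yearly_pb = yearly_gb / 1_000_000    # using 1 PB = 1 million GB, as the article does

drive_gb = 100                       # the "standard" drive of the era
seconds_to_fill = drive_gb / rate_gb_per_s

print(f"~{yearly_pb:.1f} PB per year")
print(f"drive filled every {seconds_to_fill / 60:.1f} minutes")
```

At half a gigabyte per second this works out to just under 16 PB per year, and a 100GB drive fills in 200 seconds, i.e. a little over three minutes.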

To cope, a specialist LHC Computing Grid (LCG) has been built.

The number crunching starts at the detector. Each of the four main experiments has an attached "counting room" which is used to sort through the raw data and store anything of interest.

Batches of data are then sent to the Cern computing centre, where they are backed up on tape. A processor farm then begins the huge task of crunching the data to create an event summary, which is also backed up onto tape.

From Switzerland, the data is pumped out across the net to 11 so-called "tier one" centres such as the Rutherford Appleton Laboratory in the UK, each directly connected to Cern through dedicated cables.

Each centre reprocesses the raw data, creates another backup and then pumps it out to 150 "tier two" centres, mainly universities, located around the globe.

From here, the information will be available to around 7,000 physicists who will perform the final simulations and detailed analysis.
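The tiered fan-out described above can be summarised as a small data structure. This is purely an illustrative sketch built from the article's numbers; the site names, roles, and the even split of data across tier-one centres are assumptions, not details of the real grid software:

```python
# Illustrative model of the LHC Computing Grid's tiered fan-out,
# using the counts quoted in the article (hypothetical structure).
tiers = {
    "tier0": {"sites": 1,   "role": "Cern computing centre: tape backup, first-pass processing"},
    "tier1": {"sites": 11,  "role": "national centres (e.g. Rutherford Appleton): reprocess, back up"},
    "tier2": {"sites": 150, "role": "mainly universities: simulation and final analysis"},
}

yearly_pb = 15.0  # the article's figure for annual data volume

# Assume, for illustration only, an even share of the yearly volume per tier-one centre.
per_tier1_pb = yearly_pb / tiers["tier1"]["sites"]
print(f"~{per_tier1_pb:.2f} PB per tier-one centre per year")
```

Even under this crude even-split assumption, each tier-one centre handles on the order of a petabyte or more per year, which is why each needed a dedicated cable back to Cern rather than ordinary internet links.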
