LHC

The world's most expensive scientific instrument will be ready for full experiments in 2009; Andy Parker describes Cambridge's role in constructing and using the machine that physicists hope will help them understand the universe.

The Large Hadron Collider (LHC) is becoming a rare example of a scientific project that needs no introduction – the recent publicity when the machine was turned on has ensured a celebrity rating as high as that of the Hubble Space Telescope, perhaps even rivalling the latest reality TV show winners! But perhaps few understand the scale of the challenges posed by the project. Accelerating beams of protons to within a whisker of the speed of light, around a tunnel the size of London Underground’s Circle Line, in a vacuum as empty as intergalactic space, and colliding them together every 25 nanoseconds to re-create the conditions just after the universe was formed is just the start…
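
Those figures already imply some striking rates. A quick back-of-envelope check (a sketch in Python; the ~26.7 km ring circumference is the published LHC figure rather than one quoted above):

    # Back-of-envelope rates implied by the figures above.
    # Assumption: the LHC ring is ~26.7 km around (published figure).
    SPEED_OF_LIGHT = 299_792_458.0   # m/s
    RING_LENGTH = 26_659.0           # m, LHC circumference
    BUNCH_SPACING = 25e-9            # s, time between bunch crossings

    print(f"bunch-crossing rate: {1 / BUNCH_SPACING / 1e6:.0f} MHz")            # ~40 MHz
    print(f"time per revolution: {RING_LENGTH / SPEED_OF_LIGHT * 1e6:.1f} us")  # ~88.9 us
    print(f"revolutions per second: {SPEED_OF_LIGHT / RING_LENGTH:,.0f}")       # ~11,245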

Hunting for Higgs

The scientific goals of the LHC are extremely ambitious – no less than the untangling of some of the longest-standing mysteries of physics relating to the innermost structure of matter. One of the key objectives is to determine, finally, whether the mass of fundamental particles is due to the long-sought-after Higgs mechanism. According to the Standard Model of particle physics, the Higgs mechanism explains how massless particles acquire mass and, for the theory to be correct, the hypothetical Higgs boson particle must be shown to exist.
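
In outline, the textbook picture (a standard sketch, not spelled out in this article) gives the Higgs field \phi a ‘Mexican hat’ potential:

    V(\phi) = \mu^2\,\phi^\dagger\phi + \lambda\,(\phi^\dagger\phi)^2, \qquad \mu^2 < 0.

Because \mu^2 is negative, the energy is minimised not at \phi = 0 but at a nonzero vacuum value, conventionally written v = \sqrt{-\mu^2/\lambda} (about 246 GeV). Particles acquire mass by interacting with this ever-present background field: a fermion coupling to it with strength y_f, for example, picks up a mass m_f = y_f\,v/\sqrt{2}.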

A further aim is to search for phenomena beyond the Standard Model: seeking to explain why the universe contains so little antimatter; searching for supersymmetric particles, which may be a source of dark matter; and even exploring whether we live in a universe with hidden dimensions. With these advances, it may even be possible to create and study tiny black holes within the laboratory.

An extraordinary collaboration

The figures associated with the design, build and implementation of the LHC are truly spectacular: involving an estimated 3000 research scientists at 300 universities spread across 111 nations, and with an approximate budget of £2.6 billion, it is officially the world’s largest experiment.

Cambridge has been involved from the beginning, when Professor Andy Parker, now head of the High Energy Physics Group at the Cavendish Laboratory, attended the first meeting in 1989. Since then, with funding from the Science and Technology Facilities Council (STFC), a technical team has had responsibility for designing the silicon sensors used to measure particles emerging from the collisions. The team, led by Professor Janet Carter until September 2008 and now led by Professor Parker, also designed and built some of the complex electronics needed to read out the detectors, and is responsible for the software used to get the data safely recorded.

Data deluge

On 10 September 2008, the Cambridge physicists, along with a huge global audience, awaited the switching on of the particle beams at CERN (the European research laboratory in Switzerland) and the first successful attempt to bring protons around the full circumference. Within a few days, the first collisions had been successfully recorded; however, a faulty electrical connection between two of the accelerator’s magnets then temporarily halted further beams. The accelerator will be restarted in 2009 and Cambridge has been preparing for the deluge of data expected to flow out from the LHC.

Current estimates rate the data flow as fast enough to fill a DVD every 2.5 seconds. To absorb this tsunami of bytes, computers around the world have been linked into a ‘Grid’ of over 100,000 processors to crunch the data into meaningful information for the scientists to inspect. Data will cascade from the ‘Tier 0’ centre at CERN to 11 ‘Tier 1’ centres based at national supercomputer institutes, and from there to 140 ‘Tier 2’ centres, grouped into 38 federations covering 32 countries and hosted by universities like Cambridge.

Users submit their analysis jobs to their local computers, and the Grid software automatically sends the work to a computer anywhere in the world that can access the data needed, returning the results when it is done. The user has seamless access to all the processors and to a distributed datastore capable of holding 15 million gigabytes each year for the 15-year lifetime of the project. The Grid facility in Cambridge, known as CamGrid, is providing a powerful computational tool for university members not just in high-energy physics but also in many other departments.
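
Those figures can be sanity-checked with simple arithmetic (a sketch; the 4.7 GB single-layer DVD capacity is an assumption on my part, the other numbers are quoted above):

    # Rough arithmetic behind the data figures quoted above.
    # Assumption: a single-layer DVD holds ~4.7 GB.
    dvd_gb = 4.7
    rate_gb_per_s = dvd_gb / 2.5           # "a DVD every 2.5 seconds"
    print(f"raw data rate: {rate_gb_per_s:.1f} GB per second")     # ~1.9 GB/s

    yearly_store_gb = 15e6                 # 15 million gigabytes per year
    lifetime_years = 15
    total_pb = yearly_store_gb * lifetime_years / 1e6              # 1 PB = 1e6 GB
    print(f"lifetime datastore: about {total_pb:.0f} petabytes")   # ~225 PB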

To infinity and beyond

It is difficult to predict what will be discovered when the protons start colliding again at the LHC. But what is certain is that the results will transform our fundamental understanding of the universe. Hypotheses will be confirmed or dashed; exotic new particles and dimensions may be found; and, if we’re lucky, we may discover something that not even the theorists have predicted.

For more information, please contact the author Professor Andy Parker (parker@hep.phy.cam.ac.uk) at the Cavendish Laboratory, Department of Physics.

Centre for Scientific Computing and CamGrid

Centre for Scientific Computing

High-level computing of the sort required to analyse data created by the LHC is embedded in many research activities in the physical and biomedical sciences. It might be driven by the ability to collect enormous datasets, such as for the LHC, or by a requirement to analyse huge gene banks or medical records as quickly as possible, or to process data delivered from satellites or environmental sensors.

In Cambridge, high-level computing resources have been brought together under the Centre for Scientific Computing (CSC) with the goals of linking research projects in diverse disciplines, encouraging the sharing of resources, consolidating intellectual activities and transferring skills.

CSC is a federated initiative that encompasses the High-Performance Computing Service (HPCS), which operates one of the largest academic supercomputers in the UK; the eScience Centre, which supports eScience projects involving scientists and industry in the Cambridge region using Grid-enabled applications (e.g. CamGrid); and teaching at the MPhil level.

For more information, please contact CSC Director Professor Mike Payne (mcp1@cam.ac.uk).

CamGrid

CamGrid is a distributed computing resource in which the processing capacity of desktops or dedicated machines across the University is put to use at times when the machines lie idle. In operation since 2005, CamGrid now links spare capacity equivalent to approximately 1000 machines in 12 different departments.
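
The underlying idea is simple: queued work runs only while a host would otherwise sit idle. A toy sketch of that scheduling loop in Python (illustration only; CamGrid itself runs dedicated Grid middleware, and the job names here are hypothetical):

    # Toy illustration of cycle scavenging: run queued jobs only while
    # the host is otherwise idle. Not CamGrid's actual code.
    import os
    import time
    from collections import deque

    jobs = deque(["job-a", "job-b", "job-c"])    # hypothetical queued tasks

    def host_is_idle(threshold=0.25):
        # Treat the host as idle when its 1-minute load average is low
        # (os.getloadavg is available on Unix-like systems).
        return os.getloadavg()[0] < threshold

    while jobs:
        if host_is_idle():
            print(f"running {jobs.popleft()} on spare cycles")
            time.sleep(1)                         # stand-in for real work
        else:
            time.sleep(10)                        # back off while the owner works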

As well as ramping up for the LHC, the data-sharing and massive processing capability of CamGrid has been used in multiple ways in Cambridge: by the Unilever Centre to explore the structure of molecules; by the Department of Earth Sciences to determine the properties of materials under the enormous pressures and temperatures at the Earth’s core; and by the Department of Pathology for phylogenetic analyses to address how eukaryotic cells evolved from their prokaryotic ancestors.

For more information, please contact eScience Centre Director Mark Hayes (mah1002@cam.ac.uk).

