Planck satellite

The Planck satellite has just reached its orbit, 1.5 million km from Earth, on a mission to understand the origin and evolution of our Universe.

Very little is known about the Universe at times close to the Big Bang. Planck may show that our Universe has more than three spatial dimensions, or that it is one of many others, and may even uncover what happened before the Big Bang.

An Ariane 5 rocket was launched successfully by the European Space Agency (ESA) on 14 May 2009 from French Guiana. Aboard were the Planck and Herschel observatories. Worth a combined £1.7 billion, these are two of the most expensive scientific satellites ever built by ESA. Within a few hours, the two satellites had separated, and the Planck satellite (named after the German Nobel Laureate Max Planck) began the journey to its current orbit 1.5 million km from Earth, around the second Lagrange point (L2) on the side of the Earth facing away from the Sun. Over the next two years, the satellite will scan the whole sky once every six months.

Researchers at the Cambridge Planck Analysis Centre (CPAC), which is spread between the Institute of Astronomy, the Kavli Institute for Cosmology, the Department of Physics, and the Department of Applied Mathematics and Theoretical Physics, are working in partnership with over 40 institutes as part of the pan-European Planck collaboration. Professor George Efstathiou, Director of the Kavli Institute, has been part of the project since its inception in 1993, and is one of the 10-member international Planck Science Team whose task is to monitor and direct the Planck satellite’s scientific programme.

The ultimate questions

The Planck satellite has been designed to answer some of the most important questions of modern science – how did the Universe begin, how did it evolve to the state we observe today, and how will it evolve in the future? The satellite is equipped with powerful microwave detectors chilled to close to absolute zero, and its objective is to provide a major source of information to test theories of the early Universe and the origin of cosmic structure.

Cosmologists have a much clearer picture of the Universe than they did 15 years ago. According to current understanding, the Universe is spatially flat, about 13.7 billion years old and is mostly composed of mysterious ‘dark energy’ (an energy form that is thought to account for the Universe’s accelerating expansion). Much of this information has come from studying the remnants of the radiation that filled the Universe immediately after the Big Bang, which we observe today as the cosmic microwave background (CMB). But cosmologists believe that only a fraction of the information has been extracted so far, limited by the sensitivity of detectors used to study it.

This is where Planck comes in. By carrying highly sensitive detectors, Planck is designed to measure tiny temperature fluctuations in the CMB (often called ‘CMB anisotropies’) with the highest accuracy ever achieved. These fluctuations are thought to have been generated within 10⁻³⁵ seconds of the Big Bang. At these early times, the billions of galaxies that we see today would fit into a volume about the size of a grapefruit.
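To get a feel for how small these fluctuations are, here is a minimal back-of-envelope sketch. The fractional anisotropy amplitude (about one part in 100,000) and the mean CMB temperature of 2.725 K are standard textbook figures, not values taken from this article:

```python
# Back-of-envelope: how large are the CMB temperature fluctuations
# that Planck must resolve?
T_CMB = 2.725           # mean CMB temperature in kelvin (standard value)
frac_anisotropy = 1e-5  # typical fractional amplitude of CMB anisotropies

delta_T_microkelvin = T_CMB * frac_anisotropy * 1e6  # convert K -> microkelvin
print(f"Typical fluctuation: ~{delta_T_microkelvin:.0f} microkelvin")
```

A few tens of millionths of a degree: this is why the detectors must be chilled to close to absolute zero, so that their own thermal noise does not swamp the signal.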


Probing the data

By the end of the two-year mission, Planck will have extracted a wealth of cosmological information from the CMB. One of many vital mission components is the data analysis and simulation software, which is being developed by scientists at a number of European institutions including CPAC. Cambridge researchers will be actively involved in the scientific interpretation of the data sent back from Planck and will also be responsible for providing a catalogue of the many hundreds of galaxy clusters that Planck is expected to detect.

This requires substantial computing power: the University’s Darwin supercomputer, part of the High-Performance Computing Service, will be essential for analysing the terabytes of data streamed back to Earth from Planck (see below).
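As a rough illustration of why a supercomputer is needed, the sketch below estimates the raw data volume produced by a two-year sky-scanning mission. The detector count, sampling rate, and sample size here are purely illustrative assumptions, not Planck’s actual telemetry figures:

```python
# Illustrative estimate of raw time-ordered data volume from a
# two-year sky-scanning mission (all figures are assumptions).
n_detectors  = 70      # assumed number of detectors
sample_rate  = 100.0   # assumed samples per second per detector
bytes_sample = 4       # assumed bytes per sample (32-bit value)
mission_secs = 2 * 365.25 * 24 * 3600  # two years in seconds

total_bytes = n_detectors * sample_rate * bytes_sample * mission_secs
print(f"~{total_bytes / 1e12:.1f} TB of raw time-ordered data")
```

Even with these modest assumptions the raw data run to terabytes, and turning them into sky maps and power spectra requires many passes over the full dataset.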

The goals for the Planck satellite are ambitious but the returns could be spectacular. Planck will establish real facts where once there were unknowns and has the potential to uncover completely unanticipated phenomena that could revolutionise our understanding of physics.

For more information, please contact Professor George Efstathiou, Director of the Kavli Institute for Cosmology. The Cambridge contribution to the Planck consortium has been funded by the Science and Technology Facilities Council (STFC).


Cambridge supercomputing

Darwin, the University of Cambridge’s supercomputer, has the power to process hundreds of terabytes of raw data in a matter of weeks.

Cambridge’s High-Performance Computing Service (HPCS) is home to a 20-tonne supercomputer called Darwin. Comprising 585 Dell servers and 2,340 processor cores, Darwin is one of the fastest computers in the UK, and a forthcoming upgrade will take its processing power from 20 to 30 teraflops.
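A quick sanity check of the per-core performance implied by the figures quoted above (this is just the arithmetic on the quoted numbers, not an official hardware specification):

```python
# Per-core throughput implied by the quoted Darwin figures.
total_flops = 20e12   # 20 teraflops (quoted aggregate performance)
cores       = 2340    # quoted processor-core count

gflops_per_core = total_flops / cores / 1e9
print(f"~{gflops_per_core:.1f} GFLOPS per core")
```

Roughly 8.5 GFLOPS per core, which is plausible for server processors of this era, so the quoted aggregate figure is internally consistent.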

Darwin is a central resource that is open to all research staff within the University. It processes complex simulations for a wide variety of research projects: from simulating crack propagation in materials modelling and analysing air flow over turbine blades, to interrogating raw data sent back by the Planck satellite.
Value for money

The HPCS is a self-sustaining cost centre: users are charged at the point of use, and the service costs are covered by a combination of industrial sponsorship and academic income from research grants. The service has been running this model for just over two years and is now almost entirely self-sustaining. Dr Paul Calleja, Director of the HPCS, explained the significance of this: ‘This is an important achievement because it underpins the viability of the service for future years even with hard economic times ahead within the public sector.’

The cost model and overall service delivery strategy of the HPCS have been constructed with value for money as one of the primary goals. ‘This is a key reason why many groups and departments within the University are looking to outsource their high-performance computing and research computing requirements to the HPCS,’ said Dr Calleja. ‘Not only does the facility maximise the resources available to staff but it also reduces the overall cost base of departments.’

The HPCS also supports free usage of the system, both to allow new users to develop their skills and to give existing users continuity if they fall between grants. Moving forward, the HPCS will always ensure there is adequate free access to the system within the cost model, but will link free access to grant-writing activity so that the entire user community helps to underpin the financial stability of the service.
Research productivity

The HPCS offers a wide range of general research computing services and support options that map well to the varied workflow processes required by different projects; such flexibility and range of services greatly increase the productivity and output of its users.

Facilitating new and world-class computational science for research purposes is the primary mission of the HPCS, as amply demonstrated by the wide range of research publications being produced by its user community. For users such as the Cambridge Planck Analysis Centre, the kinds of cosmological breakthroughs anticipated with the Planck satellite would be impossible to achieve without this level of supercomputing power.

For more information, please contact the HPCS Account Manager, Kamila Lembrych.
