Climate Feedback

A revolution – for climate model evolution


While most Londoners spent last week basking in the glorious sunshine that so rarely comes our way, I spent it largely indoors – at a place that predicts these sorts of unusual occurrences, otherwise known as the European Centre for Medium-Range Weather Forecasts (ECMWF) in Reading.

The World Modelling Summit for Climate Prediction held there last week was itself something of a rare event – a union of the weather and climate communities, who met to discuss whether – and how – they can eventually provide climate predictions that are as useful, and as usable, as weather forecasts.

The four-day summit culminated in a call for a massive investment, around a billion dollars, to fund a new global research facility or facilities with computer and research resources that would ‘revolutionize’ climate modelling capabilities, bringing into view the holy grail of ‘seamless’ weather-to-climate prediction.

We’ve covered the summit in some detail in this week’s Nature [subscription], in my news story and an editorial by Oliver Morton.

In short, the idea is that an injection of cash on this scale could bring about a quantum leap in climate simulations by funding climate computers far beyond those in use today. Currently, the computers used for modelling the climate are in the 10-teraflop range – around ten trillion calculations per second – and run models that divide the globe into cells roughly 100 kilometres across, projecting in broad strokes how global climate is likely to change over the long term.

Though these models have had a key role in warning us of the gradual warming of our planet, they’ve fared pretty poorly when it comes to gauging the likelihood of extreme localised events, such as flooding or more frequent hurricanes. But scientists at the conference said that if they had access to supercomputers – with speeds in the range of hundreds of petaflops (basically 10,000 times more processing power) – they could resolve climate globally on the scale of kilometres, potentially creating models good enough to inform nations of the specific regional challenges they can expect in adapting to climate change.
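For the curious, the arithmetic behind those figures is roughly as follows (a back-of-envelope sketch, assuming computing cost simply tracks the number of horizontal grid cells; in reality, shorter timesteps and extra vertical levels push the requirement higher still):

```python
# Rough sketch of the resolution-to-flops arithmetic, not a formal cost model.
# Refining a global grid from ~100 km to ~1 km cells multiplies the number of
# horizontal cells by 100 x 100 = 10,000 - roughly the jump from ~10 teraflops
# to the petaflop range cited at the summit.

current_cell_km = 100   # typical grid spacing of today's global climate models
target_cell_km = 1      # kilometre-scale resolution discussed at the summit
current_tflops = 10     # ~10-teraflop machines in use today

refinement = current_cell_km / target_cell_km   # 100x finer in each horizontal direction
cell_factor = refinement ** 2                   # 10,000x more horizontal grid cells

required_tflops = current_tflops * cell_factor
print(f"{cell_factor:,.0f}x more grid cells")
print(f"~{required_tflops / 1000:,.0f} petaflops needed, vs {current_tflops} teraflops today")
```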


That’s exactly the kind of information needed to inform policy decisions, so one might reasonably expect governments to be throwing cash in this direction. But national climate modelling centres – such as the UK Met Office – simply aren’t getting the funding at the level required to address these challenges. Where to source that money, and what it will ultimately be used for – a global facility or a strengthening of national centres – were topics of intense discussion in Reading.

Whether increasing computing power alone can provide bespoke models is another issue – and one that climate modellers are divided on. But as Oliver Morton writes [also this week in Nature], computer developers are confident they can deliver the advances in hardware to run these high-resolution models. [For the technical details, check out their paper in the latest issue of the International Journal of High Performance Computing Applications.]

Whereas Japan’s Earth Simulator was the world’s fastest computer as recently as 2004, there are already 29 faster ones, and the reality is that petaflop computers may be only months away. If funding agencies or donors do cough up the cash, then the biggest challenge may well be delivering the science to keep up with computing capabilities.

Olive Heffernan

Image: Japan’s Earth Simulator in Yokohama. Courtesy of the Earth Simulator Center and JAMSTEC.
