US reclaims fastest supercomputer title

An almost audible sigh of relief arose from Washington last week as Blue Gene/L, a computer built by IBM, claimed the title of the world's fastest supercomputer. Science and technology policymakers have spent the past two years fretting that the US was losing its lead in high-performance computing, with potentially serious implications for national competitiveness.

"We believe that to out-compete, we must out-compute," said Ms Deborah Wince-Smith, president of the Council on Competitiveness, one of many lobby groups pressing federal agencies to spend more on supercomputer research.

The lobbying campaign was sparked by the success of the Earth Simulator, a supercomputer built by Japanese electronics group NEC to model climate change. When full details of the Earth Simulator's performance emerged in early 2002 it was clear that Japan had not only overtaken the US in terms of raw computing speed but had done so by a metaphorical mile.

With a sustained performance of 35 teraflops - 35 million million operations a second - the machine was then four times faster than its nearest rival.

Equally significant was the way in which this performance had been achieved. The Earth Simulator was the result of a $300 million (€244 million), long-term collaboration between Japan's public and private sectors.

Instead of being built from a "cluster" of off-the-shelf components - the approach favoured by an increasing number of US supercomputers - almost every aspect of the Earth Simulator was custom-built for the project.

"It was a wake-up call," says Mr Dan Reed, director of the Institute for Renaissance Computing at the University of North Carolina, Chapel Hill.

In the 1990s, US government agencies cut back on supercomputing research, which had been driven in earlier decades by the requirements of the nuclear weapons industry.

Researchers turned to clustering, which enabled them to build fast machines that were relatively inexpensive.

The results have been impressive. A cluster of Apple G5 processors built at Virginia Polytechnic Institute emerged last year as one of the world's top 10 supercomputers. The National Center for Supercomputing Applications at the University of Illinois built a high-performance machine from clusters of chips used in Sony's PlayStation console.

But clusters are not suitable for all applications. The success of the Earth Simulator, designed to run models of world climate over thousands of simulated years, was a reminder that research into alternative designs had been neglected.

Added urgency came from the fact that scientific fields as diverse as drug discovery and astrophysics now rely on the ability to run extremely complex simulations.

Mr Reed argues that computer simulation has become the third pillar supporting scientific discovery, supplementing experiment and theory. Businesses, too, are starting to glimpse a future in which supercomputers could be used to reduce costs and improve productivity.

Mr John Marburger, policy director of the White House office of science and technology policy, told a conference of supercomputer users this summer that "we are approaching a tipping point beyond which entirely new applications of computing will bring a new wave of transformations in our industrial ways of life".

Mr Marburger's speech confirmed that lobbying has put high-performance computing back on the US government's agenda.

The irony is that Blue Gene/L - a hybrid of the custom-built and clustered off-the-shelf approaches - was built without federal money. IBM started work on the design in 1999, long before the power of the Earth Simulator became known, and has since spent about $100 million on the project.

The company hopes to recoup its investment by using the Blue Gene architecture as the basis for a new generation of supercomputers to meet the expected upsurge in demand. IBM also believes that next year it will shatter its new 36 teraflop performance benchmark as Blue Gene/L is expanded. The aim is for 360 teraflops in 2005.

Meanwhile, Japanese scientists are hoping to build on the success of the Earth Simulator, and the EU is backing plans for a "grid" linking computers across Europe. The next milestone is a supercomputer capable of petaflop performance - a thousand million million operations per second. Such mind-boggling computing power no longer seems like science fiction, and more than national bragging rights are at stake. - (Financial Times Service)