The US government is planning to spend hundreds of millions of dollars over the next few years to fund the development of huge supercomputers with power beyond anything available today.
Several government agencies have awarded, or are about to award, contracts for systems capable of sustained petascale computing speeds, meaning a quadrillion or more calculations per second. To gauge the scale of these planned systems, consider that only one system on the latest Top500 supercomputer list surpassed 100 teraflops, a tenth of petascale speed.
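To put those figures in perspective, a quick back-of-the-envelope calculation (a sketch using only the speeds quoted in the article) shows the gap between today's fastest machines and the planned systems:

```python
# Rough scale comparison. Assumptions: 1 petaflop = 1e15 floating-point
# operations per second, 1 teraflop = 1e12 (standard SI prefixes).
PETAFLOP = 1e15
TERAFLOP = 1e12

fastest_today = 100 * TERAFLOP      # the lone Top500 entry above 100 teraflops
petascale_target = 1 * PETAFLOP     # sustained speed the new contracts call for

speedup = petascale_target / fastest_today
print(speedup)  # prints 10.0: a petascale machine is 10x the fastest listed system
```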
Earlier this month, Cray signed a US$200 million (NZ$329 million) contract to deliver a petascale-capable system to the US Department of Energy’s Oak Ridge National Laboratory by 2008. That system, based on processors from AMD, will be built in phases of ever-increasing speeds.
The National Science Foundation has begun seeking proposals for a larger high-performance supercomputer that could also cost up to US$200 million. And this month, the Defence Advanced Research Projects Agency (DARPA) plans to award contracts valued at several hundred million dollars for two even larger supercomputers.
The scale of the computing power in the new systems will be so enormous that “we have to change the way we do computational science to really take advantage of these machines”, says Dimitri Kusnezov, head of the Department of Energy’s advanced simulation and computing programme, which operates the most powerful supercomputer in the world today, the IBM Blue Gene/L.
“The question is, what would [scientists] do with an infinite amount of computing speed?” Kusnezov says. “What would they calculate? And I’ll wager that they don’t have an answer for you.”
He says petascale computing levels may force researchers to assemble multidisciplinary teams to task the systems with solving fundamental scientific problems.
Three vendors — IBM, Cray and Sun Microsystems — have been working with DARPA for several years on initial research for the next generation of computer systems. The agency is expected to pick two of the three vendors to work on the next phase of the project soon. DARPA’s goal is to build an “economically viable” petascale supercomputer, according to an agency spokeswoman.
The foundation also wants a petascale system and is seeking proposals for a supercomputer that can answer key questions about the kinds of abrupt transitions that can occur in the Earth’s climate and ecosystem structure.
Stephen Meacham, IT research programme director at the foundation, says the agency wants to use the system to attack “frontier problems”, such as modelling the interaction of viruses with various components in a cell and looking for ways to block those interactions.
Although much of the focus on supercomputers is on the number of processors strung together, the more vexing problems involve memory, storage subsystems and energy consumption, which “take up a good chunk of the overall cost of the system”, says Dave Turek, vice president of deep computing at IBM.