BBC News
Last Updated: Sunday, 17 April 2005, 14:32 GMT 15:32 UK
Supercomputing power made real
By Jo Twist
BBC News science and technology reporter

Volvo uses supercomputing power to simulate crashes
For computer users and gadget lovers, Moore's Law has been a welcome standard because it has meant technology keeps doing more for the same money or less.

Intel co-founder Gordon Moore predicted in a 1965 research paper that integrated circuits - the elements that make chips work - would grow steadily more complex, with the number of components on a chip doubling at regular intervals.

For the last four decades it has held as a "law", the industry's benchmark for chip performance: roughly every couple of years, that performance has doubled.

But for supercomputers which rely on thousands of processors operating at once, a faster individual chip has not necessarily been a priority.

Power savers

The fastest computer in the world now is an IBM Blue Gene system. It recently smashed its own record, clocking up 135.5 teraflops - 135.5 trillion calculations a second.

The record-breaking system is being constructed for the Lawrence Livermore National Laboratory, a US Department of Energy (DOE) lab.

"In the computing business, there has been this drive to always increase the frequency of chips - the faster the frequency, the faster the cycle, the faster the chip," Professor Bill Pulleyblank explained to the BBC News website.

Until recently, the professor was the project leader of Blue Gene and one of the brains behind it.

"The trouble is, if you get increased speed by increasing frequency, it gets to be really inefficient - you get to the point of diminishing return. It takes more and more power to get higher frequency.

"People across the industry are saying we are pretty much reaching the limits of frequency and we need to find another way," he says.

Blue Gene runs at a much lower frequency than other supercomputers; each of its chips is clocked well below the speed of a high-end PC's.

Desktop chips run about three times faster on average, but use much more power. Blue Gene has been designed to get its speed from tens of thousands of slower, power-efficient chips instead.
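The trade-off Professor Pulleyblank describes can be sketched with a toy model. Dynamic chip power scales roughly as P = C·V²·f, and supply voltage must rise with clock frequency, so power grows much faster than linearly with speed. The voltage/frequency curve and all constants below are illustrative assumptions, not IBM's figures:

```python
# Toy model of dynamic CPU power: P = C * V^2 * f.
# Voltage must rise with frequency, so power grows much faster
# than linearly with clock speed. All constants are hypothetical.

def dynamic_power(freq_ghz, capacitance=1.0, v_per_ghz=0.4):
    """Rough dynamic power estimate in arbitrary units."""
    voltage = v_per_ghz * freq_ghz + 0.8  # assumed V/f relationship
    return capacitance * voltage**2 * freq_ghz

# One 2.8 GHz desktop-class chip vs four 0.7 GHz low-power chips
# delivering the same total clock cycles per second:
fast = dynamic_power(2.8)
slow = 4 * dynamic_power(0.7)
print(f"one fast chip  : {fast:.2f} units")
print(f"four slow chips: {slow:.2f} units")
assert slow < fast  # same throughput, far less power in parallel
```

Under any model of this shape, many slow chips deliver the same aggregate cycles for a fraction of the power, which is exactly the bet Blue Gene makes.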

With its 64 full racks, each holding 1,024 dual-processor compute nodes, it is expected to reach a peak theoretical performance of 360 teraflops this year.
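The 360-teraflop figure can be roughly reconstructed from the machine's publicly quoted node specifications. The clock speed and flops-per-cycle numbers below are assumptions drawn from Blue Gene/L's published design (700 MHz PowerPC cores, two per node, four floating-point operations per cycle each), not from this article:

```python
# Back-of-the-envelope peak performance for the full 64-rack system.
# Assumed node figures (not stated in the article): 700 MHz cores,
# 2 cores per node, 4 floating-point operations per cycle per core.
racks = 64
nodes_per_rack = 1024
cores_per_node = 2
clock_hz = 700e6
flops_per_cycle = 4

peak_flops = racks * nodes_per_rack * cores_per_node * clock_hz * flops_per_cycle
print(f"{peak_flops / 1e12:.0f} teraflops")  # ~367, close to the quoted 360
```

The multiplication also shows where the "130,000 processors" mentioned later comes from: 64 × 1,024 nodes × 2 cores is 131,072 processors.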

Within two years, says Professor Pulleyblank, petaflop speeds will be reached - a thousand trillion calculations a second. But Moore's Law has not just successfully driven the chip industry's power advances for the last 40 years.

It has also made it possible for cheaper, easier, and faster supercomputers to be built using "off the shelf" technology. That means their massive brain power can solve more problems.

'Worthy workers'

There are hundreds of supercomputers across the world, working out highly complex problems across science and society.

They are doing worthy jobs such as climate prediction, tsunami prediction, and working out the structures of proteins to improve medicines.

As supercomputing matures, it is starting to branch out to help society work out equally complex, but perhaps more mundane problems.

Professor Pulleyblank has been charged with heading up IBM's new Center for Business Optimization (CBO) to bring supercomputing out of the labs.

"It was a bit of a bummer actually," he jokes. "I was all set to coast on Blue Gene for a while, do the talk-show tour."

But the CBO is "going boldly where no-one has gone before". Part of it is about developing new methods and algorithms to improve computation power.

Blue Gene (Image: IBM)
Blue Gene/L is the fastest computer in the world... for now
The other side is about "making supercomputing real"; it has traditionally been something that is not easy to integrate into the everyday world.

Making it real includes using supercomputing to manage postal services, run Volvo's car-safety crash simulations, render special effects, model passenger flows through airports, and more.

But it is in the world of healthcare, thinks Professor Pulleyblank, where supercomputing power combined with databases of information could really have an impact in the coming years.

For instance, doctors could manipulate very sophisticated, real-time 3D images of a person's heart generated from a CAT scan. This would allow a doctor to simulate surgery techniques.

A partnership with the Mayo Clinic in the US meant supercomputing power could play a big part in digitising and combining more than 100 years of patient history.

A data warehouse was created so that common responses to various treatments could be spotted and predictive models constructed.
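The kind of aggregate query a data warehouse makes possible can be illustrated with a toy example. The records and outcome labels below are entirely hypothetical, not Mayo Clinic data; the point is only that pooling patient histories lets common responses to each treatment be counted:

```python
# Toy illustration with hypothetical data: group patient records
# by treatment and compute the share that improved, the kind of
# aggregate a pooled data warehouse makes easy to extract.
from collections import defaultdict

records = [
    ("drug_a", "improved"), ("drug_a", "improved"), ("drug_a", "no_change"),
    ("drug_b", "improved"), ("drug_b", "no_change"), ("drug_b", "no_change"),
]

by_treatment = defaultdict(list)
for treatment, outcome in records:
    by_treatment[treatment].append(outcome)

for treatment, outcomes in sorted(by_treatment.items()):
    rate = outcomes.count("improved") / len(outcomes)
    print(f"{treatment}: {rate:.0%} improved")  # drug_a: 67%, drug_b: 33%
```

A real predictive model would go further, conditioning on patient attributes such as DNA profiles, but the grouping step above is where spotting "common responses" begins.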

Making this a reality should allow treatments to be far more targeted and precise, with prescriptions being based on DNA modelling and analysis.

"You look at something like that and the individual value is clear," says Professor Pulleyblank.

"But you look at the societal value and all of a sudden you are beginning to take some of the cost out of your medical system because prescriptions and diagnoses are more precise."

More Moore

But there are technological limits to chip technology. Dr Moore himself predicts the law has about 10 to 20 years left before it hits a fundamental limit on how many transistors can fit effectively onto a chip.

Global researchers are delving into quantum computing and nanotechnologies to try to come up with alternatives to silicon-based chip technology.

But although one eye will be kept on these developments, the fastest computer in the world will be sticking with trusted silicon for the foreseeable future, explains Professor Pulleyblank.

Medium Resolution Imaging Spectrometer images show Hurricane Isabel (Image: Esa)
Supercomputers can help scientists predict climate change
"One of the things we were concerned with is that if it is going to work, either everything works, or nothing works at all.

"So we did a lot of things to minimise risk on the project and we made technological decisions that were based on, for the most part, standard stable technology.

"We were already taking enough of a risk by putting 130,000 processors into this system and trying to make sure they can all work together."

The team had previously been bitten by slightly risky technology decisions it made in three areas.

"Using carbon nanotube transistors is a wonderful idea, but we needed things that were engineered at the level of being market-ready so we could build a machine at the scale of Blue Gene out of it."

The computing industry often gets new ideas that look impressive in the labs, but which are not reliable or scalable enough to use in large systems like supercomputers, he says.

"If you look at our industry, part of it is innovation and a large part is about very successful engineering of existing ideas."

In 10 years' time, it is an area that IBM may revisit, says Professor Pulleyblank.



SEE ALSO
US top of supercomputing charts
09 Nov 04 |  Technology
Supercomputer breaks speed record
05 Nov 04 |  Technology
Nasa powers up with supercomputer
06 Aug 04 |  Technology
Earth Simulator delights scientists
01 Oct 03 |  Science/Nature
Spain unveils supercomputer plans
29 Feb 04 |  Technology
