BBC News
Last Updated: Sunday, 25 July, 2004, 07:12 GMT 08:12 UK
Pushing computers to the limit
BBC Click Online's Spencer Kelly looks at the ever increasing speed of processors and wonders how long computers can keep getting faster.

Intel chip plant
Researchers are packing more and more into silicon chips
The more transistors that can fit on a chip, the more processing power a computer has.

The closer together the transistors are, the quicker signals can pass between them, and so the faster the chip runs.

And researchers have been squeezing more and more onto chips at a pretty constant rate.

In 1965 Gordon Moore, co-founder of chip giant Intel, predicted that the number of transistors on a chip would continue to double every 18-24 months.

Over the next 40 years, transistor counts went from 2,300 to 55 million, as Moore's Law continued to hold true.
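Those two transistor counts line up neatly with the prediction. A back-of-the-envelope check (the counts come from the article; the doubling arithmetic is only illustrative):

```python
import math

# Transistor counts quoted in the article
start, end = 2_300, 55_000_000

# How many doublings does it take to get from one to the other?
doublings = math.log2(end / start)
print(f"doublings needed: {doublings:.1f}")  # ~14.5

# At Moore's predicted rate of one doubling every 18-24 months,
# that growth spans roughly two to three decades:
for months in (18, 24):
    years = doublings * months / 12
    print(f"at {months} months per doubling: ~{years:.0f} years")  # ~22 and ~29
```

Either rate lands the 55-million mark within the 40-year window the article describes, which is why the law held up for so long.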

"In many respects we're still in the infancy of microprocessors," said Gordon Moore.

"There are many things we can do to make them more capable and we're pushing those as hard as we can."

Light technology

Today, Japan's Earth Simulator is the world's fastest supercomputer, tearing along at over 35 million million calculations a second.

But how did we get here? How has something so powerful been made to fit into such a relatively small box?

In a process called lithography, circuits are etched onto silicon by shining light through a mask.

As we learn to use shorter wavelengths of light, we can make finer and finer circuits.

"There are difficulties in continuing to refine the current lithography process indefinitely and people are moving towards different wavelengths of light, moving away from visible light into what's called extreme ultraviolet," explained Dr Dave Watson from IBM.

"That offers the opportunity to use a shorter and shorter wavelength, which allows you to etch smaller and smaller components into integrated circuits."

Even if you could keep refining the lithography process, there are other problems associated with bringing things really close together - chips are getting hot.

"Chips use more power to create heat in themselves than they actually do to work, and that's because of the density of components on the chip," said Dr Watson.

"It's got to the stage where you can't use all of the chip at the same time, you have to use parts of it at a time to keep heat down to a minimum.

"If you look at current chip technology, the actual core component in the chip can only be about 25 atomic layers thick of silicon, and the problem is that you get jumps of electricity between different components.

"So you have two copper conductors separated by 25 atomic layers and the electrons can literally jump from one side to the other and you get shorts in chips," he explained.

"If you go much smaller you simply cannot control the electrons within the chip."

Quantum leap

Developers have encountered barriers like this before, but Moore's Law has always found a way around them.

The change from aluminium to copper as the conductor in chips meant they could use 30% less power.

Newer, more expensive chips made from gallium arsenide instead of silicon could prove 40% faster.

Gordon Moore
Moore: Famous for his prediction about processing power
But even if you can overcome the heat, power and materials problems, you cannot just go on shrinking forever.

It is estimated that sometime in the next decade, Moore's Law will reach its final, impenetrable barrier - you cannot make a wire thinner than an atom.

If, by the time we reach the atom barrier, we still want more oomph in our machines, scientists will have to do things differently.

At Oxford University, under an electron microscope, there are the hazy beginnings of a very different type of computer - a quantum computer.

In such a computer, data is processed by exploiting the strange qualities of quantum physics and the building blocks of computation are not transistors but caged atoms or Qubits.

"The difference between quantum computing and classical computing is that whereas in a classical computer you work with bits that at any given moment can be either 1 or 0, in a quantum computer we use what we call Qubits, and I know it sounds weird but each Qubit can be both 1 and 0 at the same time," said Andrew Briggs, Professor of Nanomaterials at the University of Oxford.

"The benefit that brings you is that in a quantum computer you can try different solutions to a problem simultaneously."

An array of Qubits is made by lining up rows of caged atoms. Instead of having to work through possible answers one by one, an array of Qubits can explore them all in far fewer steps than a conventional computer forced to rely on classical physics.
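The "both 1 and 0 at the same time" idea can be sketched with a few lines of code. This is a toy classical simulation of a single qubit's amplitudes, not how a quantum computer is actually built; the Hadamard gate and the squared-amplitude rule are standard quantum mechanics, not details from the article:

```python
import math

# A qubit is described by two amplitudes, one for state 0 and one for
# state 1; the squared amplitudes give the measurement probabilities.
def hadamard(amp0, amp1):
    """The Hadamard gate, which puts a definite qubit into superposition."""
    s = 1 / math.sqrt(2)
    return (s * (amp0 + amp1), s * (amp0 - amp1))

# Start in the definite state 0, like an ordinary classical bit.
a0, a1 = 1.0, 0.0

# After the gate, the qubit really is "both 1 and 0 at the same time":
a0, a1 = hadamard(a0, a1)
print(a0 ** 2, a1 ** 2)  # both probabilities are approximately 0.5

# Each extra qubit doubles the number of bit patterns held at once,
# so n qubits span all 2**n patterns - the parallelism Briggs describes.
```

One qubit holds two possibilities at once; ten hold 1,024; fifty hold more patterns than any classical memory could enumerate one by one.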

A quantum computer may not be needed for word processing and answering e-mails.

But there are some jobs which may require more power than Moore's Law could ever provide, so it is good to know that one day that law could be broken.

