Wednesday, 29 August, 2001, 18:32 GMT 19:32 UK
Doing science by stealth
Soon your computer could be doing science without your permission
By BBC News Online technology correspondent Mark Ward

Scientists have found a way to coerce computers into doing science without the consent of their owners.

By exploiting basic functions of web servers, a group of US scientists have been able to make the machines carry out a small part of a much larger computation.

The researchers believe that the technique could be used to turn the web into a powerful distributed computer.

But they said their technique for covert computation was cumbersome, and needed to be refined before it was more widely used.

Parasites abound

At the moment, enrolling your computer in a distributed computing project - such as Seti@home, the search for extraterrestrial life - involves downloading special software that can run while the computer is otherwise idle.

But all this could change now that US scientists have found a way to turn an error-checking procedure used by all web servers into a means of carrying out computations.


When any information is sent across the internet it is split up into small chunks or packets that travel, often independently of each other, to their common destination.

Each packet is stamped with information about its source and destination, as well as a checksum - a number worked out from the data the packet carries.

When a web server receives a packet of data it repeats the calculation and compares its answer with the checksum stamped on the packet.

If the two numbers differ, the packet has been corrupted during transit. Corrupted packets are discarded.

One of the internet's basic standards, called the Transmission Control Protocol (TCP), governs this error-checking process.
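
The check itself is simple arithmetic. In rough terms, the sender adds the packet's contents up as 16-bit numbers, stamps the complement of that sum into the header, and the receiver repeats the sum, accepting the packet only if everything adds back up to all ones. Below is a minimal Python sketch of that exchange - the function names are illustrative, not part of any real TCP software:

    # Sketch of TCP-style error checking: sender stamps the ones'-complement
    # of a 16-bit sum of the data; the receiver repeats the sum and accepts
    # the packet only if everything adds back up to all ones (0xFFFF).

    def ones_complement_sum(data: bytes) -> int:
        """Add the data up as 16-bit words, folding any carry back in."""
        if len(data) % 2:
            data += b"\x00"                          # pad odd-length data
        total = 0
        for i in range(0, len(data), 2):
            total += (data[i] << 8) | data[i + 1]
            total = (total & 0xFFFF) + (total >> 16)
        return total

    def make_checksum(data: bytes) -> int:
        """What the sender stamps on the packet."""
        return ~ones_complement_sum(data) & 0xFFFF

    def verify(data: bytes, checksum: int) -> bool:
        """What the receiving server does before accepting the packet."""
        total = ones_complement_sum(data) + checksum
        total = (total & 0xFFFF) + (total >> 16)
        return total == 0xFFFF

    payload = b"some packet data"
    stamp = make_checksum(payload)
    print(verify(payload, stamp))            # True  - packet accepted
    print(verify(payload + b"x", stamp))     # False - corrupted, discarded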

Scientists from the physics and computer science departments at the University of Notre Dame in Indiana, US, are using this error-checking procedure to carry out "parasitic computing".

In the journal Nature, physicist Albert-Laszlo Barabasi and colleagues show how to subvert error checking to carry out more complex calculations, and force a web server to inadvertently take part in a distributed science project.

Short salesmen

The scientists tested their ideas by using web servers to find the correct solution to an example of a mathematical conundrum known as the "travelling salesman problem". This involves working out the shortest route a fictional salesman could take to visit every location on a hypothetical map.

The more locations on the map, the more potential routes there are, and the longer it would take any single computer to crank through all the possible combinations.

But by sharing the job of working out which route is shortest, the total time it takes to solve any particular travelling salesman problem can be vastly reduced.
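
To get a feel for the scale of the job, the toy brute-force search below tries every ordering of a handful of invented cities and keeps the shortest round trip. With n cities there are (n-1)! orderings to test, which is why splitting the candidates across many machines pays off. The coordinates are made up for illustration:

    # Brute-force travelling salesman: try every tour, keep the shortest.
    # 4 cities means 3! = 6 candidate tours; 10 cities would already mean
    # 362,880, which is why the work is worth sharing out.
    from itertools import permutations
    from math import dist

    cities = {"A": (0, 0), "B": (3, 4), "C": (6, 0), "D": (3, 1)}

    def route_length(route):
        """Total length of a closed tour visiting the cities in order."""
        points = [cities[name] for name in route]
        return sum(dist(points[i], points[(i + 1) % len(points)])
                   for i in range(len(points)))

    start = "A"
    others = [c for c in cities if c != start]
    best = min(((start,) + perm for perm in permutations(others)),
               key=route_length)
    print(best, round(route_length(best), 2))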

Professor Barabasi and his colleagues used one computer to generate possible solutions to a travelling salesman problem, and then used parasitic computing to make lots of web servers perform the calculations on each candidate solution.

Because of the way TCP works, only solutions to their travelling salesman problem were returned to the researchers. All others, because they produced invalid checksums, were discarded.
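
The sketch below simulates that filtering step rather than reproducing the researchers' actual packet construction, which is more elaborate. Each candidate route's leg lengths are packed into a payload and stamped with the checksum that would only be valid if the legs added up to a chosen target length; a stand-in server does nothing but the standard verification, yet the only candidates that come back are genuine solutions. The place names, distances and the fake_server function are all invented for the example:

    # Checksum verification as an unwitting filter: the home machine never
    # adds up a route itself. It stamps the checksum it WANTS to be true,
    # and only candidates for which that claim holds provoke a reply.
    import struct
    from itertools import permutations

    # invented integer leg lengths between four places, A to D
    leg = {("A", "B"): 5, ("B", "C"): 5, ("C", "D"): 3, ("D", "A"): 3,
           ("A", "C"): 6, ("B", "D"): 4}
    leg.update({(b, a): d for (a, b), d in leg.items()})

    def ones_complement_sum(data: bytes) -> int:
        total = 0
        for i in range(0, len(data), 2):
            total += (data[i] << 8) | data[i + 1]
            total = (total & 0xFFFF) + (total >> 16)
        return total

    def fake_server(payload: bytes, checksum: int) -> bool:
        """Stand-in for a remote server: reply only if the checksum verifies."""
        total = ones_complement_sum(payload) + checksum
        total = (total & 0xFFFF) + (total >> 16)
        return total == 0xFFFF

    target_length = 16               # is there a round trip of exactly 16?
    for perm in permutations("BCD"):
        route = ("A",) + perm + ("A",)
        legs = [leg[(route[i], route[i + 1])] for i in range(len(route) - 1)]
        payload = struct.pack(">%dH" % len(legs), *legs)
        # checksum is only valid if the legs really do sum to target_length
        if fake_server(payload, ~target_length & 0xFFFF):
            print("server replied: solution", "-".join(route))
    # prints A-B-C-D-A and its reverse; the length-18 tours go unanswered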

The scientists said their technique needed refinement because it took far longer for a web server to carry out a calculation on their behalf than it did to check that packets of data were intact.

Widespread use of it could slow down the rate at which a web server receives data. "Parasitic computing represents an ethically challenging alternative for cluster computing, as it uses resources without the consent of the computer's owner," wrote the researchers in their paper.

But they said the approach had the potential to harness far more computers than volunteer for projects such as Seti@home.
