By Mark Ward
BBC News Online technology correspondent
Most home net users would love to have a link that can handle gigabits of data every second.
Radio astronomy generates huge amounts of data
But for some folks such high speeds are simply too slow.
Some science experiments can generate so much data that it still takes hours to send the information across the net before analysis can even begin.
Technologies like the Grid look set to help get the data to where it needs to be, and fast net providers are looking at better ways to get data moving.
One of the recent developments in radio astronomy involves linking up a distributed network of telescopes to form a giant, virtual instrument as big as a continent.
The good thing about this technique, which goes by the formidable name of Very Long Baseline Interferometry (VLBI), is that it produces very high quality radio images of astronomical phenomena such as exploding supernovae, pulsars and black holes.
The downside is the vast amounts of data that it generates.
The European VLBI project links 16 telescopes that each generate one gigabit of data per second over the 25 days of an observation session.
The end result is a pile of data tens of terabytes deep recorded on to huge numbers of magnetic tapes.
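A back-of-envelope sketch shows how quickly those tapes pile up. The only sourced figure here is the article's 1 gigabit per second per telescope; everything else is simple arithmetic.

```python
# Rough data-volume arithmetic from the article's quoted rate of
# 1 gigabit per second per telescope, for a 16-telescope array.

GIGABIT = 1e9                          # bits per second
rate_bytes = GIGABIT / 8               # 125 MB per second per telescope

per_hour_gb = rate_bytes * 3600 / 1e9          # GB recorded per telescope-hour
per_day_tb_array = per_hour_gb * 24 * 16 / 1e3 # TB per day across 16 telescopes

print(f"{per_hour_gb:.0f} GB per telescope-hour")
print(f"{per_day_tb_array:.1f} TB per day for the whole array")
```

At that rate a single telescope fills 450 GB of tape every hour, so "tens of terabytes" accumulate across the array within hours, not days.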
Dai Davies, director of Dante, which provides high-speed networks to Europe's research institutes, said that until now the highest data transfer speeds were achieved by putting the tapes in a van and driving them to where they needed to be analysed.
Delivery vans can carry lots of tapes at the same time, which means that Europe's roads have a relatively high bandwidth.
"You can send a few hundred megabytes per second through DHL," he said.
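The "sneakernet" figure is easy to sanity-check. The tape count, tape capacity and driving time below are illustrative assumptions, not numbers from Dante, but they show how a van reaches the throughput Dr Davies quotes.

```python
# Effective bandwidth of a van full of tapes, with assumed numbers:
# 100 tapes of 300 GB each, delivered after a 24-hour drive.

tapes = 100
tape_capacity_gb = 300
drive_time_s = 24 * 3600

payload_bytes = tapes * tape_capacity_gb * 1e9
throughput_mb_s = payload_bytes / drive_time_s / 1e6

print(f"{throughput_mb_s:.0f} MB/s sustained over the journey")
```

Under these assumptions the van sustains roughly 350 MB per second, squarely in the "few hundred megabytes per second" range, at the cost of a day of latency.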
Sea floor sensors are helping understanding of the planet
The problem with delivering data by road is the time it takes for the tapes to travel from where they are collected to where they are analysed. Sometimes the data arrives weeks after an observation session has finished.
Dr Davies said the basic protocols of the internet got in the way when huge amounts of data had to be sent.
At very high speeds the net's Transmission Control Protocol (TCP), which helps ensure data reaches its intended destination, slows things down because it waits to check that packets have arrived.
"TCP was designed along with similar protocols in the 1970s when they were coping with line speeds in the kilobits," he said.
Dr Davies said Dante was looking at changing the basic protocol it used to transport data and at alternative ways of organising its network to provide capacity on demand.
By moving away from pure net protocols and hardware it should become possible to give research institutions bandwidth as it is needed.
Mike Garrett, director of Europe's Joint Institute for VLBI, said it was working with Dante on ways to get data from its 16 telescopes to its supercomputer for analysis in real time.
He also said it was carrying out a pilot Grid project to see if it was possible to spread the analysis load by sending out the data to a cluster of computers.
And it is not just astronomers who have vast amounts of data to deal with.
Cardiff University is currently installing a gigabit network as part of the Welsh e-science centre being created at the institution.
Tom Wiersma, manager of the network at Cardiff University, said the fast network would help some of its scientists do more with the huge amounts of data they gather.
As an example, Cardiff researcher Adam Schultz is using huge numbers of seabed sensors to monitor geological activity and refine theories about the workings of the Earth's interior.
Mr Wiersma said that the high-speed network would make it much easier for scientists like Dr Schultz to move information around and added that broader work on packaging data for analysis would be useful too.
"Scientists need to be able to connect to the Grid, take their data and find the computing resource they need to crunch it without having to worry about preparing the data, splitting it up and so on," he said.
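The fan-out Mr Wiersma describes can be sketched in miniature: split a dataset into chunks and hand each chunk to a pool of workers. This is an illustrative toy, not JIVE's or Cardiff's real pipeline; `analyse` stands in for whatever number-crunching a chunk needs.

```python
# A minimal sketch of Grid-style work splitting: chop the data into
# chunks and process them concurrently on a pool of workers.
from concurrent.futures import ThreadPoolExecutor

def analyse(chunk):
    # placeholder for the real per-chunk analysis step
    return sum(chunk)

def grid_run(data, workers=4, chunk_count=8):
    size = max(1, len(data) // chunk_count)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(analyse, chunks))

results = grid_run(list(range(100)))
print(sum(results))
```

The point of the sketch is the shape of the work, not the arithmetic: the combined result of the chunks matches what a single machine would compute, but the chunks can be crunched wherever spare capacity exists.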