By Mark Ward
BBC News Online technology correspondent
It is not just the music industry that has a problem with file-sharing.
Net service providers and network managers are struggling to cope with the deluge of data that peer-to-peer systems can generate.
Many are adopting tools that limit how much of a network's capacity file-sharing systems can sequester.
Some organisations are imposing daily limits on how much people can download. Persistent offenders who regularly exceed their quota are being punished with long-term download limits.
Ever since the emergence of Napster, file-sharing systems, which allow people to search for and download files from other net users, have been hugely popular.
In May this year Kazaa became the net's most downloaded program, with more than 230 million copies downloaded so far.
But the huge amount of traffic that Kazaa and other peer-to-peer systems generate is swamping the networks of many organisations.
Many organisations based their whole business around particular patterns of use, said Matthew Finnie, vice president at European backbone supplier Interoute.
The typical pattern of human web browsing behaviour involves visiting a few websites, receiving and sending a lot of e-mail, downloading a few files, some net-based chatting and perhaps some online gaming.
But, said Mr Finnie, peer-to-peer systems turn expectations of demand on their head because they involve machines talking to machines constantly.
File-sharing systems consume as much bandwidth as they can get, sending large amounts of data flying back and forth as people search for the files they want and start to download them.
The biggest files on peer-to-peer systems are hundreds of megabytes in size and users are happy to stay online for hours and constantly use the full bandwidth available to download them.
Many file-sharing systems organise around so-called "super-hubs" which host lots of files on a fast server with high-speed net access.
For example, bandwidth congestion at the University of Texas was found to be due to 2% of its users consuming 50% of its bandwidth. The University of Illinois at Urbana-Champaign found that the top 10% of users were generating 58% of net traffic.
"Here you are dealing with something that can seriously distort what you think your network needs," said Mr Finnie.
This caused problems because of the way that net firms and other organisations bought their bandwidth, said Mr Finnie.
Organisations that have a fixed upper limit on their bandwidth may find that everything slows down, as any traffic beyond that capacity is put in a queue.
Other organisations that buy extra bandwidth on demand can find they are always using more than they expect largely because of file sharing traffic.
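The queueing effect described above can be sketched in a few lines. This is an illustrative model, not any vendor's implementation: traffic arriving at a link with a fixed purchased capacity is queued, and each tick only as many bytes as the cap allows actually get through, so everything behind them waits.

```python
from collections import deque

class CappedLink:
    """Toy model of a fixed-capacity link: traffic beyond the
    purchased capacity waits in a queue rather than being sent."""

    def __init__(self, capacity_bytes_per_tick):
        self.capacity = capacity_bytes_per_tick
        self.queue = deque()

    def offer(self, packet_bytes):
        # All arriving traffic joins the queue.
        self.queue.append(packet_bytes)

    def tick(self):
        """Send packets until the per-tick byte budget runs out."""
        budget = self.capacity
        sent = []
        while self.queue and self.queue[0] <= budget:
            pkt = self.queue.popleft()
            budget -= pkt
            sent.append(pkt)
        return sent

link = CappedLink(capacity_bytes_per_tick=1500)
for size in (500, 600, 700, 800):   # 2,600 bytes offered in one tick
    link.offer(size)
first = link.tick()    # [500, 600] - only these fit under the cap
second = link.tick()   # [700, 800] - drain on the next tick
```

The point of the sketch is that nothing is lost outright; the excess simply arrives late, which is why every user of the link perceives a slowdown when file-sharing traffic fills the pipe.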
Mr Finnie said Interoute was now offering customers tools that let them inspect the packets of data crossing their network to pick out the peer-to-peer traffic.
This will help them limit how much bandwidth such applications can grab for themselves.
Firms like Packeteer and NetReality offer network management systems that let technology staff spot file sharing traffic and keep it under control.
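A crude first approximation of such traffic spotting can be done by matching flows against the default ports of popular file-sharing networks (FastTrack/Kazaa on 1214, Gnutella on 6346-6347, eDonkey on 4662). The sketch below is hypothetical and much simpler than commercial products, which also inspect packet payloads because file-sharing clients can hop to arbitrary ports.

```python
# Known default ports for popular file-sharing networks.
P2P_PORTS = {1214, 6346, 6347, 4662}   # FastTrack, Gnutella, eDonkey

def p2p_share(flows):
    """Fraction of total bytes on known file-sharing ports.

    `flows` is a list of hypothetical (src_port, dst_port, bytes)
    records, standing in for whatever a real monitor collects.
    """
    total = sum(b for _, _, b in flows)
    p2p = sum(b for s, d, b in flows if s in P2P_PORTS or d in P2P_PORTS)
    return p2p / total if total else 0.0

flows = [
    (51000, 80, 40_000),      # ordinary web browsing
    (51001, 1214, 300_000),   # Kazaa download
    (6346, 52000, 160_000),   # Gnutella upload
]
round(p2p_share(flows), 2)   # 0.92
```

Once flows are classified this way, a network manager can cap the share of the link that the file-sharing category is allowed to use, leaving the rest for web, e-mail and other traffic.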
Other organisations are using different methods to tackle the growth of file sharing.
For some time North Dakota State University has used a system that gives its students a daily quota of how much they can download.
The institution allocates 600 megabytes of network usage to each user every day. Average users consume about 100 megabytes.
The quota includes how much someone downloads from the net as well as how much is downloaded from them.
Anyone exceeding this daily quota is placed on a lower speed connection they must share with other heavy users.
North Dakota also has a "probation" system which gradually gives these offenders more and more bandwidth over a few days. Persistent offenders may find themselves permanently on probation.
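The accounting behind a scheme like this is straightforward. The sketch below is a minimal illustration, not North Dakota State's actual system: traffic in both directions is charged against the 600-megabyte daily allowance, and anyone over the limit is flagged for the slow lane.

```python
DAILY_QUOTA = 600 * 1024 * 1024   # 600 MB a day, as at North Dakota State

class UserAccount:
    """Illustrative quota tracker: uploads and downloads both count,
    and quota-busters are flagged for a throttled connection."""

    def __init__(self):
        self.used = 0
        self.throttled = False

    def record(self, downloaded, uploaded):
        # Bytes served to other users count just like bytes fetched.
        self.used += downloaded + uploaded
        if self.used > DAILY_QUOTA:
            self.throttled = True

    def new_day(self):
        self.used = 0
        # A probation scheme would restore bandwidth gradually over
        # several days here, rather than clearing the flag at once.
        self.throttled = False

user = UserAccount()
user.record(downloaded=550 * 1024 * 1024, uploaded=100 * 1024 * 1024)
user.throttled   # True: 650 MB of combined traffic exceeds the quota
```

Counting uploads as well as downloads matters for file sharing in particular, because a popular file can cost its host far more bandwidth than the host ever spent fetching it.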