BBC News
Last Updated: Monday, 15 January 2007, 14:19 GMT
For want of a file, the net was lost
Image caption: The quake caused widespread damage in Taiwan (AFP/Getty)
Regular columnist Bill Thompson wonders if the net will be robust enough to cope with all the calls we will make on it in the future.

One of the net's more persistent founding myths is that it was designed to survive nuclear war and to ensure that even after the bombs had fallen there would still be communications between surviving US military bases.

It isn't true, of course. The early days of the Arpanet, the research network that predated today's internet, were dominated by the desire of computer scientists to find ways to share time on expensive mainframe computers rather than visions of Armageddon.

Yet the story endures, and lies behind a generally accepted belief that the network can survive extensive damage and still carry on working.

This belief extends to content as well as connectivity. In 1993 John Gilmore, cyberactivist and founder of the campaigning group the Electronic Frontier Foundation, famously said that "the net interprets censorship as damage and routes around it", implying that it can find a way around any damaged area.

This may be true, but if the area that gets routed around includes large chunks of mainland China then it is slightly less useful than it first appears.

Sadly, this is what happened at the end of last year after a magnitude 7.1 earthquake centred on the seabed south of Taiwan damaged seven undersea fibre-optic cables.

The loss of so many cables at once had a catastrophic effect on internet access in the region, significantly curtailing connectivity between Asia and the rest of the global internet and limiting access to websites, instant messaging and e-mail as well as ordinary telephone service.

Full service may not be restored until the end of January since repairs involve locating the cables on the ocean floor and then using grappling hooks to bring them to the surface so they can be worked on.

The damage has highlighted just how vulnerable the network is to the loss of key high-speed connections, and should worry anyone who thought that the internet could just keep on working whatever happens.

File mangler

This large-scale loss of network access is a clear example of how bottlenecks can cause widespread problems, but there are smaller examples that should also make us worry.

At the start of the year the editors of the popular news website DeviceForge started getting complaints from readers that the site's RSS feed had stopped working.

RSS, or "Really Simple Syndication", is a way for websites to deliver new or changed content directly to users' browsers or to special news reader programs. More and more people rely on it as a way to manage their online reading.

The editors at DeviceForge found that the feed was broken because the particular version of RSS it used, RSS 0.91, depended on the contents of a single file, the format's document type definition (DTD), hosted on a server at www.netscape.com.

It looks as if someone, probably a systems administrator doing some clearing up, deleted what seemed to be an unneeded old file, and as a result a lot of news readers stopped working.
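To see how that kind of dependency arises in practice, consider a feed reader that validates RSS 0.91 documents against the document type definition named in their DOCTYPE line. The Python sketch below is an illustration only, not DeviceForge's actual code: the sample feed is invented, and the DTD address is the one RSS 0.91 documents conventionally point at. Any parser configured this way has to fetch that one file from Netscape's server before it will accept a feed, so deleting the file breaks every such reader at once.

```python
# Illustrative sketch: a feed reader that resolves and validates against the
# external DTD named in an RSS 0.91 DOCTYPE. The feed below is an invented
# example; the DTD URL is the one RSS 0.91 documents conventionally reference.
from lxml import etree

FEED = b"""<?xml version="1.0"?>
<!DOCTYPE rss PUBLIC "-//Netscape Communications//DTD RSS 0.91//EN"
  "http://my.netscape.com/publish/formats/rss-0.91.dtd">
<rss version="0.91">
  <channel>
    <title>Example feed</title>
    <link>http://example.com/</link>
    <description>A made-up feed for illustration</description>
  </channel>
</rss>"""

# dtd_validation=True tells the parser to fetch and apply the external DTD,
# so every parse quietly depends on that remote file staying online.
parser = etree.XMLParser(dtd_validation=True, no_network=False)

try:
    doc = etree.fromstring(FEED, parser)
    print("Feed accepted:", doc.findtext("channel/title"))
except etree.XMLSyntaxError as err:
    # Roughly what readers saw in January 2007: the remote file could not be
    # fetched (or the feed no longer validated against it), so the parse was
    # reported as broken even though the feed itself had not changed.
    print("Feed rejected:", err)
```

Non-validating parsers that ignore the DOCTYPE were unaffected; the breakage only hit software that tried to resolve the external DTD.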

Image caption: The quake severed data cables under the sea (AFP/Getty)
Having what is supposed to be a network-wide standard dependent on a single file hosted on a specific server may be an extreme case, but it is just one example of a deeply buried dependency within the network architecture, and there must be others.

This is going to get worse. The architecture of the internet used to resemble a richly connected graph, with lots of interconnections between the many different levels of network that work together to give us global coverage, but this is no longer the case.

The major service providers run networks which have few interconnections with each other, and as a result there are more points at which a single failure can seriously affect network services.
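A toy way to see why this matters, using hypothetical networks rather than real provider topologies: in a richly meshed set of networks no single link failure can split the graph, while in a sparsely interconnected chain every link is a potential single point of failure. The Python sketch below simply removes each link in turn and checks whether everything is still reachable.

```python
# A toy illustration (hypothetical networks, not real provider topologies) of
# why fewer interconnections mean more single points of failure: remove each
# link in turn and check whether the network stays connected.
from itertools import combinations

def connected(nodes, links):
    """Breadth-first search: is every node reachable from the first one?"""
    if not nodes:
        return True
    adjacency = {n: set() for n in nodes}
    for a, b in links:
        adjacency[a].add(b)
        adjacency[b].add(a)
    seen, frontier = {nodes[0]}, [nodes[0]]
    while frontier:
        nxt = []
        for node in frontier:
            for neighbour in adjacency[node] - seen:
                seen.add(neighbour)
                nxt.append(neighbour)
        frontier = nxt
    return seen == set(nodes)

def single_points_of_failure(nodes, links):
    """Links whose individual loss disconnects the network."""
    return [link for link in links
            if not connected(nodes, [l for l in links if l != link])]

nets = ["A", "B", "C", "D"]

# Richly connected: every pair of networks peers directly.
rich = list(combinations(nets, 2))

# Sparsely connected: a chain, roughly like traffic funnelling through
# a handful of backbone links.
sparse = [("A", "B"), ("B", "C"), ("C", "D")]

print("rich mesh weak links:   ", single_points_of_failure(nets, rich))    # none
print("sparse chain weak links:", single_points_of_failure(nets, sparse))  # all three
```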

Future proof

There may even be other places where deleting a single file could adversely affect network services.

If we are to avoid these sorts of problems then we need good engineers and good engineering practice.

We have been fortunate over the years because those designing, building and managing the network have cared more for its effective operation than they have for their personal interests, and by and large they have built the network around standards which are robust, scalable and well-tested.

But we need to carry on doing this and make things even better if we are going to offer network access to the next five billion users, and this is getting harder and harder to do.

In the early days the politics was small-scale, and neither legislators nor businesses really took much notice, but this is no longer the case as we see in the ongoing battles over internet governance, net neutrality, content regulation, online censorship and technical standards.

Image caption: Repairing broken undersea cables is a daunting task (AFP/Getty)
Bodies like the Internet Society, the International Electrotechnical Commission and the Internet Engineering Task Force still do a great job setting the standards, but they, like the US government-appointed Icann, are subject to many different pressures from groups with their own agendas.

And setting technical standards is not enough to guard against network bottlenecks like the cables running in the sea off Taiwan, since decisions on where to route cables or how the large backbone networks are connected to each other are largely made by the market.

The only body that could reasonably exert some influence is the International Telecommunication Union, part of the UN. Unfortunately, its new Secretary-General, Hamadoun Toure, says that he does not want the ITU to have direct control of the internet.

Speaking recently at a press conference he said: "It is not my intention to take over the governance of internet. I don't think it is in the mandate of ITU". Instead he will focus on reducing the digital divide and on cyber-security.

These are worthy goals, but they leave the network at the mercy of market forces and subject to the machinations of one particular government, the United States. If we are going to build on the successes of today's internet and make the network more robust for tomorrow we may need a broader vision.


Bill Thompson is a regular commentator on the BBC World Service programme Digital Planet


