BBC NEWS
Sci/Tech
Wednesday, 14 June, 2000, 11:39 GMT 12:39 UK
Web links that stick
[Image caption: 404: Another dead end]
By BBC News Online internet reporter Mark Ward

Broken links between web pages are like diversions on the information highway that stop you taking the quickest route to your destination.

But the frustration caused by dead links could soon be at an end as technologies are developed to ensure that links stay live.

You know when you have clicked a broken link because you get a "404 error code", rather than the web page you wanted.

The world wide web is built around the ability to hop from one page to another via a hyperlink, but as the web grows, the number of links that lead nowhere is growing too.

Knowledge gap

Hyperlinks break when the page they point to has been moved, renamed or simply deleted.

Often the people doing the renaming or moving do not know that links to that page have been broken.

"The inbound links are invisible but important because that's where a lot of your traffic can come from," said Franck Jeannin, chief executive of Linkguard, one company developing technology to keep links live.

Linkguard has spent the last few months building up a 40 terabyte database that logs all the links on the web. "It's a lot bigger than we thought originally," said Mr Jeannin.

50 ways to leave your web page

Linkguard now knows that there are on average 52 links per page on the world wide web.
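Building such a database starts with extracting every hyperlink from every page crawled. As a minimal sketch (this is a generic approach, not Linkguard's actual software), Python's standard HTML parser can collect the `href` targets of a page's anchor tags:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect the href target of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A toy page with two links; a crawler would feed real fetched HTML here.
page = '<p><a href="/a">one</a> and <a href="/b">two</a></p>'
collector = LinkCollector()
collector.feed(page)
print(len(collector.links))  # → 2
```

Run over a whole crawl, the same tally is what yields an average such as 52 links per page.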

The number of broken links is growing in line with the web: 12 months ago only 7% of links were broken, and now the figure is over 10%, said Mr Jeannin.

He said Linkguard will keep an eye on those sites that subscribe to its services and tell the sites holding in-bound links if a connection breaks.

Eventually Linkguard is planning to use discrete software programs called agents to watch links and tell the webmasters of affected sites when those links are updated or changed.

Page impression

A group of academics from the University of California, led by computer scientist Thomas Phelps, is also working on the vexing problem of broken links.

The team proposes giving every web page a "lexical" signature that is created by analysing the content on that page.

The signature is generated by choosing the words or phrases common on that page but rare on others.

Using these criteria, Mr Phelps and his colleagues claim that signatures need be only five words long.
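The idea of "common on this page, rare on others" is the same trade-off made by term-frequency / inverse-document-frequency scoring. The sketch below is an illustrative reconstruction under that assumption, not the Phelps team's actual algorithm: it scores each word by how often it appears on the page, discounted by how many other documents contain it, and keeps the top five.

```python
from collections import Counter
import math

def lexical_signature(page_words, corpus, k=5):
    """Pick the k words most distinctive of this page: frequent here,
    rare in the rest of the corpus (a TF-IDF-style score).
    `corpus` is a list of word sets, one per other document."""
    n_docs = len(corpus)
    tf = Counter(page_words)  # term frequency on this page

    def score(word):
        docs_with = sum(1 for doc in corpus if word in doc)
        # +1 smoothing so unseen words do not divide by zero
        idf = math.log((n_docs + 1) / (docs_with + 1))
        return tf[word] * idf

    return sorted(tf, key=score, reverse=True)[:k]
```

A word like "the", present in every document, scores near zero and never makes the signature; a coinage unique to the page scores highest, which is exactly what makes the signature useful as a search query.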

If the destination page disappears, a search engine that understands these signatures could search for the signature's words and relocate the page wherever it has moved.

See also:

15 May 00 | Sci/Tech
Half the internet is going nowhere
03 Aug 99 | Sci/Tech
Search engine seeks billion URLs
16 Feb 00 | Sci/Tech
Special report: The web under attack
