As phones get smarter, keeping kids safe gets harder
Mobile content regulation won't keep our kids safe, says regular commentator Bill Thompson.
When the MPs on the House of Commons Select Committee on Culture, Media and Sport called for greater control over online content to protect young people from exposure to inappropriate material they were roundly criticised for failing to understand how the internet works.
Suggestions that content-hosting sites like YouTube and Flickr should review material before it was posted were especially ridiculed. Observer columnist John Naughton pointed out that at Flickr, "uploads have been between 1,400 and 4,500 images a minute", making the task somewhat less manageable than the committee seemed to realise.
But a couple of weeks later telecoms regulator Ofcom has agreed that content delivered to mobile phones should continue to be restricted. It suggested that although the current self-regulatory scheme managed by the Independent Mobile Classification Body is working, it could be made a bit stronger in some ways.
Yet nobody is writing angry blog posts about how foolish Ofcom is, or how the mobile operators are taking away our online freedoms by limiting access to some types of material. In fact, the general view is that Ofcom and the mobile industry are doing it right, and the system just needs to be tweaked.
The reason is that the filters and controls Ofcom is talking about only apply to content that is provided by the mobile operators themselves, either directly or through third party providers they have done deals with.
Age-based filtering of these games, videos and news services was introduced four years ago, when content providers such as Bango realised that self-regulation in this area was not only politically astute but also morally the right thing to do.
They persuaded the mobile operators to sign up to a scheme through which providers flag the "adult" content they supply, with networks restricting access unless the phone is registered to someone over 18.
It's not a perfect system: contract phones are unrestricted, so parents who give them to their kids need to tell the network operator to apply the block. But it has certainly been effective in keeping the mobile operators out of trouble with the regulator.
And of course it doesn't apply to "user-generated" content such as photos taken with phones and exchanged by MMS and Bluetooth, so the mobile safety advice from children's charities like Childnet International is still needed.
But it provides a model for reasonably sensible content management that seems to offer a good degree of protection without constraining other users.
Unfortunately the guidance comes from a time when most mobiles were not up to running a proper web browser, and the "mobile web" meant access to the relatively small number of websites inside the network providers' own "walled garden".
The image quality available on the average phone would have left even hardcore pornographic images as an exercise in imaginative interpretation of a pixellated blur, if they were accessible at all.
Filtering systems have been shown to work poorly with web content
With today's smartphones these constraints have gone, and as the differences dissolve and every phone turns into just another network node it may be difficult to retain the good aspects of the mobile content model.
The Mobile Broadband Group's code of practice recognises the problem, noting that "mobile operators have no control over the content that is offered on the internet" and agreeing only to "offer parents and carers the opportunity to apply a filter to the mobile operator's Internet access service so that the Internet content thus accessible is restricted".
But as we know from many years of painful experience, web filtering does not work. The filters either let through material that we would like blocked or, far more often, block material that is perfectly acceptable but happens to disagree with the political, cultural or religious standards of the companies behind the chosen filtering tool.
As we move from mobile content provided by the network operators to the content available over the internet, all the carefully devised plans break down, and we are left with exactly the same problems that face parents who give their children laptops or let them use home computers. There are filters available, but they do not do the job, yet turning them off means our children can look at sites we may not want them to see.
History seems to demonstrate that the way things work on a handset doesn't easily transfer across to the fuller browsing experience, as we've seen when it comes to paying for data and content.
Millions of us pay over the odds for the few bits that make up a text message, or the odd megabyte of e-mail downloaded to the phone, and the ringtone market was, for a while, a classic example of how to create value from nothing. Yet we refuse to shell out for even the best online news service.
When it comes to content filtering the problem is not that the audience won't pay, it is that all of the many arguments against regulating online content start to apply to mobile access.
Ofcom may just have given its approval to a system that will soon become irrelevant to the vast majority of mobile users, leaving parents still with no effective way to manage their children's access to online material.
Bill Thompson is an independent journalist and regular commentator on the BBC World Service programme Digital Planet.