Disclosure and diversity are the key to trust for tomorrow's computers, says technology critic Bill Thompson.
Sony protected CDs by artists such as Alicia Keys with controversial code
There is much to be said in favour of a rigorous procedure for testing, authorising and checking all computer programs, especially any driver software that is installed with high levels of privilege as a component of the operating system.
Such a testing regime, backed by a code signing mechanism that used strong cryptography and gave the operating system access to a hardware-based authentication mechanism, would make it almost impossible for spyware, adware and other malicious programs to install themselves surreptitiously.
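In outline, the check is simple: before a driver loads, the operating system verifies a cryptographic signature over the driver image and refuses to load anything that fails. Real schemes use asymmetric keys anchored in hardware and certificate chains; the sketch below uses Python's standard-library hmac with a shared secret purely to illustrate the verify-before-load step, and every name in it is illustrative rather than taken from any real system.

```python
import hashlib
import hmac

# Illustrative only: real code signing uses asymmetric keys held in
# hardware, not a shared secret embedded in software.
SIGNING_KEY = b"vendor-secret"

def sign_driver(driver_bytes: bytes) -> bytes:
    """Vendor side: produce a signature over the driver image."""
    return hmac.new(SIGNING_KEY, driver_bytes, hashlib.sha256).digest()

def install_allowed(driver_bytes: bytes, signature: bytes) -> bool:
    """OS side: refuse to load any driver whose signature fails to verify."""
    expected = hmac.new(SIGNING_KEY, driver_bytes, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

driver = b"example driver image"
sig = sign_driver(driver)
print(install_allowed(driver, sig))           # authorised driver loads
print(install_allowed(driver + b"!", sig))    # tampered driver is rejected
```

A surreptitiously modified driver changes the image bytes, so its signature no longer verifies and the load is refused, which is precisely what would have blocked a hidden copy-protection driver from installing itself unnoticed.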
It would also have saved Sony from the debacle in which two copy-protection mechanisms distributed on some Sony BMG audio CDs opened up Windows to remote attack and, by hiding themselves using techniques normally favoured by hackers and virus writers, caused a PR disaster from which the company has yet to recover.
It would also have forced PC games publishers who want to stop their discs being copied to have their copy-protection software validated to work properly.
As a result the developers of StarForce, a copy protection system shipped with games like King Kong, D-Day and Sniper Elite, would not face online protests from customers who find that the "hidden" driver that stops them copying game discs can also degrade CD/DVD performance and cause hardware errors.
StarForce denies its software causes any hardware problems.
And it would provide a simple way for concerned parents to make their children a little safer online.
In fact, such a system already exists. The Trusted Computing Group has been working for some years on a hardware-based system and it is already built into the motherboards of many Intel-based computers, though we'll have to wait for Windows Vista before most people can use its features.
So can we look forward to a more secure future?
If only life were so simple: the technology to improve the security and stability of our computer systems through hardware-based authentication and code signing is on its way, but it may end up making things a lot worse for users.
Unless we are careful, the tools which could make us a lot safer and give us more power over what we do with the hardware we own and the software we license - few programs are actually "sold"; even free software comes under a licence - will instead be used to take control away from us.
At the moment the companies behind trusted computing do not trust their customers at all.
They want to use digital rights management to control what we can do with content we have purchased, they want to make sure we don't install programs or new hardware that they haven't approved, and they want to be able to monitor our use of the expensive computers we own.
This has to change. The focus must be on us trusting them not to take away our rights under copyright law or our ability to do what we want with our property. It should be about users being reassured that personal information is not being squirreled away and hidden software isn't being installed.
I have a very nice car, and I try to take good care of it. It runs on petrol, but I want the freedom to fill it up with diesel and destroy the engine. It's my engine, after all.
The same goes for my computer. I want the freedom to write, compile and run my own code, take risks with dodgy software I've downloaded from the net and even break the law and risk prosecution by playing unlicensed music or running cracked software.
Partly this is because I value my freedom, but surely it also makes me a better person if I have the choice and exercise it responsibly, whether I'm driving or surfing the web?
The core technologies which underpin the trusted computing model have a lot to recommend them, but we need to change the way that they are being deployed, and this is going to require action on the part of governments, not just pressure groups and online hacktivism.
First, we need full disclosure when programs are installed or modified, or when access to the internet is sought. At the moment my firewall tells me when programs try to get online, so I can block attempts by programs to "phone home" if I'm not happy with them.
And on my Windows box, Microsoft's anti-spyware does a reasonable job of telling me when a program is updated; so I'd know if spyware was being installed.
Music makers and game firms are trying to limit illegal copying
But this needs to go further, perhaps with a legal requirement on the part of all software vendors to have an impact statement that is displayed on installation and when a program first runs or is accessed.
This could be modelled on the financial services disclaimers, but written in non-technical language. So instead of a click through licence that no-one ever reads, the XCP driver installed by Sony CDs would say "this program will hide itself so you can't uninstall it easily. It will change the way your CD works..." and so on.
The second thing we need is diversity when it comes to code signing. If my computer is set to run only signed software or read only signed documents, then who can sign what becomes far more than a matter of technology: it becomes a political issue.
We must not settle for a closed platform which allows the hardware vendor or the operating system supplier to decide, so it is time for governments to intervene and to ensure that we have an open marketplace for code signing.
The simplest way to do this is to give the process a statutory backing and then issue licences, just like we do for many professional and financial services.
Nobody objects to licences for medicines, so why not licences for code signing? Microsoft and Google could have one, but so could the Free Software Foundation and Cambridge City Council, provided they met the standards.
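Technically, an open marketplace for signing just means the platform trusts a register of licensed signers rather than a single vendor key. Continuing the earlier simplified shared-secret style of sketch - a real scheme would publish per-signer public keys under the statutory licence, and all names here are hypothetical - the check becomes:

```python
import hashlib
import hmac

# Hypothetical register of licensed signers; in a real scheme each entry
# would be a public key published under the statutory licensing regime.
LICENSED_SIGNERS = {
    "example-vendor": b"vendor-key",
    "example-foundation": b"foundation-key",
}

def sign(signer_id: str, code: bytes) -> bytes:
    """A licensed signer signs a program with its own key."""
    return hmac.new(LICENSED_SIGNERS[signer_id], code, hashlib.sha256).digest()

def verify(signer_id: str, code: bytes, signature: bytes) -> bool:
    """The platform accepts code signed by ANY licensed signer,
    not just the hardware or operating system vendor."""
    key = LICENSED_SIGNERS.get(signer_id)
    if key is None:
        return False  # unlicensed signer: reject
    expected = hmac.new(key, code, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

program = b"my own compiled code"
print(verify("example-foundation", program, sign("example-foundation", program)))
print(verify("unknown-signer", program, b"forged"))
```

The point of the design is that adding a newly licensed signer only means adding an entry to the trusted register; no single company decides whose code may run.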
This is not about government interference, it is about using the power of the state to ensure a free and fair market for what will be a key service in five years' time. Without it we will all find ourselves having to comply with whatever rules and regulations are agreed by the big corporate players.
This doesn't solve all of the problems with what the Trusted Computing Group wants to do, but I believe it goes a long way. If we want to be able to sign our own code and authenticate our own Linux kernels or Basic code in five years' time, then we need to act now.
Bill Thompson is a regular commentator on the BBC World Service programme Go Digital