Programmers need to write in safer languages
Careless programmers need to be less like fighter pilots and more like responsible pilots, argues technology analyst Bill Thompson.
Anyone running Microsoft Windows 2000 will have been invited to install yet another security patch this week as the company's automatic update software struggles to tell users about a hole in a core Windows component.
The problem lies in one of the dynamic link libraries that Windows uses to provide web services, and it is potentially serious because it could allow an attacker to take over a user's computer.
There are already programs available to make attackers' job easier, which strongly implies that this problem was discovered some time ago by people who decided to use it to take over other people's computers rather than report it to Microsoft.
Now that the fix is available and being installed, the number of vulnerable computers will fall, although our experience with the SQL Slammer worm showed that many users and systems administrators seem to disregard security advice and leave their systems unpatched.
Even so, there will be more security alerts in the next few days, if history is any guide, and more holes for attackers to crawl through.
And they will not just be in Microsoft software: recent security scares have featured the Oracle database management system, the Domino web server and Sun's Solaris operating system.
Nor will they only be in commercially produced software where the source code is not available to check.
Many of the biggest and most important open source projects, from Sendmail to the Linux operating system kernel itself, have had their own security alerts recently.
Even Opera, the browser of choice for those who do not trust Microsoft's Internet Explorer because of its history of security problems, has its own bugs and required a patch only two weeks ago.
The real issue is not about a particular company, or a particular program, or a particular way of developing software.
It is about the increasingly irrational decision that many programmers make to write their code in programming languages which are inherently insecure.
Most of the bugs that we have seen, including the recent Opera, Windows 2000 and Sendmail problems, have the same underlying cause: the program has been written so that an attacker can send it data in such a way that it overwrites some of its own data storage, making it crash or even run the attacker's code.
These so-called buffer overflow vulnerabilities are not common when you consider the millions of lines of code that make up a typical computer system.
But they are there because so many of our key programs are written in languages like C, which leave the programmer responsible for ensuring that data is properly checked and buffers are properly managed.
Jon Lasser, a security consultant from the US, thinks that too many of today's programmers see themselves as like fighter pilots, taking their systems to the limit.
Instead, he argues, they need to act more like commercial pilots who have to behave responsibly and consider passenger safety.
The first step to safety is to choose a tool that allows you to do the job properly and without too much risk.
Sure, I can get a screw into a piece of wood using a hammer, but the result is neither as elegant nor as secure as it would be if I used a screwdriver.
I first learned to program in a language that makes C look safe and secure - it is called BCPL and I recommend it only to the brave.
I now write in PHP and hack other people's Perl, but I am not writing safety-critical software.
In this connected world, where fewer and fewer of our computers are completely isolated from the network, responsible programming is becoming more and more important.
We will never eliminate bugs and security holes completely, but we can certainly improve on today's awful state of affairs.
Just as drink-driving has become socially unacceptable as well as being illegal, maybe we need to exert pressure on programmers to stay away from the tools and languages that allow them to make stupid mistakes, and refuse to use tools which have been developed without due care.
I can see the posters now: "Don't code in C: You know it makes sense!"
Are coders shirking their responsibilities? Do you agree with Bill?
Your criticisms are right on the money. An additional source of exposures comes from the economic pressures to get the product out as quickly as possible. Quality is seen as an add-on rather than a design assumption.
John Seal, USA
I think coders would be able to write good quality code if they did not have such tight deadlines to stick to. I realise there is a problem in the software engineering industry, but I feel this has more to do with poor management. When will companies realise there is no sense in getting people who have no technical knowledge to manage software projects? This can only lead to poor quality, which in turn leads to the security problems highlighted by Bill Thompson.
Sean Will, UK
Interesting commentary, but I think by "fighter pilots" you're mainly describing either solitary programmers, or those in small, barely professional programming groups. I guess that if one is to become a commercial pilot, one should not jump into a Sopwith Camel or a Hawker Harrier. In other words, you maybe shouldn't be using overgrown script languages or fledgling languages for commercial-grade applications. This is why Java is a Boeing 747 as far as languages go. Great language design, proven robustness, solid standards, and a great body of knowledge and experience has been amassed.
I think Bill is right in some respects, but as to writing in 'safer' languages - show me one which is inherently safe. Even using Perl's taint mechanism is still hard. Also, drink-driving is black or white. Again, show me a program which is said to be 'completely safe' compared to a program which hopes to be so.
New Zealand (nee England)
Java seems to be one of the safest and most elegant languages (provided the underlying VM is any good), but the lack of shared-hosting Java servers keeps the SME market and individual coders from being able to deploy it effectively at reasonable cost. Thus there's not the incentive to learn. The textbooks are sitting under my desk at the moment, while I hack VB.
Developing reliable software in C is akin to shaving your beard with a chainsaw. It will (roughly) work and will require continued patching. Many of the errors used by hackers to exploit security vulnerabilities involve code for managing the computer's memory. In the C language there is no support for this task: the programmer has to take care of all the tedious, tricky details and often gets it wrong. For this reason researchers and some sectors of the computing industry have developed programming languages and design methods for developing reliable software. Examples include software used for flight control systems in military and commercial aircraft. After all, the last thing you want to see on a fly-by-wire system is a blue screen. However, the use of such languages and methods requires significant changes in how we educate software developers and how industry manages such projects.
Some argue that it will also make the software development process more expensive. However, this claim is questionable since most of the cost of software development is incurred during the debugging phase. The industry standard is to write terrible quality software and then bash it into shape through a "testing" phase. It would be better to produce something that is close to working in the first place and then perfect with testing. We have languages and design methods to support this style of software development. It may even be cheaper to retrain developers to use these advanced approaches if it makes the testing phase shorter.
My 24 years of software development have made me wary of unrefined tools like C, and I now look to advanced programming languages like Haskell to help me avoid many of the errors that are commonly made by developers in C. Hopefully, as consumer demand for quality software grows, software vendors will update their development technology and methods. Even Microsoft is switching its development efforts to the C++ language, which makes memory management easier, and is also using its research labs to investigate approaches for finding the bugs in device drivers that cause the dreaded blue screen. Let's hope this trend continues. Now, where did I put my razor?
Satnam Singh, San Jose, California, USA
I wholly agree with Bill Thompson and the comments from Mr Singh, who is quite on target with both his comments and his critique of the software industry. It is no surprise that software quality is as bad as it is: the business model requires regular releases with ever more features, so no sooner are the majority of bugs found in one version than another version with more features and new, untested code must be released to sustain revenue growth for investors.
There is an almost religious reluctance to open up proprietary code for systematic third-party code auditing, meaning that security bugs are often found either by accident or by laborious reverse engineering. Combine this with hectic release schedules and the probability of finding the majority of security bugs before software goes to market rapidly approaches zero.
It doesn't matter how much effort you put into making a system secure or bug free - it's simply an impossible task. Planes crash and buildings collapse. All of those have gone through rigorous design and testing, but they can and do fail. There will always be a way to get into a system, and there will always be bugs in a system, no matter the effort.
Michael Leaver, Singapore
Bill Thompson is a regular commentator on the BBC World Service programme Go Digital.