After Microsoft announced it would work with the UK National Archives to help open old digital document formats, Georg Greve and Joachim Jakobs, of the Free Software Foundation Europe, question the US giant's motives.
Are we in danger of losing meaning in our inscriptions?
Today's customers drive the technological development of tomorrow. This insight is common sense.
But when customers pay one and the same company first for creating a problem and then again for solving it, most people would expect those customers to be dissatisfied. Yet at least some seem to be pleased.
The problem: Microsoft dominates the desktop and office market with a share of more than 90%. Every document stored in its proprietary binary formats, and especially every document shared between multiple people, strengthens the monopoly and harms competition, the economy and society as a whole.
The more widely these formats are used, the stronger the network effect forcing others into the same dependency - just as happened to the UK National Archives.
What happened: Microsoft asked the UK National Archives to invest in a solution that would grant access to their legacy data.
Only last week BBC News reported on Mr. Gordon Frazer, managing director of Microsoft UK, who voiced concern that customers could lose their own data: "Unless more work is done to ensure legacy file formats can be read and edited in the future, we face a digital dark hole."
This is a surprisingly honest statement from a company that is the largest provider of incompatible and undocumented legacy file formats in the world.
The best solution Microsoft can apparently offer is to "emulate" the old versions of Windows under the current version of Windows Vista.
Indeed, some libraries and museums may want to convey an impression of previous ages of computing, and not all of them will want to offer the fully authentic experience of running the original software on period hardware to recreate the "look and feel" of bygone times.
But are the UK National Archives primarily a museum dedicated to preserving the original experience of ages and technologies long past? Or are they focused on archiving the knowledge, thoughts and ideas of the generations we build upon?
The broad audience may not want to read Caesar in the handwriting of a particular scribe on the original clay tablets or skin.
Images of them would normally be sufficient, although most people would prefer a transcription on paper or screen.
Even more people are probably served best by a good translation. File formats are the equivalent of the transcription: they encode the original writing into a form for storage.
This idea is not new. Humankind has always sought to preserve its knowledge, as is documented by clay tablets, scrolls and cave paintings of ages long past.
But while the storage medium can last for a very long time, sometimes the meaning is lost because the key to the information is lost.
In modern terms: We no longer know the encoding used for the cave paintings.
Digital information could potentially be stored without loss of quality for a very long time to come.
But without knowledge about the encoding, our documents will become a meaningless series of ones and zeroes to future generations, just like cave paintings are too often meaningless bits of colour on stone to us.
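The point can be made concrete with a few lines of code. The sketch below (our illustration, not from the original article) takes a name stored long ago as Latin-1 bytes and shows what happens when the encoding - the "key" - is remembered, guessed wrongly, or lost entirely:

```python
# Sketch: identical bytes mean different things under different encodings.
# If only the bits survive and the encoding is forgotten, the text is lost.
data = "Händel".encode("latin-1")  # text stored long ago as Latin-1 bytes

print(data.decode("latin-1"))      # correct key: 'Händel'
print(data.decode("cp437"))        # wrong key: readable-looking mojibake
print(data.hex())                  # no key at all: opaque ones and zeroes
```

The bytes themselves are perfectly preserved in all three cases; only the first reading recovers the meaning.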
The best way to preserve the encoding is to spread it as far as possible, to make it a public good that is preserved with at least the same diligence as the encoded information itself.
At best, there is currently exactly one company that knows how its proprietary legacy file formats are implemented.
If Microsoft had used Open Standards from the moment it was founded in 1975, this problem would not exist.
In fact, users of GNOME Office, KOffice or OpenOffice.org would have no problem reading documents written by users of Microsoft (MS) Office.
As it is, the stability of the encoding completely depends on the future existence and behaviour of one company.
Thanks to the co-operation of many companies that find themselves in strong competition, but understand the necessity of preserving the encoding, there is an Open Standard for office documents: the "OpenDocument format" (ODF), which is maintained and further developed by OASIS, an international e-business standardisation organisation, and has been certified by the International Organisation for Standardization (ISO).
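What "open" means in practice: an OpenDocument file is an ordinary ZIP archive whose document body sits in a plain XML file called content.xml, so it can be read with standard tooling and no vendor software. The sketch below (a deliberately minimal illustration - a real .odt also carries styles, metadata and a mimetype entry) builds and reads such a container using only the Python standard library:

```python
# Sketch: an OpenDocument text file is a ZIP archive of plain XML parts.
# Minimal illustration only -- a real .odt contains additional entries.
import zipfile
import xml.etree.ElementTree as ET

# Build a toy ODF-style container holding just the content part.
with zipfile.ZipFile("demo.odt", "w") as z:
    z.writestr(
        "content.xml",
        '<?xml version="1.0"?>'
        '<office:document-content '
        'xmlns:office="urn:oasis:names:tc:opendocument:xmlns:office:1.0" '
        'xmlns:text="urn:oasis:names:tc:opendocument:xmlns:text:1.0">'
        "<office:body><office:text>"
        "<text:p>Hello, archives!</text:p>"
        "</office:text></office:body>"
        "</office:document-content>",
    )

# Read it back with nothing but standard ZIP and XML tools.
with zipfile.ZipFile("demo.odt") as z:
    root = ET.fromstring(z.read("content.xml"))

ns = "{urn:oasis:names:tc:opendocument:xmlns:text:1.0}"
paragraphs = [p.text for p in root.iter(ns + "p")]
print(paragraphs)  # -> ['Hello, archives!']
```

Because the container and the markup are publicly specified, any programmer in 4007 with a ZIP and XML library could do the same.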
Microsoft has said it has its own open format, called MS-OOXML. But there are serious doubts whether MS-OOXML can be considered an Open Standard: like a Russian doll, it wraps a number of legacy formats, such as "Word95" or "Word6", which are not publicly documented and can only be implemented by Microsoft.
Another issue is that MS-OOXML may be subject to patent claims. Ultimately, the development of the format depends entirely on the continued existence of one company. Can we bet our future on Microsoft still existing in 4007?
The impact of such dual standards was recently explained by Open Forum Europe, a business association with members such as Fujitsu Siemens, Hewlett Packard, IBM, Intel, Novell and Sun.
Their conclusion was to back ODF: "Multiple Open standards in the area of Interoperability are unwelcome, costly and impractical for both users and suppliers, and will be rejected by the market."
The public needs to understand this: as long as only Microsoft can write software that makes full use of the predominant office file format, Microsoft will remain the predominant vendor, for lack of alternatives and competition.
In order to make MS-OOXML the predominant file format, Microsoft is now seeking approval through ISO for its format, expecting its market dominance and global lobbying efforts to coerce a sufficient number of national standardisation bodies into approving MS-OOXML at ISO.
We have laid down six questions we want Microsoft to answer - but the key one is this: Why did and does Microsoft refuse to participate in the existing standardisation effort?