Wired on Friday

Keeping the back door shut

Danny O'Brien

Last week, Microsoft emphatically denied something most of us had never even suspected. There is, announced Niels Ferguson, a developer and cryptographer at the company, no government backdoor in Vista, the next version of Windows. In other words, if you use the system's secure BitLocker feature to protect your data, it will be as difficult for the authorities to decode as it is for anyone else.

Quite why anyone would even suspect Microsoft of such a trick is a somewhat haphazard tale. The rumour arose from the comments of Ross Anderson, a noted British security researcher, to the British parliament's home affairs select committee. The BBC reported that he had suggested that the new version of Windows had encryption so powerful that it would pose problems for the authorities, and that the government should discuss with Microsoft ways of bypassing it for law enforcement purposes.

The truth is rather more complicated - both in the point of Anderson's testimony and in the real state of secrecy on your home computer. In fact, for close on 15 years now, modern computers have, in theoretical terms at least, been able to hide data so well that governments cannot break the protection without discovering the original password.
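To see why "theoretically unbreakable" is no exaggeration, a back-of-the-envelope sketch helps. The figures below are illustrative assumptions, not from the column: a 128-bit key of the kind modern ciphers use, and a wildly generous attacker who can test a trillion keys every second.

```python
# Illustrative arithmetic only: how long a brute-force search of a
# modern cipher's key space would take. Both numbers are assumptions
# for the sake of the sketch.
keyspace = 2 ** 128                # possible 128-bit keys
guesses_per_second = 10 ** 12      # hypothetical, very fast attacker
seconds_per_year = 60 * 60 * 24 * 365

years_to_search_all = keyspace / (guesses_per_second * seconds_per_year)
print(f"{years_to_search_all:.2e} years to try every key")
# on the order of 10**19 years - vastly longer than the age of the universe
```

Which is why, absent the password, the authorities are simply out of luck.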

This fact has remained an awkward one for states to accept, and has been the subject of much hand-wringing and wrangling by politicians and bureaucrats ever since. The United States tried to class such powerful encryption as a munition, to prevent such awesome power from falling into the wrong hands.


The British government, over the objections of experts such as Anderson, has passed laws making it a crime not to reveal your password to encrypted data.

Nonetheless, despite the best efforts of governments to hamper such encryption, its use has spread far and wide. It protects your financial transactions online and it encodes your private mobile phone calls.

While the encryption itself remains theoretically unbreakable, most government law enforcement experts realise that there are simpler ways to get at computer data than attempting to untangle its codes head-on. Passwords can be easily guessed. Traces of the decoded data may be left on machines. In the very worst of cases, and under warrant, extra software may be added to a suspect's machine to reveal the passwords and programs they use to conduct their nefarious work.
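The first of those shortcuts is worth spelling out. If an attacker holds a hash of the password, guessing from a list of common words sidesteps the cipher entirely. A minimal sketch, with an invented password and word list:

```python
import hashlib

# A toy dictionary attack: the word list and the "suspect's" password
# here are invented examples, not real data.
wordlist = ["letmein", "password", "qwerty", "dragon"]
stored_hash = hashlib.sha256(b"qwerty").hexdigest()  # hash of the weak password

def guess_password(stored_hash, wordlist):
    """Hash each candidate word and compare against the stored hash."""
    for word in wordlist:
        if hashlib.sha256(word.encode()).hexdigest() == stored_hash:
            return word
    return None

print(guess_password(stored_hash, wordlist))  # prints "qwerty"
```

No amount of cipher strength helps when the key itself is a word in the dictionary.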

No one would suggest introducing such backdoors into commercial software. Well, not any more, at least. There have always been rumours that early software had such code, added at the behest of the US government. But no evidence of such trickery has ever been found, and that absence is telling, because such evidence would be almost impossible to hide. Modern computer software can be heavily obfuscated, but it is nonetheless relatively easy to decode and understand in parts.

At worst, you can learn through doing: by running a computer program and watching it for telltale behaviours. If a program was, for instance, recording your every key stroke and sending it to spooks at the US National Security Agency (NSA), a competent software analyser would be able to spot it, either in the traffic it sent over the internet, or in the steadily piling store of keypress data it would have to squirrel away in its memory.

Such trickery can be hidden, but it will be hidden in plain view.

Microsoft themselves suffered a terrible publicity blow a few years ago when software analysts spotted a data structure called "NSAKEY" in the company's standard encryption code.

As it turned out, this was not a secret skeleton key inserted by the NSA, but a key named for the agency's role in reviewing compliance with the US's own encryption export standards.

The truth is that - affidavit or not - Microsoft software cannot hide that much, because it is publicly viewable once bought.

No, when spies and spooks need to insert backdoors into computers, they do it the old-fashioned way, with a cloak-and-dagger break-and-enter. Or, at their most sophisticated, they exploit some of the accidental security flaws that most pieces of modern software suffer from.

What Anderson describes is an escalation in the war between law enforcement and our increasingly invulnerable machines. The latest versions of Vista will protect customer data using a technology known as Trusted Computing, a system that, ironically, is particularly untrusting. Under these conditions, not even the government will be able to insert additional code to monitor or compromise criminals, once again limiting its avenues of attack.

Anderson has repeatedly warned against Trusted Computing, for perhaps the same reason as the governments of the world might find it attractive. As part of its design, Trusted Computing also locks the user of the computer out of its security system. Under certain conditions, it can run code without either user or government being able to decode or predict what the code might do.

This part of Trusted Computing's design has been mainly put in place so that software and media creators can copy-protect and tamper-proof their products. But it also has the potential to hide such tricks as government backdoors and watchdogs. Trusted Computing may be too hard for the government to break, but it's also the perfect place to insert government software without leaving it in open view. Anderson may be somewhat sly in his suggestion. Generally concerned by the risks of Trusted Computing, he may realise that it will be too tempting a target for law enforcement.

And with government backdoors in place, it will also be effective.

Danny O'Brien is activism co-ordinator at the Electronic Frontier Foundation