Coders lose their way in bid to hide needle in Haystack

WIRED: A software project aimed at outwitting Tehran’s authorities after the last election has run into trouble, writes DANNY O’BRIEN

THE STORY of Haystack was a simple one. On the night of the Iranian election, Austin Heap, a young San Francisco coder, followed the dramatic stories of the poll on Twitter. As the unrest grew, the Iranian government locked down the country’s internet, blocking almost all foreign sites.

Heap helped his new Iranian friends by manually compiling and publishing lists of proxies: computers run by sympathetic volunteers outside Iran that could be used to bypass the Iranian government’s internet blockade.

But proxies were being blocked by Iran as quickly as Heap could find them. He began brainstorming a more sustainable solution with a young coder he met online called Daniel Colascione.


What if they built a program that took the encrypted traffic to hundreds of their own proxies and disguised it as normal innocent internet traffic – bland visits to weather.com, and the like?

The Iranian authorities wouldn’t be able to identify the proxies from normal traffic, delaying their discovery. They called their software project Haystack, as in hiding a needle in a haystack.
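Haystack’s actual protocol was never published, so any detail beyond “disguise encrypted traffic as innocent traffic” is guesswork. But the general idea can be sketched in miniature: hide an encrypted payload inside what looks like an ordinary web request to a bland site, so a censor watching the wire sees only routine browsing. The host name, paths and cookie field below are all invented for illustration.

```python
import base64
import os
import re

def wrap_in_cover_traffic(ciphertext: bytes) -> bytes:
    """Hide an encrypted payload inside what looks like an ordinary
    HTTP request to an innocuous site. Purely illustrative: Haystack's
    real disguise technique was never made public."""
    token = base64.urlsafe_b64encode(ciphertext).decode()
    request = (
        "GET /forecast/today HTTP/1.1\r\n"
        "Host: weather.com\r\n"
        "User-Agent: Mozilla/5.0\r\n"
        f"Cookie: session={token}\r\n"  # payload rides in a bland-looking cookie
        "\r\n"
    )
    return request.encode()

def unwrap_cover_traffic(request: bytes) -> bytes:
    """What the proxy does: pull the hidden payload back out of the
    cover request and decode it."""
    match = re.search(rb"Cookie: session=([A-Za-z0-9_=-]+)", request)
    if match is None:
        raise ValueError("no hidden payload found")
    return base64.urlsafe_b64decode(match.group(1))

payload = os.urandom(32)  # stand-in for a chunk of real encrypted traffic
cover = wrap_in_cover_traffic(payload)
assert unwrap_cover_traffic(cover) == payload
```

The weakness of any such scheme is exactly the one the column goes on to describe: if the disguise pattern itself becomes known, a censor can fingerprint it and unmask every proxy at once.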

Heap announced the project and began seeking donations to support his work. I met him. At the time, I worked for the digital rights group the Electronic Frontier Foundation. The technologists there wanted to know more about Haystack because we were looking for software to recommend to tech-savvy dissidents in situations just like Iran’s.

Heap was friendly, but not forthcoming. He seemed worried that revealing any details might weaken Haystack’s security.

That worried me. There’s a rule that secure software designers use to avoid building their code on shaky foundations. Coined by information theory pioneer Claude Shannon, it is called “Shannon’s Maxim” and it states, simply, “the enemy knows the system”. You build your software assuming that the enemy knows everything about its design. If you don’t keep that discipline, your whole system becomes broken the moment anything about it leaks out. And designs cannot be kept secret for long.

I urged Heap to consult other security experts so that he could ensure that Haystack did not have bugs that would put his users at risk. Secrets would not be enough.

Over the following months, many others in the security community told me they had made the same offer of assistance, only to be similarly rebuffed. Not only did he not want the enemy to know the system, he seemed to want to hide it from potential friends, too.

But the publicity he was receiving meant the project was anything but secret. It seemed a strange contradiction. Despite the continuing series of glowing reviews of Haystack in the American press, describing it as a brilliant tool for dissidents, no software was available for download. Heap continued to give interviews and publicly ask for donations and USB drives, even with no software to show.

Finally, when Haystack got a glowing cover story in Newsweek last month, the dam cracked. Activists began to complain publicly.

I met Heap again, and finally saw working software. It seemed very primitive. He explained that development on Haystack had been delayed for months by legal export red tape but now it was being used successfully by “under 100” Iranian testers.

A contact in the security community, Jacob Appelbaum, called me. He had read my notes on my conversation with Heap, and some problems with Haystack’s apparent design had occurred to him.

When I told Heap that Appelbaum knew of problems with Haystack, Heap immediately shut down the service. Then Colascione joined the public conversation. He said he was the only one who had written any of the Haystack code, but that he had left the project months previously over disagreements about the secrecy and marketing.

Heap had kept this from me and others – it certainly explained the primitive nature of the code I had seen. But what else was being kept secret?

Then Appelbaum obtained a copy of Haystack from someone whom Heap didn’t know. It was the program Heap thought only his testers had. Another secret leaked.

Things went rapidly downhill.

First, Appelbaum got Haystack to work, even though Heap had claimed he had shut down the service. Worse, the system Colascione and Heap had dreamed of was very different from the software being distributed.

The test Haystack did work to circumvent Iranian censorship, but there were shortcuts in its design that made it far more dangerous for dissidents to use than other circumvention software.

Worse, neither Appelbaum nor I could work out how to explain what was wrong with the software without giving away exactly the details that the Iranian government could use to exploit the flaws.

We were obliged to become like Haystack’s creators, staying vague and secretive while hoping no one discovered what we knew about the code’s problems. We knew that if “the enemy” really did know the system as well as we did, people’s lives might be at risk.

But we thought we’d seen enough that we should warn people not to use it.

How can you publicise software and still keep it secret? How do you keep flaws secret, but still let the world know about them?

Now mired in this awful situation myself, I can only recommend to young programmers intent on doing good in the world while keeping their users safe that, as Claude Shannon warned us, they never depend on keeping things secret.

It won’t work for the Iranian censors in the end; and it doesn’t work for those fighting them either.