The private lives of Google

In the mission to organise the world’s information, can citizens’ rights be obscured by innovative engineering, asks Karlin Lillington…

ALMA WHITTEN, Google’s lead privacy engineer and European privacy liaison, has a tough job as envoy for a company that has had its share of privacy skirmishes.

But she calmly goes where many would fear to tread. She gave a thoughtful presentation at Dublin’s Institute for International and European Affairs (IIEA) recently, in which she set out the technical need for various types of information, and outlined Google’s privacy and security policies.

She did not shirk the central conundrum for Google, whose mission statement is to organise the world’s information: “Many of the best things Google does come directly from our mission statement – but also all the most difficult and challenging things around privacy come from that mission.”

In an interview afterwards, she explains how she ended up in her unusual role. “I walked a weird and twisty path to get to this place,” she laughs, noting that both she and Peter Fleischer – Google’s legal adviser on privacy – are high-school dropouts.

She was strongly interested in theatre and acting, and left high school to pursue those interests. Eventually, she circled back to do a university degree in order to make some money in a “normal” job. Her basic degree was in computer science, engineering and anthropology and, by the time she began her PhD, she found herself interested in security.

“I went to a security conference and it was sort of openly acknowledged that 95 per cent of the time when security fails it’s because humans find it too confusing. But everyone works on the 5 per cent” of solving technical problems. She talked with her adviser about doing some research on the 95 per cent problem – the “people” side.

Out of that came a now well-known paper entitled Why Johnny Can't Encrypt, which got Whitten mainstream media as well as industry attention.

“There’s a degree to which the paper made a big splash,” she says with understatement, noting the title – a change from its duller, engineering-oriented original – helped. She presented the paper at a conference, then “woke up next morning and had 250 job offers in my inbox”.

The paper pointed out that the security industry focused on the capabilities of technologies, rather than on what people actually wanted to do.

“What people care about is not, ‘is my e-mail able to be encrypted?’, but, ‘am I opening myself up to the risk of being stalked?’,” she says.

Several friends already working at Google encouraged her to apply for a job there and she was the first hire in its new security group. Within a year, she was the lead engineer.

Much of her role focused on log data – data that record the elements of a given search – “so I had a lot of immediate work on that area of security that overlaps most strongly with privacy”.

She always had an interest in living in Europe and jumped at the chance to be based in the UK, from where she brings a European perspective on privacy and security issues back to Google HQ in California, and serves as Google’s main liaison on privacy to the EU. “I give the German or the Irish perspective in terms that make the most visceral sense to California engineers,” she laughs.

It must be a hot seat position, given the confrontational stance the EU has taken with Google in recent years, arguing for reductions in the time Google holds data and debating whether IP addresses – the numerical addresses assigned to computers during sessions on the internet – constitute personal information.

Whitten has been in the middle of that argument. In 2008, she made a post to Google’s public policy blog entitled “Are IP addresses personal?” in which she argued they were not. She received over 50 comments, most hotly debating her position.

She says she now regrets the post in some ways. “I think I’ve retreated to an engineer’s perspective,” she says.

“IP addresses are quite private and sensitive and we’ll always treat them as such. One of the things that went wrong with the blog post was, in an attempt to define the legal category of ‘private’, I perhaps conveyed that I had a ‘move along, nothing to worry about’ perspective.”

In an engineering sense, as she argued in her IIEA presentation, IP addresses are not treated as personal data in the way software uses them – in, say, security software running a firewall and protecting against network attacks.
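The engineering view she describes can be sketched in code. The following is a minimal, hypothetical firewall-style rate limiter – not Google’s implementation, and all names and thresholds here are illustrative – in which an IP address serves only as an opaque match key for counting and blocking traffic, with no link back to a person’s identity:

```python
# Illustrative sketch: security software treating an IP address as an
# opaque key for blocking abusive traffic, not as personal data.
from collections import defaultdict

class SimpleFirewall:
    def __init__(self, max_requests_per_window=100):
        self.max_requests = max_requests_per_window
        self.request_counts = defaultdict(int)  # keyed by IP string, nothing else
        self.blocked = set()

    def allow(self, ip_address: str) -> bool:
        """Return False once an address exceeds the request threshold."""
        if ip_address in self.blocked:
            return False
        self.request_counts[ip_address] += 1
        if self.request_counts[ip_address] > self.max_requests:
            # The address is blocked, not a user: the code never knows
            # (or needs to know) who is behind it.
            self.blocked.add(ip_address)
            return False
        return True
```

In this framing the address is just a label on a stream of packets – which is the engineer’s sense in which it is “not personal” – even though, as Whitten concedes below, the same addresses remain sensitive and worth protecting.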

While the IP address debate continues, a more recent privacy blunder on Google’s part was its initial release of Buzz, a social network built around Gmail webmail contacts. The default set-up initially meant a user’s e-mail contacts were included and could be viewed by other contacts.

Scalded by angry user reaction, Google apologised and – to its credit – immediately addressed these issues. But how could it get such a major privacy issue so wrong?

“It did take us by surprise – and it was a really unpleasant surprise and we don’t want it to happen again,” she says. But given Google’s front-line visibility on privacy issues, was Buzz not put through a careful privacy review?

“The issue was not the lack of a privacy process,” she says. “Partly it was that we had used an internal version, and had become very comfortable with it and felt it clearly filled a niche between e-mail and chat. It slotted in very naturally with ways you’d want to communicate with your contacts.”

It was this very closeness to the service, the comfort level, “that seems most likely was the contributor” to what happened, she says.

New technological developments provoke constant debate within the company over the balance between cool services, security and privacy. One looming challenge is the ability to search with pictures, using facial recognition. Someone could take a picture of a person with a phone’s camera, then do a facial search to find out who they are.

For now, Google has not enabled this type of search and will “have a conversation about that” as the technology develops. Yet Google recognises many people will want to blend searches with location-based services and other technological possibilities. “It’s the little brother versus Big Brother scenario. This is coming and we have to think about it.”

Is there too much Google focus on the usefulness of services, perhaps, and too little on the possible privacy impact? “It doesn’t feel to me like there is, I think because we have a sophisticated understanding of what it means to be ‘useful’.

“For engineers, useful is almost a holy word – it evokes goodness. I think if we get ‘useful’ right, we have also taken the time to get privacy and trust right. And I hope that trust is a bridge that leads to Google.”