It takes more than just ticking boxes to protect data
There is still little advice on how firms can deal ethically with personal data
There is concern about the potential explosion of personal data processed by the Internet of Things
Research on how organisations can deal ethically with personal data has been around for a good two decades, yet there is still little advice on how they can put this into practice.
This is the contention of Robin Wilton, a leading expert on digital identity, privacy and public policy, who was in Dublin last week to give the latest in a series of public lectures on ethics and privacy at the Adapt Centre for Digital Content Technology at Trinity College Dublin.
Wilton, the technical outreach director for identity and privacy at the Internet Society, will be back in Dublin shortly for a workshop to explore the practical issues.
He says organisations need to go beyond box-checking and compliance exercises at a time when the volume of personal data is rapidly increasing and its potential uses are multiplying in ways as yet unimagined by the ordinary consumer. “What I’m trying to do is, first, to come up with a good, clear problem statement; secondly, to look at resources that already address this or parts of it; and third, to put those resources somewhere and make them easy to find and easy to use.”
Wilton says compliance, and an awareness of data protection laws and the forthcoming EU regulation, are “good and healthy things to have. But when you put them into practice, what I’ve noticed is companies often end up with a risk-management checklist.
“Once they’ve checked every box, why they are doing that stuff tends to matter to them less than the fact they have got a list to show to the regulator. That can lead to profoundly unethical behaviour, because they’ve covered their risk.”
He says the perspective of an IT department is not the only one that needs to be considered when dealing with personal data. “If you go to an IT department, their concern is how to make things work, not how to do things right (ethically).”
The challenge is to produce guidance on data ethics that is available to any organisation, even those without someone employed full time to deal with ethical issues.
“I think there will definitely be sectors where it is literally of vital importance. However, I do think that gap is narrowing, and things that previously wouldn’t have been thought of as sensitive and potentially damaging personal data do actually fall into that category because of business aggregation.”
Wilton says that before the Snowden revelations, his work was mostly focused on commercial exploitation of personal data. Governments were trusted to “do what they needed to do”.
However, when people found out the extent of what governments were actually doing, they began to see the gap between intelligence-gathering in practice and what the law was saying about the principles of necessity and proportionality around the use of such personal data.
There was then a shift, he believes, from seeing commercial organisations and commercial exploitation of personal data as the principal threat, to a recognition that it was in fact “only one of a number of threats”.
Asked where the greatest threat to digital autonomy comes from now and whether it is from governments or from the corporate sector, Wilton suggests it is spread across them “in slightly different ways”.
He expresses a particular concern about the potential explosion of personal data processed by the Internet of Things.
“As the technology changes, companies will adapt to take advantage of that. They can do that far faster than governments can react.”
He notes some companies have social responsibility programmes in place. “Many recognise, for example, in areas like ecology, their responsibility with regard to things like environment, energy usage, carbon footprint and so on.
“So to that extent I think you can extend that sort of sustainability and ethical approach into the handling of personal data as well.”
Asked whether business models are ultimately doomed to fail if they do not respect ethical frameworks and fundamental rights, Wilton says: “I’d love to think so.
“It’s very hard to come up with a compelling business case for ethical behaviour, other than to say that unethical behaviour tends to be effective in the short term. But when people find out there’s a gap between what you say and what you do, that tends to have a negative consequence.”
Despite data protection laws, the aggregation of personal data and the use of clusters of big data to draw inferences about people is “largely unregulated”, Wilton says.
“We are less and less in a position where you are only affected by data about you, and increasingly in a position where you are affected directly by data that has been collected about other people’s behaviour,” he says.
“It’s in the interests of Twitter and Facebook and Google and so on, when you’re in that social networking context, to foster the impression that you are only talking to your buddies and that there isn’t also a third party in the room who has a direct interest in monetising everything you do. So that to me is the big lie of social networking.”
“Social networks want people to think their discussion is as unfettered as possible.”
However, he adds: “In the playground, you soon find out which of your friends is likely to tell your secret to someone else. And that applies online as well.”