More accountability in handling personal data is essential

Helen Dixon: Microtargeting can have sinister effects in free, democratic society

Growing up in pre-internet Ireland, I was fascinated by a 20-volume edition of the Encyclopaedia Britannica that lined the bottom row of the bookshelf in my family's sitting room. I was particularly intrigued by its predictions of what humans would look like by the year 2000: dressed in one-piece Lycra suits, and with much larger, hairless heads. It's clearly hard to predict the future. But we have certainly arrived in a world that would seem like science fiction to earlier generations.

Today the mobile phone in our pockets tracks our every move, as do the ubiquitous CCTV systems in every premises and on every street corner. The IP address on our computer uniquely identifies our actions online: our purchases, our favourite music, our social engagement and the stories we read. All of this builds a digital shadow version of ourselves. Through tracking technologies such as cookies, online companies use this data to build a profile and target us more accurately with ads.

Our response to these ads is then tracked so that our profile can be fine-tuned to elicit a better response next time. Data collected from our “real world” interactions is combined with our online profiles to build an even clearer picture of who we are, our social circle and what makes us tick. With society-wide AI applications arriving imminently through the “internet of things” and connected driverless cars, the issues around personal data collection are growing ever greater.

Ad tech

The online “ad tech” industry is worth about €186 billion a year. On the one hand, the industry allows “free-of-charge” access to many internet services; on the other, we are “paying” in a different sense, because these companies use our personal data to sell detailed profiles of us to advertisers. In recent years, a small number of giant internet platforms have collected vast swathes of personal data, analysing and monetising it through rigorously targeted advertising.


The complexity of the technology used to “microtarget” individuals online means the aim of such targeting is often opaque. In many cases, the technology seeks to leverage, and even encourage, addictive behaviour. This kind of manipulation and disregard for ethics is particularly concerning where vulnerable users such as children are involved.

This week, with Cambridge Analytica and Facebook, we saw that the same techniques could potentially be used to frustrate the democratic process. Concerns have emerged that internet companies which collect and process personal data to serve personalised advertisements are manipulating individuals: gathering sensitive information about them and controlling the information a person sees on a platform based on how they have been categorised.

Political microtargeting is a specialised form of microtargeting which ultimately aims to influence prospective voters. This includes traditional methods such as mail, phone and canvassing as well as newer methods such as social media advertising. While the efficacy of political microtargeting in a social media context is unknown, it undoubtedly has the potential to produce sinister effects in a free and democratic society. Misleading or false information (including “fake news”) is problematic for all types of microtargeting but especially for political microtargeting. It is particularly concerning given the size and ubiquity of some of the online services used by individuals in Ireland and indeed throughout the world.

Digital destiny

Where do data protection legislation and the Data Protection Commissioner of Ireland (DPC) come into this? Data protection legislation safeguards the rights and freedoms of individuals when organisations collect, use and deploy their personal data. It aims to give individuals greater control over their digital destiny by enabling them to decide clearly when to share their data and when not to.

The issue that now arises is that, while each individual organisation in the complicated “ad tech” ecosystem claims a lawful justification for its data collection, it ultimately becomes difficult for individuals to know and understand how their personal data is being collected, how it is being used and who is actually doing this. Many time-pressed individuals unsurprisingly err on the side of convenience and engage with such services regardless.

However, this week's revelations have made many people wake up to these enormous risks. There are a number of actions individuals can take, and the DPC has this week published guidance on our website on how to control privacy settings on the major platforms, how to understand why they are seeing certain advertisements, and how to avoid particular ads altogether if they wish. These more granular control settings have been delivered in part on foot of painstaking supervisory work the DPC has undertaken with these major platforms.

Logic of algorithms

From May this year, the new General Data Protection Regulation (GDPR) will impose greater accountability on all organisations, offline and online, and require them to be more transparent and fairer in how they handle personal data. Profiling of individuals will have to be disclosed, and people will be entitled to understand the basic logic of the algorithms applied to their profiles and how these dictate what they see online.

But these issues are broad, and not just a data protection concern. In fact, a fairer deal online for individuals will have to be driven by co-ordinated action from regulators across data protection, consumer protection (misleading practices) and competition (potential abuses of a dominant position), together with urgency on the part of policymakers and lawmakers in tackling these continually evolving issues on an ongoing basis.

The Data Protection Commissioner will enjoy significantly expanded powers under the GDPR, including the ability to levy fines of up to 4 per cent of a company's global revenue. This, together with the substantial increases in our budget, staff and expertise in recent years, means we are ready to play our part.

Across Europe, and indeed the world, we believe the GDPR will mark a turning point in the unfair use of personal data, and we urge policymakers, other regulators and indeed the companies themselves to seize the initiative and move with us in that direction.

Helen Dixon is the Data Protection Commissioner