Why technological privacy is an economic matter

Public now more aware of how online data can be misused, says Alessandro Acquisti

Alessandro Acquisti: his Ted talk on privacy has had more than a million views. Photograph: Joshua Franzos

“Dublin literally changed my life,” says Alessandro Acquisti, a former Trinity College Dublin student turned high-profile Carnegie Mellon University professor in the now very pertinent area of the economics of privacy.

These days, he has plenty of titles: professor of information technology and public policy at Carnegie Mellon University (CMU); an Andrew Carnegie fellow; director of the Peex (Privacy Economics Experiments) lab at CMU; and co-director of CMU’s Center for Behavioral and Decision Research. His research into the alarmingly sensitive details that can be revealed when disparate data about us on the internet is connected up and combined with other public databases has seen him featured on prime-time news across the US. His Ted talk on privacy has been viewed more than a million times.

But it was his time at Trinity in the 1990s, first as an Erasmus scholar and then as a master’s degree student in economics, that changed his focus away from music towards a career in academics. (He had worked as a classical music producer and label manager, and as a freelance arranger, music writer and soundtrack composer for theatre, TV and independent film.)

His Trinity years “turned into a big change in focus”, he says, towards thinking about the links between three quite different areas: economics, technology and privacy.

Acquisti, who was in Dublin last week for the 27th Workshop on Information Systems and Economics (Wise) international conference, says it was a PhD at the University of California, Berkeley that cemented this interdisciplinary interest, which remains unusual, and even somewhat controversial, in the field of economics.

“My original work was on the economics of artificial intelligence,” he says. “Then my research took a completely different direction” after a UC Berkeley class that focused on privacy issues, which fascinated him. The public internet was a new thing, the dotcom era was blasting off, and, “like everyone at Berkeley or Stanford”, he found himself involved in a startup, one that was trying to provide anonymous payments to online consumers.

“We had good technology, but we couldn’t solve the problem of who would actually pay for the service,” he says in a strong Italian accent.

Privacy paradox

He’d inadvertently stumbled across the famed “privacy paradox” that would shape his career.

“Consumers would claim they wanted privacy, but were quite willing to share data with strangers and were reluctant to pay to protect it,” he says.

As his papers moved from pure economics to behavioural economics, his work began to gain attention, including from two of the biggest figures in technology and security, American cryptographer and security expert Bruce Schneier and Cambridge University security engineer Ross Anderson. The latter’s famous 2001 paper, Why Information Security Is Hard (see https://cyber.harvard.edu/cybersecurity/Why_Information_Security_is_Hard), was one of the first to articulate an economics of infosec, arguing that the problems of securing information were perhaps best thought of in economic, not technical, terms.

Schneier, Anderson and Acquisti set up the (still ongoing) Interdisciplinary Workshop on Security and Human Behaviour in 2008 at the Massachusetts Institute of Technology, bringing together specialists who had until then rarely stood in a room together: computer scientists, philosophers, mathematicians, economists, sociologists, law experts, security engineers, and others.

But Wise, with its direct connection to his core interests of economics and information systems, seems especially close to Acquisti’s heart.

“Initially, Wise was alone, but over the years you started seeing papers on economics and privacy at other major venues,” he says.

Computer scientists were quicker to embrace the combination of economics, technology and privacy “because being interdisciplinary was not uncommon in computer science”. Convincing economists, and university economics departments, has been a slower process. Interdisciplinary work “is seen as a risk in economics”, with many university departments and mainstream economics journals eyeing such work with suspicion.

It helps that the general public, as well as policymakers and the courts, are beginning to see that privacy has economic value.

Public awareness

Acquisti says two developments have created greater public awareness of privacy as a value. First, the growth of social networks, especially Facebook, which requires people to use their first and last name.

“That started a trend, that data could be tracked, and elevated privacy concerns,” he says.

Second, people better understand that personal data has economic value to companies and governments.

“The economic trade-off became more relevant,” he says.

Acquisti ended up in the public eye himself when his studies attracted major attention in the US media. His research sought to understand what might be done with the data we leave exposed on social networks, typically because we leave our accounts on the most open default privacy settings.

In one experiment, Acquisti and his students created fictional profiles on LinkedIn and Facebook, in which they varied the gender, sexual orientation and religion of fake individuals, who then submitted job applications to various companies. They did not disclose the gender, religion or sexual orientation of the fictional individuals in the applications.

“And then we waited,” he says.

The study found that companies were clearly looking at Facebook and LinkedIn for information on job candidates, as it influenced employer responses. In particular, the more conservative a US state, the less likely a non-Christian candidate was to get called for an interview, he says.

Another study, discussed in Acquisti’s Ted talk, showed that “we could also predict the social security numbers of people” by using data found about them online and correlating it to a public database of “dead” social security numbers, numbers no longer in use because people had died. US social security numbers are unique identifiers, and as such are highly sensitive information used across personal records.

The study showed that, in the time it took a volunteer to complete a survey, researchers could upload captured images of the individual on video, then use a facial recognition algorithm to search online for 10 potentially matching photos. A third of the time, they got a direct match.

They then showed that they could take a random facial image online, run it against images in social media profiles, and use facial recognition to produce an exact identity match of name, date of birth and location information. The “dead” database then helped them to correctly predict a social security number.

“To the question, can you predict someone’s social security number just from a face, the answer is yes,” Acquisti says.
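The chain Acquisti describes, matching a face against public profiles and then using the matched profile’s birth data against the “dead” records, can be sketched in miniature. The sketch below is purely illustrative: the profiles, embeddings, similarity threshold and the tiny area-number table are invented stand-ins, not the study’s actual data or code. The one real regularity it leans on is that pre-2011 US social security numbers began with an “area number” tied to the state of issuance, which is the pattern the public “dead” records expose.

```python
import math

# Illustrative sketch only: all names, embeddings and tables below are
# invented stand-ins, not the study's actual data or code.

def cosine(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical public social-media profiles: a name, a birth state, and
# a face embedding extracted from the profile photo.
profiles = {
    "alice": {"state": "PA", "embedding": [0.9, 0.1, 0.2]},
    "bob":   {"state": "CA", "embedding": [0.1, 0.8, 0.3]},
}

# Toy stand-in for the public "dead" SSN records: they reveal which SSN
# area numbers (the first three digits, pre-2011) were issued in which
# state, the statistical regularity the study exploited.
dead_ssn_areas = {"PA": "159", "CA": "545"}

def identify(face_embedding, threshold=0.9):
    """Return the best-matching profile name, or None if no profile
    clears the similarity threshold."""
    best, best_score = None, threshold
    for name, profile in profiles.items():
        score = cosine(face_embedding, profile["embedding"])
        if score > best_score:
            best, best_score = name, score
    return best

def predict_ssn_area(profile_name):
    """Predict the SSN area number from the matched profile's state."""
    return dead_ssn_areas[profiles[profile_name]["state"]]

captured = [0.88, 0.12, 0.19]   # e.g. an embedding from a webcam frame
who = identify(captured)        # matches "alice"
area = predict_ssn_area(who)    # "159", the first step toward a number
```

A real pipeline would of course use an actual face-recognition model for the embeddings and the full death records; the point of the sketch is only the chain itself: face to profile, profile to birth data, birth data to number prediction.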


The studies “show the increasing sophistication of statistical techniques to find connections in data”, with the result that once seemingly innocuous information must be considered potentially “sensitive data”.

Acquisti says he doesn’t have any direct research to show it, but he feels this work, and his Ted talk, “have helped in making policymakers and the public at large aware of privacy issues”.

He hopes so, because while “as a researcher my first driver is curiosity and my first audience is myself”, he chose his area of work because he also was interested in public policy and hoped it might have some direct impact.

In particular, he hopes he has demonstrated the inadequacy of “notice and consent” privacy policies, which tell people what data is being gathered by an app or website (often in pages of legalese) and then presume consent.

He is adamant, also, that privacy is not, as some have argued, a modern concept.

“In the course of writing a review in Science, I found evidence of privacy-seeking behaviour across cultures and time, including the ancient Greeks and Romans, in Java and Mali, among the Tuareg people, and in the Koran, the Torah and the Bible,” he says.

The Bible “starts with a story of privacy”: Adam and Eve are at first naked in the Garden of Eden but, after eating the apple, they cover themselves with fig leaves.

“So tell me, how is privacy a modern anomaly?” he asks. “Privacy is simultaneously culturally specific and culturally universal – and that’s wonderful.”
