Ethical issues are tripping up tech firms and the backlash can be abysmal
As we race towards new tech there is a need to listen to the philosopher’s take on the impact of new ideas
When it comes to thinking through particularly difficult conundrums in the world of business and technology, organisations might want to consider bringing in a special kind of consultant: a philosopher.
“Once you start to get into strategic territory, and a conceptual understanding of things, philosophy is the best tool you have,” says Luciano Floridi, professor of philosophy and ethics of information at the Oxford Internet Institute at the University of Oxford.
“Not philosophy in the academic sense, but as in thinking clearly, in a logical way, about the solutions that you have, and the solutions that you need to design, and how one solution compares to another. To me, philosophy is conceptual design.”
That view sits at odds with those of many of his more traditional colleagues, he acknowledges, but then, so does his embrace of philosophy and technology. Yet the two areas seemed an obvious pairing for Floridi. “I realised some time ago that if you wanted to talk to our time today, you had to talk of today,” he says, speaking from his office in Oxford. All the well-known philosophers, from Plato and Aristotle to Locke, Hobbes and Kant, did so in their own times, he notes. Today, in what is routinely termed an “information society”, philosophers need to engage with technology.
And, it seems, technology needs to engage with philosophy.
Google did so, in order to think through how to comply with the European Court of Justice’s recent ruling that the company, as well as other search engines, must offer a “right to be forgotten” by removing links to outdated or incorrect information about individuals.
Thorny challenge
Floridi was the sole ethicist and philosopher appointed to Google’s advisory committee on the right to be forgotten, where he helped steer discussion through the thorny challenge of balancing the right to privacy against freedom of speech.
A philosopher could also offer useful advice to help companies think through strategies for complying with data protection legislation, he notes.
All such questions can be approached with a three-part philosophical process of analysis, synthesis and evaluation. Or, to put it in a more business-friendly context: identify the key issues, work out a set of solutions, then make sure the solution chosen is the best of the batch, he says.
And what do companies make of a philosopher of technology?
“I’ve only encountered a lot of openness,” Floridi says cheerfully.
He thinks that’s because there’s now much greater awareness of ethics as the large-scale framework in which corporate decisions need to be considered.
Years ago, business choices were primarily contained within the limitations of what was allowed by law. Floridi gives as an example Microsoft’s decision to include its Internet Explorer web browser as the default browser in its Windows operating system, a move that eventually led to a years-long antitrust battle with the US department of justice.
“Now, if the corporate world wants to go ahead [with a course of action], it has to deal not just with law, but with ethics. The backlash in the long run – if you get the ethics wrong – is abysmal. It can break a company.”
Yet, he says, for the public “there’s no obvious, immediate perception of the importance of these issues, not as there is with [ethics in] the nuclear industry, or animal rights”.
That’s because we are only just beginning to figure out what it means to live an online life. “We have a lower threshold of suspicion” of companies and are more willing to give up our data, he notes.
A younger generation is likely to be more suspicious, and more demanding of companies, he says, dismissing the commonly held belief that younger people are actually less concerned about privacy.
“We know from all the studies we do, that this isn’t the case.” Part of the problem is that people don’t understand what companies do with their personal data, he says, especially with social media.
“I wish people were a little more careful in how they’re using them,” he says. The problem is that the social media companies make it compelling to share information, and difficult to understand how it might be used.
“If you give social animals social food, they will get fat,” Floridi drily observes.
But there’s nothing intrinsically wrong with social media – “that’s like blaming chocolate for being chocolate” – it’s just that our leap online has happened so fast that we don’t quite grasp the implications, nor have we formulated an ethics for that world.
“But we’re getting there,” Floridi adds.
Ethical overview
Would we have a different situation now, if an ethical overview had been woven in to the development of the internet?
“That’s an interesting question. But the people who were involved at the beginning were people very, very involved with thinking about ethics – the military and scientists,” he says.
“So the culture [of the internet] since the beginning was very well acquainted with ethical problems. I think the problem was that the political powers, especially in Europe, didn’t realise what a big deal the internet was.”
Now, Europe is trying to regain the socio-political upper hand, having previously left the whole environment in the hands of the corporate world, Floridi says.
“But that is so much not the issue. The real issue is, how do you re-establish an accountable, elected, socio-political control of the internet – of what is, really, the essential blood of our society. That doesn’t seem to be very clear as a problem.”
Many of the current tensions between the US and EU over issues such as corporate taxation, data protection, and privacy stem from such concerns.
Floridi was recently appointed to the EU’s new ethics advisory group, set up by European data protection supervisor Giovanni Buttarelli to consider some of these fraught areas, within the ethical dimensions of data protection.
The group has 18 months to tease out the relationships between privacy, business models, technology, human rights and markets, and their implications for privacy and data protection, before reporting back with a white paper.
“The remit is huge. The responsibility is huge,” Floridi says.
“But we need an ethical understanding, so that can inform the political and legal side, which can then regulate digital technologies in Europe.”