
What a Tesla driver in a VR headset can teach us about empathy

Unthinkable: We can’t be sure technology is making people less empathetic, but the signs are not good

As dystopian images of the future go, it doesn’t get much better than a bloke interacting with a virtual reality headset while driving a bulletproof Tesla Cybertruck hands-free. The vehicle is designed, according to manufacturer Elon Musk, to ensure “if you have an argument with another car, you will win”.

The footage, which went viral earlier this month along with similar clips of drivers on autopilot using Apple Vision Pro, prompted safety warnings from transport authorities in the United States. And although they were isolated videos, seemingly filmed as pranks, they tapped into a general unease with Big Tech’s product line. The inequity of access to new devices, the displacement of human agency, and the erection of barriers to physical contact are all baked in.

The Vision Pro retails for $3,499 (€3,250) and looks like a pair of ski goggles mashed with Bono’s fly glasses. A promotional ad shows a father staring at his children bug-eyed while wearing the contraption in 3D-record mode – a sight that will take years of counselling for those kids to overcome.

Of course there are many benefits to new technology, but one downside, it seems, is its impact on human empathy. The evidence is patchy, but an oft-quoted study at the University of Michigan found that levels of empathy among American students fell by 40 per cent between 1979 and 2010, based on surveys measuring kindliness.

Empathy itself is quite a slippery concept. In Confucianism, it is known as “jen”, or “human-heartedness”, and is described as “the distinguishing characteristic” of human beings. But it only has practical value when it becomes “li”, something akin to compassion in Christianity.

In both eastern and western traditions, empathy is expressed through crying. Tears were a trademark of Ignatius of Loyola, the founder of the Jesuits – a doctor reportedly told him he’d lose his sight if he kept weeping. Unsurprisingly, Pope Francis – a Jesuit – is big on empathy. (Unsurprisingly too, Donald Trump boasts about not having cried in years. “I like to get things done. I’m not a big crier,” he has said.)

But empathy has its downside, as highlighted by US psychologist Paul Bloom. The problem is selective sympathy; we empathise more readily with those who look like us. Take the Israel-Palestine conflict: empathy is bountiful within each group, but often less evident across ethnic lines.

One of the most prominent philosophers on empathy was Edith Stein, the German Jewish thinker who converted to Catholicism, became a nun and was murdered in a Nazi concentration camp in 1942. “Stein did groundbreaking philosophical work on empathy and importantly presented it as a two-way process. She advises that it is possible for another to judge me more accurately than I judge myself,” says Dr Gabriel J Costello, a lecturer at Atlantic Technological University.

Central to Stein’s approach is a rejection of what Costello calls the “ubiquitous materialist worldview”, and this makes her highly relevant to debates surrounding technology and Artificial Intelligence (AI). Proponents of transhumanism – the idea that we will be transformed as a species through technological enhancements – tend to think of the individual as a raw material awaiting customisation.

However, according to Stein, “the individual is not given as a physical body but as a sensitive, living body belonging to an ‘I’ that senses, thinks, feels and wills”. She developed a “nuanced and complex view that humans consist of four phenomenal realms of activity: the physical, the sensate, the mental and the personal. Additionally, these realms, while located in the body, are porous and blend into one another.”

That may sound a little kooky, but Stein “had a keen interest in questions of technology that arose during the great advances in physics at the beginning of the 20th century”, says Costello, an engineer with a research interest in the ethics of design.

“She studied theoretical physics during her first year in university. Shortly before her death in Auschwitz, she asked a philosophical colleague to provide her with the latest literature on atomic theory.”

Like Russian opposition leader Alexei Navalny, she was killed for standing against fascism – she could have fled to safety on several occasions but reasoned, “if I cannot share the lot of my brothers and sisters, my life, in a certain sense, is destroyed”.

As to where Stein would stand on new technology, Costello speculates she would be “excited by possible benefits, such as enhancing medical diagnostics”, but with caveats. “Some researchers have concluded that there is a real danger, especially to children and people with special needs, from manipulation by pseudo-empathy being mimicked by avatars, chatbots and other interactive technologies”.

Would a device like Vision Pro concern Stein?

“Her advice might be to carefully read the safety guidelines that come with IVR [immersive virtual reality] headsets before using them, or giving them to your children... Formation and progress in empathy is an intersubjective practice that requires interaction with other humans from a very early age.”

Dr Gabriel Costello is keynote speaker at the annual public meeting of the Philosophy of Edith Stein Reading Group at St Teresa’s Carmelite Priory, Clarendon Street, Dublin 2 on Sunday, February 25th at 1.30pm. Free admission.