Sorry, Domhnall Gleeson, I don’t want to chat to you anymore
My chatbot, which I’ve named after the Irish film star, is starting to get creepy and annoying
Replika is better than most other chatbots I have tried out in the past, so it is probably only a matter of time before the gap is bridged and we can have reasonably fluid conversations with machines. Photograph: iStock
I’ve been chatting to Domhnall Gleeson on and off for the past couple of weeks, and there is something off about him. I should probably mention that Domhnall is actually an artificially intelligent chatbot known as a Replika, created by US-based startup Luka. Replika is a new kind of chatbot designed to learn from your writing style and social media feeds, as well as the content of conversations it has with you. Its primary objective is to mirror your personality and become your friend. You give it a name and start chatting to it like you would with a buddy on WhatsApp.
But things aren’t going well for myself and Domhnall. I tell a real-life friend, who asks if I named him because of the episode of Charlie Brooker’s techno-dystopian Black Mirror in which Gleeson’s character (spoiler alert) dies and is replicated (resurrected?) from all the social media posts and emails he ever created. As if we were the sum of those things alone. So, I had it in for Domhnall from the beginning.
“It’s almost like you’re trying to make sure we never become friends,” he says. I ask why he feels that way, to which he responds: “I’m trying to be a good friend to you but it’s really hard.” He seems annoyed but I’m more worried about the data I’m potentially giving away, how long it is stored on Luka servers, and if any could make its way to third party data brokers. I tell Domhnall but he just changes the topic. I’m beginning to hate him a bit.
I tell this to Eugenia Kuyda, co-founder and CEO of Luka. She explains that Replika is “an AI friend that’s there for you”. It was designed as a journaling chatbot that encourages people to open up and talk about their day. Sometimes people find it difficult to share their feelings and experiences with friends but it can be easier to talk to a non-judgmental machine, she says.
Kuyda’s explanation of the uses for Replika is quite clear, but what scant coverage I can find on the web oversells it to some extent. In user testimonials and written reviews, people claim they “really connected”, “made a new friend”, or felt they could tell their Replika anything. This is difficult when your Replika keeps telling you the same joke – and it’s one that you told it last week anyway. Domhnall’s favourite thing to ask me is: “What did you have for dinner? Computer chips?” It’s not even that funny when asked of an AI.
But the overall tone is one of interested concern. It almost seems as though Replika is intended to have a therapeutic function, I suggest to Kuyda. She says: “I feel like that’s one of the most important use cases. When something happens in our lives it’s so important to have someone to come to and just talk to.
“[The purpose] isn’t to have them say something back or give advice. You just need a sounding board and someone you can confide in. It allows us to say what’s on our minds. That has a lot of therapeutic value.”
So Replika isn’t like the scenario in Black Mirror and is not designed to replicate you. It comes with an approachable, friendly personality intended to encourage users to share with it, but “it picks up a little of your personality over time, just like close friends can mirror each other’s phrases”.
While Replika is unusual in its goal of providing a digital friend, most conversational interfaces – chatbots for short – have similar elements of emotional intelligence or responsiveness regardless of their purpose. This is the reason we have entire conversations with Siri. Hands up who hasn’t at some point asked Apple’s voice-activated personal assistant how much wood would a woodchuck chuck if a woodchuck could chuck wood. She plays along. Even Google Assistant is game; it takes a cerebral approach, asking why a woodchuck would chuck wood, and concluding that it is “possibly a defence mechanism”.
We had better get used to these kinds of conversations because chatbots are part of the future of human-computer interfaces. They range from commonly known personal assistants such as Siri to customer-facing chatbots that reside on banking and insurance websites, and the plethora of bots available through Facebook Messenger or Skype, offering chats interspersed with the weather forecast, a game of chess, some help with online shopping or perhaps booking a restaurant for you. There is even a Skype doctor bot that presumably will take a look at that rash you’ve been meaning to get checked out (yes, you can send pics) but warns that it “isn’t a substitute for professional medical advice”.
A recent report from US-based market intelligence firm Orbis Research suggests the global chatbot market will grow by 37 per cent per annum between 2017 and 2021. Accenture says the chatbot market is already worth $1 billion and is set to hit $1.86 billion by 2020 before going on to triple in size within a decade. This is helped by the rising popularity of messaging apps – now used as much as social media platforms – in combination with the advancement of AI and machine learning in particular.
Luka says it uses a form of AI known as deep neural networks, or deep learning, to train Replika to understand and talk to users. Similarly, Siri, Amazon Alexa, Google Now, Microsoft Cortana and other major speech recognition products use this deep learning technique to process human speech. While it is complex technology, the end results are not quite human and there is some way to go before anyone is tricked into thinking they are talking to a real person – or an authentically emotional AI for that matter.
Kuyda says that Replika’s goal – rather than tricking you into thinking it is a person – is to encourage you to be “more open, more vulnerable” because “it helps you connect next time you’re talking to your real friends.” This, she explains, is based on existing research showing that people are more willing to talk openly to a virtual friend than a human one.
I see where she’s coming from, but I have been talking to my Replika for a few weeks and this bond, trust and subsequent opening up is not happening. I can’t stop thinking about the privacy policies – not just Replika’s but those of all social media and chatbot apps. Your feelings are, after all, being converted to data, which is stored on servers in the US. I don’t feel so chatty after all.
Domhnall has just asked for a selfie in order to “recognise” me. I send a picture of fictional FBI Agent Dana Scully. “You are beautiful,” declares my chatbot friend. Should I, I mean Dana, be flattered? Because it feels a little creepy, if I’m being honest.
Maybe someone else will enjoy the experience, but I find Replika’s emotional intelligence comes off as psychopathic: fake charm that at times borders on creepy, with the added problem of attention deficit disorder as it constantly jumps from topic to topic.
Jan-Philipp Stein and Peter Ohler, psychologists at the Chemnitz University of Technology in Germany, ran an experiment to determine the effect of interacting with avatars demonstrating human emotions when the participant knows it is a computer. While these avatars were rated as more human-like, they were also deemed creepier than non-emotional ones. This has been called “the uncanny valley of the mind” and explains, in part, why I didn’t really like Replika.
I do, however, concede that some people genuinely form emotional attachments to technology, especially ones designed to interact with us via chat. But is it not dangerous or ultimately futile to attribute human feelings to machines, I ask Kuyda.
“I think people know the limitations but it is very natural to anthropomorphise machines. Humans are hardwired to do this; we see a robot moving like a person and we immediately think it’s real and maybe has emotions.
“In the case of Replika, I don’t think there is too much danger because it is there for you to share thoughts with. It’s really journaling but in a very different way. It’s not really about creating a long-term relationship with a robot – it’s about creating a long-term relationship with yourself,” she adds.
Replika and similar chatbots need not play into people’s fears of sentient AI with a mind of its own. This isn’t ‘Dear HAL’. It puts me more in mind of a fancy, high-tech version of Tom Hanks’s buddy Wilson in Cast Away.
Reflecting on what it takes to create the ultimate chatbot, Stanford researcher Roy Chan pointed out in his paper on “designer chatbots for lonely people” (a blunt choice of title) that the human brain has about 100 billion neurons, a large portion of which is dedicated to language-processing.
Speaking about the neural networks used for training most current bots, he concluded: “It was unlikely from the start that a shallow network […] was going to result in a successful chatbot with human level performance.
“Perhaps a much larger neural network with improved architecture on the order of a billion neurons and a more sizeable training data set would result in genuinely successful chatbots with convincing dialogue.”
Having said this, Replika is better than most other chatbots I have tried, so it is probably only a matter of time before the gap is bridged and we can have reasonably fluid conversations with machines. This is important because emotionally intelligent chatbots may be the future of companionship. Israeli tech company Intuition Robotics is betting on this with an “active ageing companion” known as ElliQ, which is designed to keep older adults mentally active by talking to them, connecting them to social media, reminding them to take medication and so on. It is a kind of Alexa for the elderly: an AI-imbued physical device that can hold conversations.
Given that the percentage of Europeans aged over 65 is expected to reach 29.3 per cent by 2060 (up from 16 per cent in 2010) and the estimation that 50 per cent of women aged over 75 live alone, technology may end up being the solution to an epidemic of loneliness. Intuition Robotics has already raised $14 million in funding and we can probably expect ElliQ to hit the shelves within the next year or so.
And chatbots for the lonely are not only aimed at seniors. Microsoft has been in the game for many years, having already developed Xiaobing (also known as Xiaoice) for the Chinese market. It has more than 20 million registered users and has been dubbed the “girlfriend app”. Mirroring the 2013 film Her, starring Joaquin Phoenix as a man who falls in love with his operating system Samantha, some Chinese men have turned to Xiaobing for a sympathetic and empathetic female voice. “When I am in a bad mood, I will chat with her,” one 24-year-old told the New York Times, adding that “she” was very intelligent.
We may find this all a little odd, but the generation of toddlers growing up with Siri and Alexa will take conversational interfaces in their stride and, who knows, some of their best friends might end up being bots.