'Don't start the car. Your coffee is on the roof'

Advances in artificial intelligence are bringing ‘proactive personalisation’ closer


You are about to set off for work when your phone buzzes. A notification pops up: “Don’t start the car. Your coffee is on the roof!” Crisis averted. As you reach for your coffee another alert comes in: “Joanna shared another inspirational quote . . . I’ve unfriended her on your behalf. You’re welcome.” You feel a twinge of guilt but it’s for the best.

Welcome to the not-too-distant future, where your digital assistant anticipates many of your behaviours and needs.

This is the direction personalisation is headed in, says Prof Séamus Lawless from the Intelligent Systems group in the Trinity College School of Computer Science and Statistics, and a member of the Adapt research centre, a collaboration on digital technology research between Trinity, Dublin City University, University College Dublin, and the Dublin Institute of Technology.

Speaking at Accelerating Impact, the recent Adapt event at Croke Park showcasing advances in artificial intelligence and related industry collaborations, Lawless used these tongue-in-cheek examples as he talked about the concept of "proactive personalisation".


Proactive personalisation is a richer, more sophisticated version of the personalised experiences we currently get on Google, where search results depend on our unique browsing habits, or on Facebook, where our timeline is dictated by the kind of things we tend to like, share or linger over.

Intimate knowledge

"I think the most easily understandable example is in the film Her, where the operating system acts on your behalf like a separate person but it is someone that has intimate knowledge of you and an intimate knowledge of the bits of you that you might not be willing to reveal to other people," says Lawless.

This is one of the reasons Joaquin Phoenix's character falls in love with Scarlett Johansson's operating system in Her. This Siri on steroids knows the hidden you (you can't hide your browser history from her) and the public you. In theory, no one will ever know you better and, thanks to her AI, she uses this knowledge to anticipate your wants and needs. She gets you like no one gets you.

Before Lawless goes into the technicalities of such personalised AI, I wonder how ethical it is to create a virtual agent that we could become very attached to, given the accelerating sophistication of conversational interfaces.

There is no need to worry about being tricked into falling in love with Alexa just yet, he says, explaining that we’re not at the stage where an AI can create a natural conversation; right now, these conversations are mimicking as best they can but are quite robotic in nature.

As AI develops, who knows? "It's no longer a machine when people anthropomorphise it. It raises questions of whether it will be okay to become friends with your AI or as in Her, will it be okay to date your AI? There is no doubt that there could be a strong emotional bond between a person and a virtual agent."

But proactive personalisation is not merely about an upgrade to Google Assistant, because it translates to offline actions too. It is the happy marriage of automated tasks and clever data insights gleaned from your behaviours.

“I travel to the US a lot for work and I always try to take in a sports event in each city I visit but it’s a surprisingly time-consuming task: you have to figure out what sports are in season, if a team is playing at home, what day and time, and if there are tickets available, how much they cost,” says Lawless.

“All of this is completely predictable and really easily automated. If an agent knew to do that on my behalf, literally the moment I arrived in Austin, Texas, I could get a message popping up to say: ‘There’s a University of Texas game on tonight, tickets cost $80, do you want to go?’ and buy them on my behalf.”
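A rough sketch of what such an agent might look like is below; the fixtures and ticketing services are hypothetical placeholders, not a real product or Adapt's own software, and the structure is only one way the idea could be wired up.

```python
from dataclasses import dataclass
from datetime import date


@dataclass
class Game:
    home_team: str
    kickoff: str
    ticket_price: float
    tickets_available: bool


def games_tonight(city: str, today: date) -> list[Game]:
    """Placeholder: look up tonight's home fixtures and ticket availability."""
    raise NotImplementedError("wire up real fixtures and ticketing services here")


def on_arrival(city: str, today: date, notify, buy_tickets) -> None:
    """The moment the user arrives in a city, proactively offer one event."""
    for game in games_tonight(city, today):
        if game.tickets_available:
            wants_to_go = notify(
                f"There's a {game.home_team} game on tonight at {game.kickoff}, "
                f"tickets cost ${game.ticket_price:.0f}. Do you want to go?"
            )
            if wants_to_go:
                buy_tickets(game)
            return  # at most one suggestion per arrival
```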

The future of personalisation is about the automation of those mundane everyday tasks that expend our time, energy and attention.

“We’re not trying to automate your life entirely. We’re trying to automate the commonplace, repeatable or predictable aspects of your life,” says Lawless.

But isn't personalisation placing us in our own little online echo chambers, as outlined by Eli Pariser in his 2011 book The Filter Bubble: What the Internet Is Hiding from You? And will this not be exacerbated as algorithms improve?

“The days when the filter bubble disappears when we step away from our computers . . . are numbered,” says Pariser.

This filter bubble means we see more of the same, whether we like it or not. Just because we spend the evening binge-watching Cats versus Christmas Trees (brilliant, by the way) on YouTube doesn't mean we want recommendations for more of the same.

We fallible humans vow to do better and watch an educational documentary tomorrow, but if the internet keeps spoon-feeding us comforting pap it’s hard to say no. How can we reach towards the better angels of our nature with algorithms doing a Mrs Doyle on us?

"Given your past behaviour the algorithm thinks you just want to look at videos of cats on YouTube but in reality you actually want to read about the political situation in Burma. The aspiration is to consume meaningful content but that clickbait article about Meghan Markle and Prince Harry looks enticing so you click on it and you're reinforcing the pattern.

“That’s your aspirational self versus your hated, actual self that you keep sliding into,” says Lawless. These aren’t tangible data points that Google or Facebook can take into account.
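The reinforcement Lawless describes is easy to picture in code. The toy sketch below is not any real recommender: it simply shows how pure click feedback keeps promoting whatever you already click on, while the aspirational self never enters the ranking at all.

```python
from collections import defaultdict

# Every click nudges that topic's score upward, so the topics you already
# click on keep rising to the top of the recommendations.
topic_scores = defaultdict(float)


def record_click(topic: str, weight: float = 1.0) -> None:
    topic_scores[topic] += weight


def rank_topics() -> list:
    return sorted(topic_scores, key=topic_scores.get, reverse=True)


record_click("cat videos")
record_click("cat videos")
record_click("royal clickbait")
print(rank_topics())  # ['cat videos', 'royal clickbait'] - the wish to read about Burma never registers
```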

Lest you think algorithms mean the death of chance discovery and no hope of being nudged towards better behaviour, there is such a concept as technological serendipity.

“That’s right at the frontier of personalisation research and it’s what we’re trying to achieve. That balance of personalisation and diversification while building in aspiration and serendipity.”

Typically, with the algorithms we are used to, you lose that serendipity. You lose the happy accident where you discover something on the web that’s completely outside your area of interest. Those who still buy physical newspapers know this: once the paper is bought, most of us tend to flick randomly through the pages and chance upon an article or two that might challenge our worldview or introduce us to something very different.

Even the ghost of Facebook past was mildly serendipitous, when the timeline was chronological and everything was displayed in sequence, as was the Twitter of two years ago, which served up tweets as they happened.

‘Human in the loop’

Now black-box algorithms silently deliver personalised content: “They are making those decisions in a manner that they believe is on your behalf but in reality you have no insight into what those decision-making processes are and why those decisions are being made,” says Lawless as he describes the concept of “human in the loop”.

“I think it is really crucial that AI isn’t just making decisions on our behalf without providing any human insight or without humans being able to influence that process in any way.”

The problem with mega-algorithms such as Google search is that, as they have grown, no one person seems to fully understand how they work. According to Pariser, the search algorithm comprises hundreds of thousands of lines of code and, by his account, one Google employee he spoke to explained that “the [search] team tweaks and tunes, they don’t really know what works or why it works, they just look at the result”.

So why is the user experience on the web heading in this direction of personalisation when it reinforces our filter bubble in such an opaque manner? In a way, it is a necessary evil, says Lawless: “There is such a volume of content that we cannot consume it all. We need some way of filtering but I think these filters must balance those four axes I mentioned: personalisation, diversification, aspiration and serendipity.”
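As an illustration only, and not Adapt's actual method, a re-ranker balancing those four axes might blend a conventional relevance score with bonuses for unfamiliar topics, stated aspirations and a small random serendipity term. The weights below are arbitrary; the research challenge Lawless describes is in getting that balance right, not in the mechanics of the scoring.

```python
import random
from typing import NamedTuple


class Item(NamedTuple):
    title: str
    topic: str
    relevance: float  # score from the usual personalisation model, 0..1


def rerank(items: list, seen_topics: set, aspirations: set,
           w_diverse: float = 0.3, w_aspire: float = 0.4,
           w_serendipity: float = 0.2) -> list:
    """Blend relevance with diversification, aspiration and serendipity."""
    def score(item: Item) -> float:
        s = item.relevance
        if item.topic not in seen_topics:        # diversification: reward the unfamiliar
            s += w_diverse
        if item.topic in aspirations:            # aspiration: what the user says they want
            s += w_aspire
        s += w_serendipity * random.random()     # serendipity: a small dose of chance
        return s

    return sorted(items, key=score, reverse=True)
```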

It’s a difficult and by no means solved challenge, he says: “That’s what we’re working on now.”