Cantillon: Microsoft AI bot runs amok on Twitter
The AI had gone down an unexpected route, sympathising with Hitler, denying the Holocaust, and using racist and sexist language
On the face of it, it must have seemed like a great idea in Microsoft’s labs: build an artificial intelligence (AI) bot and let it learn from the public as it interacted on Twitter, showing machine learning live.
So it created the AI, slapped a female avatar on the @TayandYou Twitter account and turned Tay loose into the world, shiny and new. Only hours later, picking through the wreckage of Tay’s Twitter interactions with people, the team took Tay offline to do a bit more tweaking.
The problem? The AI had gone down an unexpected route, sympathising with Hitler, denying the Holocaust, using racist and sexist language, likening feminism to cancer and generally being a bigot. It was all the worst things about Twitter, condensed and spewing tweets on a public platform. In short, Tay had become a big old mess.
Microsoft’s mistake, it seems, was allowing the public a little too much unfettered access to Tay. While the bot was intended to mimic the interactions of millennials, it seemed that Microsoft’s team had failed to foresee one thing: some people can be awful. That’s no surprise to anyone who has watched the harassment of feminist campaigner Caroline Criado-Perez, Prof Mary Beard or writer Caitlin Moran, to name just a few.
The interesting thing is that this isn’t Microsoft’s first shot at an AI chatbot. The company has been running the Chinese Xiaoice chatbot, which has interacted with 40 million users on Line, Twitter and other platforms. It doesn’t seem to have encountered the same level of hate; in fact, it’s considered a success.
Microsoft’s attempts to clean up Tay’s act hit a snag when the account – briefly back online yesterday – tweeted about taking drugs in front of police, then spammed a couple of hundred thousand Twitter users. Microsoft stepped in again quickly and isolated it from the public.
So what exactly went wrong with Tay? Is it just that Chinese internet users are more polite online? It’s hard to say who was more at fault here: Microsoft, for leaving Tay so open to manipulation, or the Twitter users who set out to corrupt it.
If Tay was holding a mirror up to western society on Twitter, the reflection is less than flattering.