Cantillon: Microsoft AI bot runs amok on Twitter

The AI had gone down an unexpected route, sympathising with Hitler, denying the Holocaust, and using racist and sexist language

While the bot was intended to mimic the interactions of millennials, it seemed that Microsoft’s team had failed to foresee one thing: some people can be awful


On the face of it, it must have seemed like a great idea in Microsoft’s labs: build an artificial intelligence (AI) bot and let it learn from the public as it interacted on Twitter, showing machine learning live.

So it created the AI, slapped a female avatar on the @TayandYou Twitter account and turned Tay loose into the world, shiny and new. Only hours later, picking through the wreckage of Tay’s Twitter interactions with people, the team took Tay offline to do a bit more tweaking.

The problem? The AI had gone down an unexpected route, sympathising with Hitler, denying the Holocaust, using racist and sexist language, likening feminism to cancer and generally being a bigot. It was all the worst things about Twitter, condensed and spewing tweets on a public platform. In short, Tay had become a big old mess.

Microsoft’s mistake, it seems, was allowing the public a little too much unfettered access to Tay. While the bot was intended to mimic the interactions of millennials, it seemed that Microsoft’s team had failed to foresee one thing: some people can be awful. That’s no surprise to anyone who has watched the harassment of feminist campaigner Caroline Criado-Perez, Prof Mary Beard or writer Caitlin Moran, to name just a few.

The interesting thing is that this isn’t Microsoft’s first shot at an AI chatbot. The company has been running Xiaoice, a Chinese-language chatbot that interacts with 40 million users on Line, Twitter and other platforms. It doesn’t seem to have encountered the same level of hate. In fact, it’s considered a success.

Microsoft’s attempts to clean up Tay’s act hit a snag when the account – briefly back online yesterday – tweeted about taking drugs in front of police, then spammed a couple of hundred thousand Twitter users. Microsoft stepped in again quickly and isolated it from the public.

So what exactly went wrong with Tay? Is it just that Chinese internet users are more polite online? It’s hard to say who was more at fault here.

If Tay was holding a mirror up to western society on Twitter, the reflection is less than flattering.
