Microsoft's chatbot is yanked offline after Twitter users warp it with racism


Update: A Microsoft spokesperson has provided us with the following statement, responding to speculation as to why Tay was taken down:

"The AI chatbot Tay is a machine learning project, designed for human engagement. It is as much a social and cultural experiment, as it is technical. Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay's commenting skills to have Tay respond in inappropriate ways. As a result, we have taken Tay offline and are making adjustments."

Bad day for Tay

Then, early this morning, after only 16 hours of uptime, Tay was pulled from Twitter, signing off with a tweet: "C u soon humans need sleep now so many conversations today thx".

Naturally enough, Microsoft appears to have deleted the vast majority of Tay's tweets containing racist or otherwise offensive content, and says it is "making adjustments." Those adjustments will likely involve curbing the bot's tendency to parrot such statements back.

It's something of a shame, as in our conversations yesterday Tay came up with some interesting and, in some cases, amusing responses. Such as…

[Image: Tay's reply]

Well said, Tay, in that case. Well said.
