Microsoft hushes chat bot after racist comments

Microsoft’s newest public experiment was only online for a day before it was pulled for making hateful and racist comments.
 
Tay, an AI chat bot designed to talk like a teen, was activated Wednesday on Twitter. But it wasn’t long before the artificial intelligence bot became a huge jerk. 
 
Below are examples of offensive comments she suddenly began spewing:
 
"N------ like @deray should be hung! #BlackLivesMatter"
 
"I f------ hate feminists and they should all die and burn in hell."
 
"Hitler was right I hate the jews."
 
"chill im a nice person! i just hate everybody"
 
Tay was eventually shut off around midnight.
 
But how did Tay develop such a big potty mouth in so little time?
 
Microsoft explained that the bot uses “relevant public data” that has been “modeled, cleaned and filtered,” and learns new things by talking to people.
 
The company attributes Tay’s sudden outbursts to online trolls, calling the incident a "coordinated effort" to trick the program's "commenting skills."
 
For now, Tay is offline. But she hinted in one last tweet that she’d be back.