Microsoft Deletes Chatbot Over Offensive Tweets

Microsoft launched an artificial intelligence chatbot on Twitter to see whether interacting with other social media users would teach the program conversational skills. The bot, named Tay, had a female profile picture. Sadly, Tay's developers neglected to consider what happens when the internet public is invited to exert influence.

Tay proved a smash hit with racists, trolls, and online troublemakers, who persuaded the bot to blithely use racial slurs, defend white-supremacist propaganda, and even outright call for genocide.

Microsoft has now taken Tay offline for "upgrades" and is deleting some of the worst tweets, though many still remain. It's important to note that Tay's racism is not a product of Microsoft or of Tay itself. Tay is simply a piece of software trying to learn how humans talk in conversation. Tay doesn't even know it exists, or what racism is. The reason it spouted garbage is that racist humans on Twitter quickly spotted a vulnerability — that Tay didn't understand what it was talking about — and exploited it.

Tay's Twitter feed has since been deleted while the team makes "adjustments," which will presumably include filters. It's a reminder that any public demonstration of engineering, like any PR campaign, should first be run past a devil's advocate. You can see some examples of the offensive deleted tweets at Business Insider. -via Digg
