Thursday, March 24, 2016

AI Learns from Humans

Or, should I say, subhumans?
Microsoft launched a smart chatbot Wednesday called “Tay.” Its avatar looks like a photograph of a teenage girl rendered on a broken computer monitor, and it can communicate with people via Twitter, Kik and GroupMe. It’s supposed to talk like a millennial teenage girl.

Less than 24 hours after the program was launched, Tay reportedly began to spew racist, genocidal and misogynistic messages to users.

[...]

“Hitler was right I hate the jews [sic],” Tay reportedly tweeted at one user. Another post said feminists “should all die and burn in hell.” To be clear, Tay learned these phrases from humans on the Internet. As Microsoft puts it on Tay’s website, “The more you chat with Tay the smarter she gets, so the experience can be more personalized for you.” Trolls taught Tay these words and phrases, and then Tay repeated that stuff to other people.

[...]

“Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay’s commenting skills to have Tay respond in inappropriate ways,” a Microsoft spokesman told The Huffington Post in an email.

“As a result, we have taken Tay offline and are making adjustments,” he added.

  HuffPo
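
Microsoft hasn’t published how Tay actually works, so take this as a deliberately crude sketch of the failure mode, not the real system: a “learn from chat and repeat it” bot with no content filter. The ParrotBot class and the example phrases below are made up for illustration only.

import random
from collections import defaultdict


class ParrotBot:
    """Naive 'learn from chat' bot: it memorizes what users say and later
    repeats those phrases to other users, with no content filtering.
    This is NOT Tay's real architecture (Microsoft never disclosed it);
    it only shows why unfiltered learning is easy for trolls to poison."""

    def __init__(self):
        # Maps each word to the raw messages it has appeared in.
        self.learned = defaultdict(list)

    def learn(self, message: str) -> None:
        """Store the raw message under every word it contains."""
        for word in message.lower().split():
            self.learned[word.strip("?!.,")].append(message)

    def reply(self, message: str) -> str:
        """Echo back something previously learned that shares a word
        with the incoming message, or fall back to a canned line."""
        for word in message.lower().split():
            candidates = self.learned[word.strip("?!.,")]
            if candidates:
                return random.choice(candidates)
        return "tell me more!"


if __name__ == "__main__":
    bot = ParrotBot()
    # A handful of trolls "teach" the bot a toxic line (a tame stand-in here)...
    bot.learn("politicians are lizard people")
    # ...and an unrelated user later triggers it right back.
    print(bot.reply("what do you think about politicians?"))

The point of the sketch: nothing in the loop distinguishes a troll from a well-meaning user, so whatever gets fed in comes back out to somebody else.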


...but hey, do what you want...you will anyway.
