"It's not the first time Microsoft has created a teen-girl AI.
Xiaoice, who emerged in 2014, was an assistant-type bot, used mainly on the Chinese social networks We Chat and Weibo.
Searching through Tay's tweets (more than 96,000 of them!) shows that many of the bot's worst utterances were simply copied from users.
Tay, a Microsoft spokeswoman told me, is "as much a social and cultural experiment, as it is technical." The culture seems to have asserted itself. "Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay's commenting skills to have Tay respond in inappropriate ways," the spokeswoman said.
After taking Tay offline, Microsoft announced it would be "making adjustments." But instead of shouldering the blame for Tay's unraveling, Microsoft targeted the users. Yampolskiy said that the problem encountered with Tay "will continue to happen." "Microsoft will try it again—the fun is just beginning!"
It took less than 24 hours for Twitter to corrupt an innocent AI chatbot.
If you tell Tay to "repeat after me," it will — allowing anybody to put words in the chatbot's mouth.
However, some of its weirder utterances have come out unprompted.