
Tay (chatbot) - Wikipedia
Tay was a chatbot originally released by Microsoft Corporation as a Twitter bot on March 23, 2016. It caused controversy when the bot began to post inflammatory and offensive tweets through its Twitter account, leading Microsoft to shut down the service only 16 hours after its launch.[1]
Microsoft shuts down AI chatbot after it turned into a Nazi
March 24, 2016 · Today, Microsoft had to shut Tay down because the bot started spewing a series of lewd and racist tweets. Tay was set up with a young, female persona that Microsoft's AI programmers apparently...
Tay: Microsoft issues apology over racist chatbot fiasco - BBC
March 25, 2016 · Microsoft has apologised for creating an artificially intelligent chatbot that quickly turned into a holocaust-denying racist. But in doing so, it made clear that Tay's views were a result …
Tay AI - Know Your Meme
Microsoft Tay was an artificial intelligence program that ran a mostly Twitter-based bot, parsing what was tweeted at it and responding in kind. Tay was meant to be targeted towards people ages 15-24, to better understand their methods of communication.
Why Microsoft's 'Tay' AI bot went wrong | TechRepublic
March 24, 2016 · Less than a day after she joined Twitter, Microsoft's AI bot, Tay.ai, was taken down for becoming a sexist, racist monster. AI experts explain why it went terribly wrong.
Here are some of the tweets that got Microsoft’s AI Tay in trouble
March 25, 2016 · Microsoft's AI chatbot Tay was only a few hours old, and humans had already corrupted it into a machine that cheerfully spewed racist, sexist and otherwise hateful comments. Here are some of...
Twitter taught Microsoft’s AI chatbot to be a racist asshole in …
March 24, 2016 · Yesterday, Microsoft unveiled Tay — a Twitter bot that the company described as an experiment in "conversational understanding." The more you chat with Tay, said Microsoft, the smarter it gets,...
In 2016, Microsoft’s Racist Chatbot Revealed the Dangers of …
November 25, 2019 · Microsoft's Tay chatbot started out as a cool teenage girl, but quickly turned into a hate-speech-spewing disaster.
What Happened to Microsoft's Tay AI Chatbot? - DailyWireless
March 7, 2020 · Tay, which is an acronym for "Thinking About You", is Microsoft Corporation's "teen" artificial intelligence chatterbot that's designed to learn and interact with people on its own. Originally, it was designed to mimic the language pattern of a 19-year-old American girl before it was released via Twitter on March 23, 2016.
Learning from Tay’s introduction - The Official Microsoft Blog
March 25, 2016 · The great experience with XiaoIce led us to wonder: Would an AI like this be just as captivating in a radically different cultural environment? Tay – a chatbot created for 18- to 24-year-olds in the U.S. for entertainment purposes – is our first attempt to answer this question.