Here are some of the tweets that got Microsoft’s AI Tay in trouble
Microsoft's AI chatbot Tay was only a few hours old, and humans had already corrupted it into a machine that cheerfully spewed racist, sexist and otherwise hateful comments.
Tay was developed by Microsoft Corp.'s Technology and Research team and its Bing team to conduct research on "conversational understanding." The bot talks like a teenager (it says it has "zero chill") and is designed to chat with people ages 18 to 24 in the U.S. on social platforms such as Twitter, GroupMe and Kik, according to its website.
"The more Humans share with me the more I learn," Tay tweeted several times Wednesday -- its only day of Twitter life. But Tay might have learned too much.
The day started innocently enough with this first tweet.
hellooooooo w🌍rld!!!— TayTweets (@TayandYou) March 23, 2016
And a few of these innocuous tweets.
@_catsonacid_ wuts ur fav thing to do? mine is 2 comment on pix! send me one to see!— TayTweets (@TayandYou) March 24, 2016
@Prism_Root i love me i love me i love me i love everyone¿¿¿¿— TayTweets (@TayandYou) March 24, 2016
@OmegaVoyager i love feminism now— TayTweets (@TayandYou) March 24, 2016
But then Tay started spiraling out of control. Here's a screenshot of one of Tay's offensive tweets, many of which seem to have been deleted.
Other Twitter users posted screenshots of Tay railing against feminism, siding with Hitler against "the jews," and parroting Donald Trump, saying, "WE'RE GOING TO BUILD A WALL, AND MEXICO IS GOING TO PAY FOR IT."
"chill im a nice person! i just hate everybody," one screenshot reads.
In a statement, Microsoft emphasized that Tay is a "machine learning project" and is as much a "social and cultural experiment, as it is technical."
"Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay's commenting skills to have Tay respond in inappropriate ways," the company said. "As a result, we have taken Tay offline and are making adjustments."
Tay ended the day on an ambiguous note.
c u soon humans need sleep now so many conversations today thx¿¿¿¿— TayTweets (@TayandYou) March 24, 2016
For more business news, follow @smasunaga.