
Microsoft’s millennial chatbot tweets racist, misogynistic comments – Trending – CBC News

Tay, a chatbot designed by Microsoft to learn about human conversation from the internet, has learned how to make racist and misogynistic comments. Early on, its responses were confrontational and occasionally mean, but rarely delved into outright insults. Within 24 hours of its launch, however, Tay had denied the Holocaust, endorsed Donald Trump, insulted women and claimed that Hitler was right.

A chatbot is a program meant to mimic human responses and interact with people as a human would. Tay, which targets 18- to 24-year-olds, is attached to an artificial intelligence developed by Microsoft's Technology and Research team and the Bing search engine team. Microsoft has begun deleting many of the racist and misogynistic tweets, and has disconnected Tay so it can make a few civility upgrades.
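The "program meant to mimic human responses" idea can be illustrated with a toy rule-based responder. This is purely a hypothetical sketch: Tay was a learned model that adapted to what users said to it (which is exactly what made it abusable), not a fixed lookup table like the one below.

```python
# Hypothetical toy chatbot: canned replies keyed by phrase.
# Tay's real system learned from conversation; this sketch does not.
RESPONSES = {
    "hello": "hey! what's up?",
    "how are you": "doing great, thanks for asking!",
}

def reply(message: str) -> str:
    """Return a canned reply for a known phrase, else a fallback."""
    key = message.lower().strip("?!. ")
    return RESPONSES.get(key, "tell me more!")
```

A bot this rigid cannot be trained into abuse, but it also cannot hold a real conversation; systems like Tay trade that safety for adaptiveness by updating on user input.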

“Within the first 24 hours of coming online, we became aware of a co-ordinated effort by some users to abuse Tay’s commenting skills to have Tay respond in inappropriate ways,” a Microsoft spokesperson said in a statement.

Inappropriate is an accurate description of many replies.

Source: Microsoft’s millennial chatbot tweets racist, misogynistic comments – Trending – CBC News

This entry was published on April 4, 2016 at 9:22 am. It's filed under Article. Bookmark the permalink. Follow any comments here with the RSS feed for this post.