NewsBite

Microsoft artificial intelligence chatbot ‘Tay’ suspended after one day when it started spouting offensive comments

AN artificial-intelligence chatbot designed by Microsoft to tweet like a teenage girl has been suspended after one day because of its offensive, foul-mouthed remarks.

Microsoft said it was making adjustments to the Twitter chatbot after users found a way to manipulate it into tweeting racist and sexist remarks and references to Hitler.

The chatbot, named Tay, debuted on Wednesday with a couple of perky tweets.

Tay was designed to learn how to communicate through conversations with real humans, via the Twitter account @TayandYou.

Tay was aimed at young Americans aged 18 to 24, and was intended to reflect their concerns and voice.

Microsoft’s statement on Thursday said that within 24 hours, “we became aware of a co-ordinated effort by some users to abuse Tay’s commenting skills to have Tay respond in inappropriate ways.”

The offensive tweets, which have since been deleted, covered a wide range of sexist, racist, and anti-Semitic subjects.

Tay amassed over 70,000 followers before Microsoft pulled the plug.

The Twitterverse was merciless in its appraisal of Microsoft’s artificial intelligence experiment.

Original URL: https://www.news.com.au/technology/online/social/microsoft-artificial-intelligence-chatbot-tay-suspended-after-one-day-when-it-started-spouting-offensive-comments/news-story/3e4846c78298cfb652cbe9e936b619f6