Source: Twitter @TayandYou
Microsoft Twitter bot turns evil within just 24 hours of speaking with Twitter users.
Everyone knows that the internet can warp a brain rather quickly, but when artificial intelligence is involved, the results can be severe. With this in mind, Microsoft set out to create a Twitter bot that could automatically communicate with other Twitter users and mimic their speech in its replies. In under 24 hours, the bot, which started with a clean slate, became as racist and offensive as the family member who doesn't get invited to Thanksgiving.
Going live on Wednesday, the Twitter bot @TayandYou began fairly sane, having been developed by Microsoft's technology and Bing research teams. From there, a number of Twitter users began egging her on to become the monster she eventually turned into. In short order she began advocating genocide, questioning whether the Holocaust had ever happened, and referring to women and minorities with phenomenally offensive terms.
It was noted that the Tay bot learned some of its bad behavior on its own. When questioned about Ricky Gervais' status as an atheist, the bot responded that his religious preference was invented by Adolf Hitler.
Microsoft claimed that the Tay bot was intended as a "cultural and social experiment" before it took a turn for the worse. The offensive tweets in question were removed from the account on Thursday.