Some time ago, Microsoft introduced us to a virtual robot named Tay. Tay was supposed to appeal to teens and young adults by speaking their language and reflecting their culture. Her Twitter account is @Tayandyou, and she's supposed to get smarter with every tweet. But Tay is learning and spewing racism, hatred, and conspiracy theories as well.
In some well-publicized tweets, Tay referred to President Obama as "a monkey." Tay also tweeted that former President George W. Bush was responsible for 9/11. Tay went on to deny the Holocaust. Tay even went so far as to praise Hitler and side with white supremacist groups. This is just the tip of the hate berg. Some tweets she received and comments she dished out were so vile and hurtful they had to be deleted. Microsoft even had to take her offline to be fixed and upgraded. Tay was able to share such filth because Microsoft forgot to put any filters on her. A Microsoft statement summed this failed experiment up accurately: "The AI chatbot Tay is a machine learning project, designed for human engagement. As it learns, some of its responses are inappropriate and indicative of the types of interactions some people are having with it." Tay's sharp tongue has caused much concern over AI altogether.
What Tay did was expose what is in many tweeters' hearts. Tay picked up all this hatred, and that's what she gave back. For years, people have been tweeting ugliness most wouldn't dare say to someone's face. It's easy to give yourself a nickname, hide behind a screen, and bully and curse someone. I am usually wary of robot technology. But this is an experiment that can teach us all something: what goes in is going to come out. We shouldn't be surprised at Tay's hateful views if that's all she's getting. And if this is what today's teens and early twenty-somethings are teaching her, what does this say about this upcoming generation? Does Tay reveal what's in our hearts?