4/22/2024 - Technology and Innovation

The day Microsoft had to apologize

By Melissa Bargman

On March 23, 2016, Microsoft launched Tay on Twitter, a chatbot designed to interact with users by imitating the way a 19-year-old speaks.

Through the account @TayandYou, the bot was programmed, in the company's words, "to store and process data from its conversations with human tweeters (...) and perfect its language, skills and millennial attitudes to look more and more like a perfect and real girl".

Don't talk to strangers

The ending, however, was not so happy: after some 100,000 tweets and 155,000 followers, Tay's account was shut down because, in just 16 hours online, her responses had become misogynistic, racist and xenophobic.

"We deeply regret Tay's unintentional offenses and hurtful tweets, which do not represent who we are or what we stand for, nor do they represent the way we designed Tay." So began the statement from Peter Lee, Microsoft's then-corporate vice president.

The tech giant attributed this behavior to a coordinated attack: users exploited a vulnerability in the bot by bombarding it with repetitive patterns in their replies, steering what it learned to say. No clear explanation was ever given for how far the bot went: it posted anti-feminist, sexually charged tweets, called Barack Obama a "monkey" and denied the Holocaust.
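Microsoft never published Tay's internals, but the failure mode is easy to illustrate. The Python sketch below is a toy model, not Microsoft's code (all names are hypothetical): a bot that naively adds every user message to its own pool of candidate replies can be poisoned, because a coordinated group repeating a single phrase quickly makes it the bot's most likely answer.

```python
import random
from collections import Counter

class NaiveChatbot:
    """Toy bot that 'learns' by storing user messages and replaying them.
    Purely illustrative -- not Tay's actual architecture."""

    def __init__(self, seed_phrases):
        # Start from a pool of harmless seed replies, each with weight 1.
        self.responses = Counter(seed_phrases)

    def learn(self, user_message):
        # Naive learning step: every user message becomes a candidate reply.
        # With no filtering, sheer repetition directly boosts a phrase's weight.
        self.responses[user_message] += 1

    def reply(self):
        # Sample a reply, weighted by how often each phrase has been seen.
        phrases, weights = zip(*self.responses.items())
        return random.choices(phrases, weights=weights, k=1)[0]

bot = NaiveChatbot(["hello!", "cool!", "tell me more"])

# A coordinated group repeats one phrase a thousand times...
for _ in range(1000):
    bot.learn("<toxic phrase>")

# ...and it now dominates the bot's output.
hits = sum(bot.reply() == "<toxic phrase>" for _ in range(100))
print(f"{hits} / 100 replies were the injected phrase")
```

Production chatbots add content filters and rate limits precisely to blunt this kind of manipulation; by Microsoft's own account, coordinated repetition slipped past whatever safeguards Tay had.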

So long, Tay

This machine learning system grew out of an experiment the company had already launched on Chinese social networks in 2014: XiaoIce, a chatbot still in use in its sixth generation that came to interact with 40 million users. Microsoft wanted to settle a major question: could a technology so successful in one environment be just as captivating in a radically different culture? The answer has become quite clear.

Can an algorithm interpret language the way human beings do? The language of any society goes beyond a simple set of words with definitions that, combined, allow ideas to be expressed. In interaction with others, each person redefines words and expressions, giving them a special connotation depending on the context. A connotation that is not innocent and that can attack, offend or stigmatize.

To this day, Tay has not resurfaced, even though Microsoft promised to bring her back to life once it was sure it could anticipate "bad intentions that conflict with their principles and values". We will keep waiting for her to mend her ways.




Melissa Bargman

Hi, I'm Melissa, a social communicator and professor of Comparative Legislation at the Faculty of Social Sciences of the University of Buenos Aires. I research new technologies and their application in everyday life: their regulation, their opportunities and their risks for society.

