

Taylor Swift once threatened to sue Microsoft over its bot


Author : Indo Asian News Service


New Delhi, Sep 11 (IANS) Inspired by the success of its Chinese female AI-based social chatbot "XiaoIce," Microsoft launched an American avatar named 'Tay' that immediately ran into a controversy with singer Taylor Swift.

Revealing the controversy for the first time, Microsoft President Brad Smith, in his new book titled "Tools and Weapons: The Promise and the Peril of the Digital Age," said that everything went wrong with Tay, including its name.

"The world's different taste in technology was revealed when we brought 'XiaoIce' to the US in the spring of 2016. We launched her to the US market with the new name 'Tay'. The new name turned out to be just the start of our problems with the American debut of XiaoIce," Smith wrote.

On a vacation, he received an email: "We represent Taylor Swift, on whose behalf this is directed to you."

The Beverly Hills lawyer representing Taylor went on to state that "the name 'Tay', as I'm sure you must know, is closely associated with our client."

The lawyer argued that the use of the name Tay "created a false and misleading association between the popular singer and our chatbot, and that it violated federal and state laws."

Smith said that Microsoft lawyers took a different view, but "we had not sought to pick a fight with or even offend Taylor Swift," as the company grappled with larger issues with Tay.

The AI chatbot Tay ran into trouble when Twitter users began bombarding the "innocent" bot with racist and offensive comments, which it learned to repeat. Launched as an experiment to engage people through "casual and playful conversation", Tay was soon taken off Twitter.

"In little more than a day, we had to withdraw Tay from the market, providing a lesson not just about cross-cultural norms, but also about the need for stronger AI safeguards," said Smith.

Microsoft even apologised in a blog post for Tay's "unintended offensive and hurtful tweets."



