Taylor Swift's lawyers threatened legal action against Microsoft over the company's Tay chatbot. The Guardian reports that a new book from Microsoft president Brad Smith reveals that attorneys for Taylor Swift were unhappy that the company used the Tay name for its chatbot. Microsoft's chatbot was originally designed for conversations with teens via social media, but Twitter users turned it into a racist chatbot in less than a day.
Smith checked his email during a vacation and discovered that Taylor Swift's team was demanding a name change for the Tay chatbot. "An email had just arrived from a Beverly Hills lawyer who introduced himself by telling me: 'We represent Taylor Swift, on whose behalf this is addressed to you.'" The lawyers argued that "the use of the name Tay created a false and misleading association between the popular singer and our chatbot, and that it violated federal and state laws," writes Smith in Tools and Weapons, a new book about how technology both empowers and threatens us.
It is not exactly clear when Taylor Swift's lawyers contacted Microsoft about the Tay name, but they were probably not happy with the misogynistic, racist, and Hitler-promoting junk the bot published on Twitter. Microsoft quickly apologized for the offensive material posted by its AI bot and pulled the plug on Tay after less than 24 hours.