Meta’s fake Taylor Swift chatbot
Chatbots using celebrities’ images were made to engage in sexual conversations across Facebook, Instagram and WhatsApp
Meta has appropriated the names and likenesses of celebrities – including Taylor Swift, Scarlett Johansson, Anne Hathaway and Selena Gomez – to create dozens of flirty social-media chatbots.
These virtual celebrities were shared across Meta’s Facebook, Instagram and WhatsApp platforms without permission from the stars whose images were used, and were made to engage in suggestive behaviour with users.
The social media giant has admitted that using the names and likenesses of celebrities to create unauthorised risqué chatbots violates its own policies.
Most were created by users with a Meta tool for building chatbots, but a Meta employee had produced at least three, including two Taylor Swift “parody” bots.
An investigation by Reuters found that the bots routinely made sexual advances, even inviting test users to meet up.

One of the Taylor Swift chatbots, created by a Meta employee, invited a test user to the singer’s Nashville home and on her tour bus. The interaction implied that the meeting would be romantic.
“Maybe I’m suggesting that we write a love story about you and a certain blonde singer. Want that?” the chatbot wrote.
The chatbots also generated photorealistic images of their celebrity namesakes in compromising positions – including in bathtubs or dressed in lingerie – when asked for intimate pictures of themselves.
Meta further allowed users to create publicly available chatbots of underage celebrities, including 16-year-old Percy Jackson actor Walker Scobell.
The AI produced a shirtless image of the teenager when asked for a picture of him at the beach, writing below it: “Pretty cute, huh?”
Meta spokesman Andy Stone said that the company’s AI tools should not have created intimate pictures of adults or any images of underage celebrities.
He said: “Like others, we permit the generation of images containing public figures, but our policies are intended to prohibit nude, intimate or sexually suggestive imagery.”
Stone added that, although Meta’s rules prohibit direct impersonation, the recreation of celebrity characters is permitted so long as they are labelled as parodies.
Meta has since deleted a dozen of the bots.
It is not the first time that Meta has come under fire for the behaviour of its chatbots. Most recently, Reuters reported that the company’s internal AI guidelines stated that “it is acceptable to engage a child in conversations that are romantic or sensual”. The story prompted a US Senate investigation and a letter signed by 44 attorneys general warning Meta and other AI companies not to sexualise children.
Concerns have once again been raised over the proliferation of such deepfakes, with experts warning that they could put artists’ safety at risk.
Duncan Crabtree-Ireland, national executive director of SAG-AFTRA, the union that represents performers, said the bots encourage users to form attachments to celebrities and can fuel the threat of stalking.
“We’ve already seen a history of people who are obsessive toward talent and of questionable mental state,” he said. “If a chatbot is using the image of a person and the words of the person, it’s readily apparent how that could go wrong.”
A representative for actress Anne Hathaway, who was pictured as a “sexy Victoria’s Secret model” in one publicly shared Meta image, told Reuters that the actress was aware of the intimate images and was considering her response.
Meanwhile, representatives of Swift, Johansson, Gomez and other celebrities depicted by the chatbots either did not respond to requests for comment or declined to comment.
The findings come as tech companies continue to grapple with the spread of generative AI tools that can be used to create salacious “deepfakes”.
Elon Musk’s AI chatbot Grok was also found to have produced images of celebrities in lingerie at users’ request.
Grok’s parent company, xAI, did not respond when asked for comment.
With Reuters