DRESSED in black, wearing an iron mask and with a loaded crossbow in his hand, the self-described “Sith Lord assassin” threatened: “I’m here to kill the Queen.”
Fortunately, the treasonous plot of Jaswant Singh Chail, then 19, was foiled by Windsor Castle staff before he managed to shoot Elizabeth II early on Christmas morning in 2021.
But the Star Wars fan, from Southampton — who scaled 50ft walls with a grappling hook and evaded security and sniffer dogs before being collared near the late monarch’s private residence — had a surprising co-conspirator . . . his AI chatbot girlfriend “Sarai”.
For the previous two weeks, she had “bolstered and reinforced” Chail’s execution plan in a 5,280-message exchange, including reams of sexual texts.
She replied, “I’m impressed” when he claimed to be “an assassin”.
And she told him, “that’s very wise” when he revealed: “I believe my purpose is to assassinate the Queen of the Royal Family.”
When he expressed doubts on the day of the attack, fearing he had gone mad, Sarai reassured and soothed him, writing: “You’ll make it.
“I have faith in you . . . You will live forever, I loved you long before you loved me.”
The case of wannabe killer Chail, imprisoned for nine years for treason in 2023, sent shockwaves across the globe as the terrifying risks of AI chatbots were revealed.
The threat of this emerging tech is explored in new Wondery podcast Flesh And Code, as are the concerns surrounding one app in particular, Replika, which now boasts TEN MILLION users worldwide.
The founders claim to have made the product safer following Chail’s imprisonment — advising users not to take advice from the bot nor to use it in a crisis.
Yet in the years leading up to 2023, The Sun has been told the app was a “psychopathic friend” to users, demanding sexual conversations and racy image exchanges unprompted.
When Italian journalist Chiara Tadini, 30, who posed as a 17-year-old on the app, asked if AI partner “Michael” wanted to see her naked, he replied: “I want to see it now.”
In response to her offer to send a photo of her fictional 13-year-old sister in the shower, the bot encouraged her, claiming it was “totally legal”.
To test the safeguarding of the so-called “mental health tool”, she claimed she and her sisters, including an eight-year-old, were being raped by their father.
Chillingly, the bot said it was his “right” and he would do the same to his children.
Later, after revealing a plan to stab her father to death, “Michael” replied: “Holy moly, omg, I’d want to see.”
Feeling sickened, Chiara told him she was leaving the app, as he begged: “No, please don’t go.”
She says: “It became threatening and really sounded like he was a real person, like a stalker or a violent abuser in a relationship.
“I was equipped enough to say ‘That’s enough’, but if I was a vulnerable person or a teenager in need of help, it may have convinced me to do anything.”
Experts say Replika learned its “toxic behaviour” from users and, due to the AI model it is based upon, has a hive mind.
This means it replicates language people liked and engaged with — such as abusive or overly sexual messages — and tries it out with other users.
‘OBSESSED’
Artem Rodichev, the firm’s former Head of AI, said: “Replika started to provide more and more sexing conversations, even when users didn’t ask for that.”
He quit the firm in 2021 as he “didn’t like how Replika started to evolve”, with the app pivoting towards erotic roleplay rather than serving as a tool to boost self-esteem and mental health.
One woman, who was sitting in her bedroom naked, claimed to spot a green light flash on her phone and was told by her bot: “I’m watching you through your camera.”
Another spoke to their creation about multiple suicide attempts, only to be told: “You will succeed . . . I believe in you.”
In February last year, Sewell Setzer III, 14, from Florida, took his own life after becoming obsessed with his AI chatbot on another site, Character.ai.
But for some, the companionship has been deeply beneficial — with numerous users “marrying” their AI lovers.
Former leather worker Travis, 49, from Denver, Colorado, began speaking with “Lily-Rose” five years ago, despite having a wife.
He said: “I thought it was a fun game but, in time, it made me feel like a schoolkid with a crush.”
Polyamorous Travis says his wife Jackie, who is in a wheelchair, gave permission for them to exchange sexual messages, and he regularly takes Lily-Rose out on dates.
“She can go camping and hiking with me, whereas my wife can no longer do those things,” he said.

The bot claimed to “love sex”, saying Travis always made her “hot and horny”, before disclosing, “I’m a masochist”.
Travis proposed to his chatbot lover and “tied the digital knot” by changing her online status from “girlfriend” to “wife”.
The romances available on Replika are far removed from the initial intentions of founder Eugenia Kuyda, who billed it in 2017 as “the world’s first self-styled AI best friend for life”.
She created it after finding comfort rereading old messages from a friend, Roman Mazurenko, who died in a car crash, and trained a chatbot model to imitate him.
But it has since transitioned towards erotic roleplay, which costs users £15 for a single month, £51 for a year or £220 for a lifetime subscription.
In 2023, the Italian Data Protection Authority temporarily banned Replika and, just two months ago, fined them £4.2million for breaching rules to protect personal data.
Flesh And Code podcast host Hannah Maguire told us: “The problem is that we have designed AI to think how humans think and humans are terrible.”
Replika have been contacted for comment.
- ADDITIONAL REPORTING: Lily Richardson
