For Patrizia Schlosser, it started with an apologetic call from a colleague. “I’m sorry but I found this. Are you aware of it?” He sent over a link, which took her to a site called Mr DeepFakes. There, she found fake images of herself, naked, squatting, chained, performing sex acts with various animals. They were tagged “Patrizia Schlosser sluty FUNK whore” (sic).
“They were very graphic, very humiliating,” says Schlosser, a German journalist for Norddeutscher Rundfunk (NDR) and Funk. “They were also very badly done, which made it easier to distance myself, and tell myself they were obviously fake. But it was very disturbing to imagine somebody somewhere spending hours on the internet searching for pictures of me, putting all this together.”
The site was new to Schlosser, despite her previous high-profile investigations into the porn industry. “I’d never heard of Mr DeepFakes – a porn site entirely dedicated to fake porn videos and photos. I was surprised by how big it was – so many videos of every celebrity you know.” Schlosser’s first reaction upon seeing herself among them was to brush it aside. “I tried to push it to the back of my mind, which was really a strategy of not dealing with it,” she says. “But it’s strange how the brain works. You know it’s fake but still you see it. It’s not you but also it is you. There you are with a dog and a chain. You feel violated but confused. At some point, I decided: ‘No. I’m angry. I don’t want those images out there.’”
Schlosser’s subsequent documentary for NDR’s STRG_F program did succeed in getting the images removed. She also tracked down the young man who had created and posted them – even visiting his home and speaking to his mother. (The perpetrator himself wouldn’t come out of his bedroom.) However, Schlosser was unable to identify “Mr DeepFakes” – or whoever was behind the site, despite enlisting the help of Bellingcat, the online investigative journalism collective. Bellingcat’s Ross Higgins was on the team. “My background is investigating money laundering,” he says. “I looked at the structure of the website and it was using the same internet service providers (ISPs) as proper serious organized criminals.” The ISPs suggested links to the Russian mercenary group Wagner, and individuals named in the Panama Papers. The ads it carried included ones for apps owned by Chinese technology companies, which allowed China’s government access to all customer data. “I made the assumption that this was all much too sophisticated to be a site of hobbyists,” says Higgins.
It turned out that’s exactly what it was.
The story of Mr DeepFakes, the world’s largest, most notorious nonconsensual deepfake porn site, is really the story of AI porn itself – the very term “deepfake” is believed to have come from its originator. A “ground zero” for AI-generated pornography, its pages – which have been viewed more than 2bn times – have depicted countless female celebrities, politicians, European princesses, wives and daughters of US presidents, being kidnapped, tortured, shaved, bound, mutilated, raped and strangled. Yet all this content (which would take more than 200 days to watch) was just the site’s “shop window”. Its true heart, its “engine room”, was its forum. Here, anyone wanting deepfakes created of someone they knew (a girlfriend, sister, classmate or colleague) could find someone willing to make them to order for the right price. It was also a “training ground”, a technical hub where “hobbyists” taught one another, shared tips, posted academic papers and “problem-solved”. (One recurring problem was how to deepfake without a good “dataset”. This means when you’re trying to deepfake someone you don’t have many pictures of – so not a celebrity, but maybe someone you know whose social media you’ve screengrabbed.)
The film-maker and activist Sophie Compton spent many hours monitoring Mr DeepFakes while researching the award-winning 2023 documentary Another Body (available on iPlayer). “Looking back, I think that site played such an instrumental role in the proliferation of deepfakes overall,” she says. “I really think that there’s a world in which the site didn’t get made, wasn’t allowed to be made or was shut down quickly, and deepfake porn is just a fraction of the issue that we have today. Without that site, I don’t think it would have exploded in the way it did.”
In fact, that scenario was entirely possible. The origins of Mr DeepFakes stretch back to 2017-18 when AI porn was just beginning to build on social media sites such as Reddit. One anonymous Redditor and AI porn “pioneer” who went by the name of “deepfakes” (and is thus credited with coining the term) gave an early interview to Vice about its potential. Shortly after, though, in early 2018, Reddit banned deepfake porn from its site. “We have screenshots from their message boards at that time and the deepfake community, which was small, was freaking out and jumping ship,” says Compton. This is when Mr DeepFakes was created, with the early domain name dpfks.com. The administrator carried the same username – dpfks – and was the person who advertised for volunteers to work as moderators, and posted rules and guidelines, as well as deepfake videos and an in-depth guide to using software for deepfake porn.
“What’s so depressing about reading the messages and seeing the genesis is realizing how easily governments could have stopped this in its tracks,” says Compton. “The people doing it didn’t believe they were going to be allowed free rein. They were saying: ‘They’re coming for us!’, ‘They’re never going to let us do this!’ But as they continued without any problems at all, you see this growing emboldenment. Covid added to the explosion as everyone stopped moderating content. The output was violent – it was about degrading someone completely. The celebrities that were really popular were often really young – Emma Watson, Billie Eilish, Millie Bobby Brown.” (Greta Thunberg is another example here.)
Who was behind it? From time to time, Mr DeepFakes gave anonymous interviews. In a 2022 BBC documentary, Deepfake Porn: Could You Be Next?, the site’s “owner” and “web developer”, going by the pseudonym “deepfakes”, made the argument that consent from the women wasn’t required as “it’s a fantasy, it’s not real”.
Was money their motivation? Mr DeepFakes ran ads and had a premium membership paid in cryptocurrency – in 2020, one forum post mentioned that the site made between $4,000 and $7,000 a month. “There was a commercial aspect,” says Higgins. “It was a side hustle, but it was more than that. It gave this notoriety.”
At one point, the site “posted 6,000 pictures of AOC’s (the US politician Alexandria Ocasio-Cortez’s) face in order that people could make deepfake pornography of her,” says Higgins. “It’s insane. (There were) all these files of YouTubers and politicians. What it’s saying is that if you’re a woman in this world you can only achieve so much because if you put your head above the parapet, if you have the temerity to do anything publicly, you can expect your image to be used in the most degrading way possible for personal profit.
“The most affecting thing for me was the language used about women on that site,” he continues. “We had to change it for our online report because we didn’t want it to be triggering, but this is pure misogyny. Pure hatred.”
This April, investigators began to believe that they had found Mr DeepFakes and sent emails to their suspect.
On 4 May, Mr DeepFakes shut down. A notice on its homepage blamed “data loss” caused by the withdrawal of a “critical service provider”. “We will not be relaunching,” it continued. “Any website claiming this is fake. This domain will eventually expire and we are not responsible for future use. This message will be removed in about a week.”
Mr DeepFakes is finished – but according to Compton, this could have happened so much sooner. “All the signs were there,” she says. The previous year, in April 2024, when the UK government announced plans to criminalize the creation and sharing of deepfake sexual abuse material, Mr DeepFakes responded by immediately blocking access to UK users. (The plans were later shelved when the 2024 election was called.) “It showed that ‘Mr DeepFakes’ was obviously not so committed that there was nothing governments could do,” says Compton. “If it was going to become too much of a pain and a risk to run the site, then they weren’t going to bother.”
But deepfake porn has become so popular, so mainstream, that it no longer requires a “base camp”. “The things that those guys prided themselves on learning how to do and teaching others are now so embedded, they’re accessible to anyone on apps at the click of a button,” says Compton.
And for those wanting something more complex, the creators, the self-styled experts who once lurked on its forum, are now out there touting for business. Patrizia Schlosser knows this for sure. “As part of my research, I went undercover and reached out to some of the people on the forums, asking for a deepfake of an ex-girlfriend,” says Schlosser. “Although it’s often claimed the site was only about celebrities, that wasn’t true. The response was, ‘Yeah, sure …’
“After Mr DeepFakes shut down, I got an automatic email from one of them which said: ‘If you want anything made, let me know … Mr DeepFakes is down – but of course, we keep working.’”
In the UK and Ireland, Samaritans can be contacted on freephone 116 123, or email jo@samaritans.org or jo@samaritans.ie. In the US, you can call or text the 988 Suicide & Crisis Lifeline at 988 or chat at 988lifeline.org. In Australia, the crisis support service Lifeline is 13 11 14. Other international helplines can be found at befrienders.org
In the UK, Rape Crisis offers support for rape and sexual abuse on 0808 802 9999 in England and Wales, 0808 801 0302 in Scotland, or 0800 0246 991 in Northern Ireland. In the US, Rainn offers support on 800-656-4673. In Australia, support is available at 1800Respect (1800 737 732). Other international helplines can be found at ibiblio.org/rcip/internl.html
