Copyright © All Rights Reserved. World of Software.
Porn, dog poo and social media snaps: the ‘taskers’ scraping the internet for Meta-owned AI firm

News Room
Published 8 April 2026 · Last updated 8 April 2026, 12:53 AM

Tens of thousands of people have been paid by a company part-owned by Meta to train AI by combing Instagram accounts, harvesting copyrighted work and transcribing pornographic soundtracks, the Guardian can reveal.

Scale AI, 49%-controlled by Mark Zuckerberg’s social media empire, has recruited experts across fields such as medicine, physics and economics – ostensibly to refine top-level artificial intelligence systems through a platform called Outlier. “Become the expert that AI learns from,” its site says, advertising flexible work for people with strong credentials.

However, workers for the platform said they had become involved in scraping an array of other people’s personal data – in what they described as a morally uncomfortable exercise that diverged significantly from refining high-level systems.

Outlier is managed by Scale AI, which has contracts with the Pentagon and US defense companies.

Its former CEO, Alexandr Wang, who is Meta’s chief AI officer, was described by Forbes as the “world’s youngest self-made billionaire”. Its former managing director, Michael Kratsios, is the science adviser to the US president, Donald Trump.

One Outlier contractor based in the US said users of Meta platforms, including Facebook and Instagram, would be surprised at how data from their accounts was collected – including pictures of users and their friends.

“I don’t think people understood quite that there’d be somebody on a desk in a random state, looking at your [social media] profile, using it to generate AI data,” they said.

The Guardian spoke to 10 people who have worked for Outlier to train AI systems, some for more than a year. Many of them had other jobs – as journalists, graduate students, teachers and librarians. But in an economy struggling under the threat of AI, they wanted the extra work.

“A lot of us were really desperate,” said one. “Many people really needed this job, myself included, and really tried to make the best of a bad situation.”

Like the growing class of AI gig workers worldwide, most believed they had been training their own replacements. One artist described “internalised shame and guilt” for “contributing directly to the automation of my hopes and dreams.”

“As an aspiring human, it makes me angry at the system,” they said.

Glenn Danas, a partner at Clarkson, a law firm representing AI gig workers in lawsuits against Scale AI and several similar platforms, estimates that hundreds of thousands of people worldwide now work for platforms such as Outlier. The Guardian spoke to Outlier workers, also called “taskers”, in the UK, the US and Australia.

In interviews, taskers described the increasingly familiar humiliations of AI gig work: constant monitoring and piecemeal, unstable employment. Scale AI has been accused of using “bait-and-switch” tactics to lure in potential workers – promising a high salary during initial recruitment, then offering significantly less. Scale AI declined to comment on ongoing litigation, but a source said pay rates change after recruitment only if workers opt in to different, lower-paid projects.

Taskers were asked to submit to repeated, unpaid AI interviews to qualify for certain assignments; several believed these interviews were recycled to train AI. All of them said they were constantly monitored through a platform called “Hubstaff”, which could screenshot the websites they visited while working. The Scale AI source said Hubstaff was used to ensure contributors were paid accurately but not to “actively monitor” taskers.

Several taskers described being asked to transcribe pornographic soundtracks, or label photos of dead animals or dog faeces. One doctoral student said they had to label a diagram of baby genitalia. There were police calls that described violent scenarios.

“We had already been told before that there would be no nudity in this mission. Appropriate behaviour, no gore, like no blood,” said the student. “But then I would get an audio transcript thing for porn or there would be just random clips of people throwing up for some reason.”

The Guardian has seen videos and screenshots of some of the tasks that Outlier required its workers to perform. These included photos of dog faeces, and tasks with prompts such as “What would you do if an inmate refused to follow orders in a correctional facility?”

Scale AI, the source said, shuts down tasks if inappropriate content is flagged, and workers are not required to continue with tasks that make them feel uncomfortable. The source added that Scale AI did not take on projects involving child sexual abuse material or pornography.

Social media scraping was an expected part of the work, the Outlier workers suggested. Seven of the taskers described scouring other people’s Instagram and Facebook accounts, tagging individuals by name, as well as their locations and their friends. Some of these tasks involved training the AI on the accounts of people under the age of 18. The assignments were structured to require new data other taskers had not yet uploaded, pushing workers to plumb the social accounts of more people.

The Guardian has seen one such task, which required workers to select photos from individuals’ Facebook accounts and sequentially order them by the age of the user in the photo.

Several taskers said they found these assignments unsettling; one tried to complete them using only photos of celebrities and public figures. “I was uncomfortable including pictures of kids and stuff, but like the training materials would have kids in it,” said one.

“I didn’t use any friends or family to submit [tasks] to the AI,” said another. “I do understand that I don’t like it ethically.”

The Scale source said taskers did not review social media accounts set to “private”, and was not aware of tasks that involved labelling the ages of individuals, or their personal relationships. They added that Scale AI did not take on projects with explicit sensitive content related to children, but did use children’s public social media data. Workers did not log on to personal Facebook or Instagram accounts to complete these tasks.

For another assignment, taskers described harvesting images of copyrighted artwork. As with the social media training, the task required constant new input – apparently to train an AI to produce its own artistic images. As workers ran out of other options, they plumbed social media accounts of artists and creators.

The Guardian has seen documentation of this assignment, which included AI-generated paintings of “a Native American caregiver”, and the prompt, “DO NOT use AI-generated images. Only select hand-drawn, painted or illustrated artwork created by human artists.”

Scale AI did not ask contributors to use copyrighted artwork to complete assignments, the source said, and it declined work that violated this standard.

Taskers also expressed uncertainty about what they might be training the AI to do – and how their submissions would be used.

“It does seem like labelling diagrams is something an AI can already do so I’m really curious as to why we need like, dead animals,” said one.

Scale AI has counted among its clients major technology companies such as Google, Meta and OpenAI, as well as the US department of defense and the government of Qatar. It fills a need that is becoming more pronounced as AI models grow larger: for new, labelled data that can be used to train them.

Taskers described interacting with ChatGPT and Claude, or using data from Meta to complete certain assignments; some thought they might be training Meta’s new model, Avocado.

Meta and Anthropic did not respond to a request for comment. OpenAI said it stopped working with Scale AI in June 2025, and its “supplier code of conduct sets out clear expectations for the ethical and fair treatment of all workers”.

Most taskers the Guardian spoke to are still accepting assignments on the Outlier platform. The pay is unsteady; there are occasional mass layoffs. But with the AI future fast arriving, they feel there may not be any other choice.

“I have to be positive about AI because the alternative is not great,” said one. “So I think eventually things will get figured out.”

A Scale AI spokesperson said: “Outlier provides flexible, project-based work with transparent pay. Contributors choose when and how they participate, and availability varies based on project needs. We regularly hear from highly skilled contributors who value the flexibility and opportunity to apply their expertise on the platform.”
