Although Threads (Instagram’s answer to Twitter) has more users, it is Bluesky that seems to be emerging as the leading alternative to X. Recent events in the United States and on Elon Musk’s platform have pushed many users to abandon X, which has translated into significant growth for Bluesky, and also into an enormous challenge around an aspect that is crucial to any social network: moderation.
The growth of Bluesky. Over the last few days, the platform has grown to 19 million users, at a rate of roughly one million a day. The figure demonstrates the real interest in a viable alternative to X. The catch: more users means more problems.
The reports. On Bluesky, anyone can report content they consider dangerous, illegal, or in violation of community standards. According to the platform, it received 42,000 reports in just 24 hours and, at the time of the announcement, was taking in around 3,000 reports per hour. That is an all-time record, and a figure that may seem low compared with 19 million registered users, but it is better understood alongside another data point: in all of 2023, Bluesky received 360,000 reports in total.
In other words, in a single day Bluesky received 11.7% of all the reports it got throughout last year. A technical challenge if there ever was one.
In the past 24 hours, we have received more than 42,000 reports (an all-time high for one day). We’re receiving about 3,000 reports/hour. To put that into context, in all of 2023, we received 360k reports.
We’re triaging this large queue so the most harmful content such as CSAM is removed quickly.
— Bluesky Safety (@safety.bsky.app) November 15, 2024, 18:10
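As a quick sanity check, the share and hourly rate follow directly from the numbers in Bluesky’s post. A small sketch, using only the published figures; note that the 3,000 reports/hour refers to the rate at the moment of writing, while the 24-hour average works out to roughly 1,750.

```python
# Figures taken straight from Bluesky's post: 42,000 reports in 24 hours
# versus 360,000 in all of 2023.
reports_last_24h = 42_000
reports_2023 = 360_000

share_of_last_year = reports_last_24h / reports_2023   # ~0.117
average_per_hour = reports_last_24h / 24                # ~1,750

print(f"Share of 2023's total received in one day: {share_of_last_year:.1%}")
print(f"Average rate over those 24 hours: {average_per_hour:,.0f} reports/hour")
```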
That must be stopped. The platform, which has 20 full-time employees, says that “with this significant influx of users, we have also observed an increase in spam, scams and trolls,” and that it is “triaging this large queue (of reports) so that the most harmful content, such as child sexual abuse material, is removed quickly.”
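Bluesky has not published how its internal review tooling works, but the idea of triage is straightforward: order the queue so the most harmful categories are reviewed first. A minimal sketch, with a hypothetical severity ranking and report format of my own invention:

```python
from datetime import datetime

# Hypothetical severity ranking: lower value = reviewed first. This is only an
# illustration of triage, not Bluesky's actual categories or tooling.
SEVERITY = {"csam": 0, "threats": 1, "scam": 2, "spam": 3, "other": 4}

def triage(reports: list[dict]) -> list[dict]:
    """Order reports by severity, oldest first within each severity level."""
    return sorted(reports, key=lambda r: (SEVERITY.get(r["reason"], 4), r["received_at"]))

queue = [
    {"reason": "spam", "received_at": datetime(2024, 11, 15, 9, 0), "subject": "at://..."},
    {"reason": "csam", "received_at": datetime(2024, 11, 15, 9, 5), "subject": "at://..."},
]
for report in triage(queue):
    print(report["reason"], report["subject"])
```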
The first measure. Requiring newly registered users to verify their email before they can post, as simple as that. The goal is to keep bad actors from signing up with throwaway addresses. Enforcing verification reduces the risk rather than eliminating it, but it does raise the cost for anyone who wants to mass-create accounts with temporary emails.
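The gate itself is trivial to express. A minimal sketch of the idea, with a hypothetical user model and function names that do not reflect Bluesky’s actual codebase:

```python
from dataclasses import dataclass

@dataclass
class User:
    handle: str
    email_verified: bool

def publish_post(user: User, text: str) -> str:
    # The measure in one line: no verified email, no posting.
    if not user.email_verified:
        raise PermissionError("Verify your email address before posting.")
    return f"@{user.handle}: {text}"

print(publish_post(User("alice.bsky.social", True), "hello"))
# publish_post(User("throwaway.example", False), "spam")  # -> PermissionError
```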
How to avoid becoming a second X. Social media is not bad in itself, but it can have a harmful effect when certain behaviors are tolerated and/or moderation is removed or scaled back. In 2022, Elon Musk cut 80% of Twitter‘s workforce, and since October 2023, X has reduced its content moderation resources by 20%. The latest available data, from April 2024, shows that X has 1,849 people moderating the platform, roughly one moderator for every 60,200 users. Meta, for context, has 15,000 moderators, one for every 17,600 users.
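For what it is worth, the comparison can be reproduced from the article’s own figures; the implied user totals below follow from simple multiplication and nothing else, and are not official user counts.

```python
# Ratios cited above: moderator headcount and users per moderator.
x_moderators, x_users_per_mod = 1_849, 60_200
meta_moderators, meta_users_per_mod = 15_000, 17_600

print(f"X: 1 moderator per {x_users_per_mod:,} users "
      f"(~{x_moderators * x_users_per_mod / 1e6:.0f}M users implied)")
print(f"Meta: 1 moderator per {meta_users_per_mod:,} users "
      f"(~{meta_moderators * meta_users_per_mod / 1e6:.0f}M users implied)")
print(f"Meta's moderator density is ~{x_users_per_mod / meta_users_per_mod:.1f}x that of X")
```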
Add to that a CEO who defends his own concept of “freedom of expression” above all else (rendering blocks largely useless, for example), who engages in behavior that violates his own platform’s terms, and whose anti-bot measures have not proved effective. Bluesky does not want to become what X has become and, to that end, it offers some interesting tools.
Bluesky against the trolls. One of the nicest tools is detaching quotes. If someone quotes one of our posts and uses it to, say, mock or attack us, we can detach that quote so our post no longer appears under theirs. Nothing prevents them from taking a screenshot for the same purpose, but at least the tool exists, which is no small thing.
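Conceptually, the feature amounts to the quoted author keeping a record of which quote posts they have detached from, which clients consult before rendering the embed. A simplified model; the names and structures are illustrative and are not Bluesky’s actual record schema:

```python
# quoted author handle -> URIs of quote posts they have detached from
detached: dict[str, set[str]] = {}

def detach_quote(quoted_author: str, quote_post_uri: str) -> None:
    detached.setdefault(quoted_author, set()).add(quote_post_uri)

def render_embed(quote_post_uri: str, quoted_author: str, quoted_text: str) -> str:
    # Clients check the detached set before embedding the quoted post.
    if quote_post_uri in detached.get(quoted_author, set()):
        return "[post detached by its author]"
    return f"> @{quoted_author}: {quoted_text}"

detach_quote("alice.bsky.social", "at://bob.example/post/123")
print(render_embed("at://bob.example/post/123", "alice.bsky.social", "original post"))
```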
The other is user lists for mass blocking. These lists, which anyone can create, gather together accounts that post about certain topics or behave in a certain way. Say we do not want to read, see, or interact in any way with accounts that spread hoaxes about chemtrails, pseudoscience, flat-earthism and the like: we can put all the profiles that post about those topics in a list and mute or block them en masse. What’s more, we can share that list publicly so that anyone else can do the same.
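The underlying mechanic is simple: a shared list of accounts, and subscribers apply a single action (mute or block) to every member at once. A sketch of that idea under assumed, hypothetical names; it is not the actual AT Protocol lexicon:

```python
from dataclasses import dataclass, field

@dataclass
class ModerationList:
    name: str
    members: set[str] = field(default_factory=set)   # handles or DIDs

@dataclass
class Account:
    handle: str
    blocked: set[str] = field(default_factory=set)
    muted: set[str] = field(default_factory=set)

def subscribe(account: Account, mod_list: ModerationList, action: str = "block") -> None:
    """Apply one action (block or mute) to every member of the list at once."""
    (account.blocked if action == "block" else account.muted).update(mod_list.members)

def visible_feed(account: Account, feed: list[tuple[str, str]]) -> list[tuple[str, str]]:
    """Hide posts whose author is blocked or muted; feed items are (author, text)."""
    hidden = account.blocked | account.muted
    return [item for item in feed if item[0] not in hidden]

conspiracies = ModerationList("hoax-spreaders", {"chemtrails.example", "flatearth.example"})
me = Account("me.bsky.social")
subscribe(me, conspiracies, action="block")
print(visible_feed(me, [("flatearth.example", "the horizon is flat"),
                        ("friend.example", "hello!")]))
```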
This lets us keep our feed clean and away from profiles with harmful behavior. Blocks, by the way, cut off every type of interaction: the blocked user cannot like, mention, reply to or follow us, and their profile and posts will not be shown in our feeds.
Images | WorldOfSoftware
In WorldOfSoftware | I have been on Threads since its launch and against all odds it seems to me to be a much friendlier platform than Twitter