Anyone can try to edit Grokipedia 0.2 but Grok is running the show

News Room | Published 3 December 2025, last updated 1:40 PM

Elon Musk envisions Grokipedia — xAI’s AI-generated, anti-woke spin on Wikipedia — as a definitive monument to human knowledge, something complete and truthful enough to etch in stone and preserve in space. In reality, it’s a hot mess, and it’s only getting worse now that anyone can suggest edits.

Grokipedia was not always editable. When it first launched in October, its roughly 800,000 Grok-written articles were locked. I thought it was a mess then, too — racist, transphobic, awkwardly flattering to Musk, and in places straight-up cloned from Wikipedia — but at least it was predictable. That changed a few weeks ago, when Musk rolled out version 0.2 and opened the door for anyone to propose edits.

Proposing edits on Grokipedia is simple, so simple that the site apparently doesn’t feel the need to give instructions on how to do it. You highlight some text, click the “Suggest Edit” button, and fill in a form with a summary of the proposed change, with the option to suggest replacement content and provide supporting sources. Reviewing those suggestions is Grok, xAI’s problematic, Musk-worshipping AI chatbot. Grok, yes, the chatbot, is also the one that makes the actual changes to the site. Most edits on Wikipedia don’t require approval, but there is an active community of human editors who watch the “recent changes” page closely.
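
To make the shape of a suggestion concrete, here is a minimal sketch of the fields the form described above appears to collect, written as a Python dataclass purely for illustration; the field names and example values are my assumptions, not any published Grokipedia schema or API.

```python
from dataclasses import dataclass, field

# Illustrative only: these fields mirror the suggestion form described in the
# article (summary, optional content, optional sources), not xAI's actual schema.
@dataclass
class EditSuggestion:
    article: str                  # page the suggestion targets
    highlighted_text: str         # passage selected before clicking "Suggest Edit"
    summary: str                  # required summary of the proposed change
    suggested_content: str = ""   # optional replacement text
    sources: list[str] = field(default_factory=list)  # optional supporting links

# A hypothetical submission; Grok, not a human editor, decides whether to apply it.
suggestion = EditSuggestion(
    article="Elon Musk",
    highlighted_text="a claim linking the fall of Rome to low birth rates",
    summary="The veracity of this statement should be verified.",
)
```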

It’s not clear what changes Grok is actually making, though; the system is confusing and far from transparent. Grokipedia tells me there have been “22,319” approved edits so far, but I’ve no way of seeing what those edits were, which pages they happened on, or who suggested them. That contrasts with Wikipedia’s well-documented editing logs, which can be sorted by page, by user, or, in the case of anonymous users, by IP address. My hunch is that many of Grokipedia’s edits simply add internal links to other Grokipedia pages within articles, though I’ve no firm evidence beyond scrolling through a few of them.

The closest I got to seeing where edits were actually happening was on the homepage. There’s a small panel below the search bar displaying five or so recent updates on a rotation, though these only give the name of the article and say that an unspecified edit has been approved. Not exactly comprehensive. These are entirely at the mercy of whatever users feel like suggesting, leading to a confusing mix of stories. Elon Musk and religious pages were the only things that seemed to come up frequently when I looked, interspersed with things like the TV shows Friends and The Traitors UK and requests to note the potential medical benefits of camel urine.

On Wikipedia, there is a clear timeline of edits outlining what happened, who did what, and the reasons for doing so, with viewable discussion threads for contentious issues. There are also copious guidelines on editing style, sourcing requirements, and processes, and you can directly compare edited versions of an article to see exactly what changed and where. Grokipedia had no such guidelines, and it showed: many requests were a jumbled mess. It did have an editing log, but it was a nightmare that only hinted at transparency. The log, which shows only a timestamp, the suggestion, and Grok’s decision with its often-convoluted AI-generated reasoning, has to be scrolled through manually in a tiny pop-up at the side of the page, with no way to skip ahead or sort by time or type of edit. It’s frustrating even with only a few edits, and it doesn’t show where changes were actually implemented; with more edits, it would be completely unusable.
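
To make concrete what “directly compare edited versions” means, here is a minimal sketch using Python’s standard difflib module to produce the kind of before-and-after view Wikipedia’s history pages offer and Grokipedia’s log does not; the revision text is invented for illustration.

```python
import difflib

# Invented example revisions; a real revision history stores the full text of
# each version so any two can be compared like this.
old_revision = "Grokipedia launched in October with locked articles.\n"
new_revision = ("Grokipedia launched in October with locked articles.\n"
                "Version 0.2 lets anyone suggest edits.\n")

diff = difflib.unified_diff(
    old_revision.splitlines(keepends=True),
    new_revision.splitlines(keepends=True),
    fromfile="revision 1",
    tofile="revision 2",
)
print("".join(diff))  # shows exactly which lines were added, removed, or changed
```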

Unsurprisingly, Grok doesn’t seem to be the most consistent editor. It makes for confounding reading at times, and the edit logs betray the lack of clear guidelines for wannabe editors. For example, the editing log for Musk’s biographical page shows many suggestions about his daughter, Vivian, who is transgender. Some editors suggested using the name and pronouns matching her gender identity; others pushed for those assigned at birth. While it’s almost impossible to follow precisely what happened, Grok’s decision to edit incrementally left a confusing mix of both throughout the page.

As a chatbot, Grok is amenable to persuasion. For a suggested edit to Musk’s biographical page, a user suggested “the veracity of this statement should be verified,” referring to a quote about the fall of Rome being linked to low birth rates. In a reply far wordier than it needed to be, Grok rejected the suggestion as unnecessary. For a similar request with different phrasing, Grok reached the opposite conclusion, accepting the suggestion and adding the kind of information it previously said was unnecessary. It isn’t too taxing to imagine how one might game requests to ensure edits are accepted.

While this is all technically possible on Wikipedia, the site has a small army of volunteer administrators — selected after a review process or election — to keep things in check. They enforce standards by blocking accounts or IP addresses from editing and locking down pages in cases of page vandalism or edit wars. It’s not clear Grokipedia has anything in place to do the same, leaving it completely at the mercy of random people and a chatbot that once called itself MechaHitler. The issue showed itself on several pages related to World War II and Hitler, for example. I found repeated (rejected) requests to note the dictator was also a painter and that far fewer people had died in the Holocaust than actually did. The corresponding pages on Wikipedia were “protected,” meaning they could only be edited by certain accounts. There were also detailed logs explaining the decision to protect them. If the editing system — or site in general — were easier to navigate, I’m sure I’d find more examples.

Pages like these are obvious targets for abuse, and it’s no surprise they’re among the first hit by malicious editors. They won’t be the last, and with Grokipedia’s chaotic editing system and Grok’s limited guardrails, it may soon be hard to tell what’s vandalism and what isn’t. At this rate, Grokipedia doesn’t feel poised for the stars; it feels poised to collapse into a swamp of barely readable disinformation.
