OpenAI’s ChatGPT has several rivals, and one of the most prominent is Elon Musk’s Grok. Recently, Musk’s AI company launched Grokipedia, a Wikipedia alternative that nobody really asked for. But now it seems its competitor’s chatbot is referring to it, perhaps even more than users are.
According to a recent report by The Guardian, the latest model of ChatGPT has begun citing Grokipedia as a source on various topics. The report claims that GPT-5.2 cited Grokipedia nine times across different queries, on subjects including the political structure of Iran, the ownership of the Mostazafan Foundation, and a British historian. The issue is not just the citations themselves, but the misinformation that ChatGPT may pass on to users as a result.
Why is it concerning?
Grokipedia, launched by Elon Musk’s xAI in October, is an AI-generated encyclopedia, and that is the heart of the problem: unlike Wikipedia, it cannot be edited by users. Since launch, it has faced criticism for pushing questionable narratives on topics such as gay marriage and the January 6 US Capitol attack. To be fair, ChatGPT did not directly quote Grokipedia when users tried to push obvious misinformation, and on high-profile topics where Grokipedia has already been criticised, GPT-5.2 avoided citing it.
On other subjects, however, it did. The Guardian noted that ChatGPT repeated stronger claims about Iranian government links to the telecom firm MTN-Irancell than those found on Wikipedia, with the claims sourced from Grokipedia. In another case, it cited information about Sir Richard Evans’ role in the David Irving libel trial that The Guardian had previously debunked. This is exactly the kind of scenario disinformation researchers worry about.
Experts call this process “LLM grooming”: misleading or poorly sourced content is pushed online at scale in the hope that it quietly trains or influences large language models over time. Even if the contamination is unintentional, once such material enters AI responses it becomes extremely difficult to remove.
