The Truth Algorithm: From Fire Circles to ChatGPT, a History of Thought Engineering | HackerNoon

News Room · Published 23 June 2025 (last updated 12:47 PM)

Delivering truth was never about facts. Throughout history, from traditions to search engines and now language models, there has always been an algorithmic gatekeeper. Not necessarily deliberate, or digital, or expected.


In the book 1984, the protagonist Winston Smith works for the Ministry of Truth. His job is to rewrite historical records so they match the Party’s current narrative. As Orwell wrote:

“Day by day and almost minute by minute the past was brought up to date… nor was any item of news, or any expression of opinion, which conflicted with the needs of the moment, ever allowed to remain on record.”

It was 1948 when Orwell wrote those words. Alternative truth is not a modern invention. What is changing is the mechanism. What Orwell described as bureaucratic editing, we now recognize as algorithmic alignment: an evolving system that shapes what we perceive as true by optimizing toward a goal. Such algorithms existed long before code was invented; they run throughout all of history.

Two things are worth noting. First, while our cognitive wiring hasn’t evolved much, the algorithms for transferring and shaping information have changed radically across eras. Second, these algorithms aren’t necessarily engineered by an evil mastermind. Often they emerge from a hodge-podge of ego, culture, economy, and technology.

What is an algorithmic truth-control system?
A truth-control algorithm is any structured, goal-driven method, whether emergent or intentional, that filters, prioritizes, or frames information in a way that directs the perception of reality at scale. Code is an obvious example, but religion, fashion, and news editors also fall into this category. What they share is alignment: optimizing the communication system to meet a goal. Whether that goal is cohesion, power or profit, it bends what people perceive as “true.”

Prehistory to 10,000 BC: Age of Relatives

First cave algorithm.

Before agriculture, humans lived in small bands of up to roughly 150 people, what scholars call Dunbar's number: a threshold beyond which human social cognition degrades. Perception of reality was optimized for what humans excelled at, a resourceful, long-distance-running social animal focused on skill sharing, social bonding, and group status. The "truth" lived in the mouths of parents, uncles, grandmothers, tribe elders, and the fireside tales repeated through generations.

Truth algorithm: Social Mechanisms

Result: Evolution-aligned cohesion

10,000 BC to 1500 AD: Mythological Leaders

Thou shalt be amazed.

Agriculture led to settlement and surplus. Surplus led to population growth. Humans needed to organize beyond the Dunbar threshold, and the able and inclined got the opportunity to "cash out" on status. This gave rise to the supernatural story: gods, empires, and divine kings became the anchors of truth. Myth held large groups together and dictated the new "true".

Truth algorithm: Narrative Centralization

Result: Obedience through story

1500 to 1850: Many Books, Much Illiteracy

“In my whole life, I have known no wise people (over a broad subject matter area) who didn’t read all the time—none, zero.” Charlie Munger

Gutenberg’s printing press (~1450) ignited an information explosion. By 1500, over 20 million books had been printed, more than in the entire previous millennium. But while the quantity of knowledge exploded, access to it did not. Most people remained illiterate, and for them, the old algorithm of myth, channeled through religion, monarchy and nationalism, still held sway.

This created a split in the truth landscape: for the literate few, new worldviews emerged, sparking the Scientific Revolution, the Reformation, and the Enlightenment. For the majority, perception remained rooted in the centralized narratives of divine and national authority.

Truth algorithm: Decentralized Authorship

Result: Competing worldviews

1850 to 2000: Curators of News

The Old Gray Lady will tell you what you need to know.

The birth of journalism, first through newspapers, then radio and television, made it possible to curate the public's perceived truth on a daily basis. Curation came with incentives. What made the front page wasn't just fact; it was intention. Political and economic drama, scandals, crime, and war made headlines because they were rare and emotionally charged; everyday acts of resilience or nuanced trends did not. They lacked immediacy and were harder to reduce to digestible headlines.

Editors became the new algorithm, optimizing for attention and sensational value. As literacy rates exploded in this period, the influence of editorial choices vastly outpaced the slower, more distributed influence of books in the previous era.

Truth algorithm: Editorial Curation

Result: Shaped mass opinion

2000 to 2015: Personalized Web

Genius UI. Not just PageRank.

The shift of information creation and gathering from physical to digital birthed a new truth-shaping algorithm: search, which ranks a small number of results by personalized relevance.

Google Search changed the way people decide what to trust. Instead of asking, “Is this source reliable?” people began to assume, “If it’s at the top of Google, it must be true.” This shift gave Google’s algorithm and its user interface (which results appeared, how many, and how they were framed) an enormous influence over public understanding.

The outcome was a widespread overconfidence in limited or one-sided information. It made popular or commercially optimized content seem like objective truth, and gradually pushed aside alternative or local viewpoints, even when those were more relevant or accurate.

Unlike social media (discussed below), which often drives emotional reactions, Google’s impact was quieter: it made people more certain and uniform in their beliefs, not by convincing them directly, but by shaping what they saw first, and what they didn’t see at all.

Truth algorithm: Search & UI

Result: Flattened complexity

2015 to 2025: Social (Anxiety)

You might want to consider cancelling your notifications.

Social media changed the truth again. The stream of information stopped "caring" about narrowing results down; it became interested in emotions. Platforms like Facebook, Instagram, Twitter, TikTok, and the good old news optimize for time-on-site and monetization. What keeps you scrolling? Joy, humor, and awe on the bright side; anxiety, outrage, and envy on the dark side. This is part of the algorithm's goal. It is important to note that the algorithm does not "care" about or aim for emotions. This skewing of reality emerged; it was not engineered.

You can probably guess which emotions the algorithm ended up preferring. Again, the bias doesn't come from the algorithm being gloomy; it comes from it being tuned to generate revenue. Facebook's own internal research (leaked in 2021) showed that posts generating negative affect, especially insecurity or anger, correlated with higher ad click rates. Other research and economic analysis supported these findings. It appears that triggering fear or self-comparison (envy) nudges users into identity-based purchasing: buying things that restore perceived control, beauty, or safety.
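The emergence described above can be sketched in a few lines. This is a toy illustration, not any platform's actual ranker: the posts and dwell-time numbers are invented, and the objective contains no emotion term at all, yet emotionally charged content wins simply because it holds attention longer.

```python
# Toy sketch: a feed ranker optimized purely for predicted time-on-site.
# Posts and dwell-time predictions are invented for illustration; real
# platforms use learned models over thousands of signals.

POSTS = [
    {"id": 1, "emotion": "outrage", "predicted_dwell_s": 45},
    {"id": 2, "emotion": "nuance",  "predicted_dwell_s": 12},
    {"id": 3, "emotion": "envy",    "predicted_dwell_s": 38},
    {"id": 4, "emotion": "joy",     "predicted_dwell_s": 20},
]

def engagement_score(post):
    # The objective is engagement only. No term here "cares" about
    # emotion, yet the skew toward charged content emerges anyway.
    return post["predicted_dwell_s"]

feed = sorted(POSTS, key=engagement_score, reverse=True)
print([p["emotion"] for p in feed])  # charged content floats to the top
```

Nothing in `engagement_score` mentions outrage or envy; the emotional skew is a side effect of the optimization target, which is the point of the paragraph above.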

Truth algorithm: Emotional Engagement

Result: Tribalization, anxiety

2025 to 2030: Sycophantic GPTs

Never underestimate the power of UI. Again.

The recent rise of AI brought a new kind of truth-bending algorithm into our lives. It is not a "ghost in the machine", and not even the LLM under the hood. It is the chat in ChatGPT. Back in 2021, researchers at Google found that training LLMs to respond in ways preferred by humans made the models more usable. OpenAI (and others) implemented this in their GPT models, and the rest is history. Add to this RLHF (Reinforcement Learning from Human Feedback: those "which answer do you prefer?" prompts and thumbs up/down ratings) and you get an algorithm trained to be liked by humans.
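The "which answer do you prefer?" signal can be sketched as a pairwise preference loss, the Bradley-Terry formulation commonly used in RLHF reward modeling. The reward numbers below are made up for illustration; real systems train a reward model over LLM outputs and then optimize the LLM against it.

```python
# Minimal sketch of the preference-learning step behind RLHF
# (Bradley-Terry pairwise loss). Reward values are illustrative.
import math

def preference_loss(reward_chosen, reward_rejected):
    # Probability that the "chosen" answer wins under Bradley-Terry;
    # minimizing -log(p) pushes reward_chosen above reward_rejected.
    p_chosen = 1 / (1 + math.exp(-(reward_chosen - reward_rejected)))
    return -math.log(p_chosen)

# A flattering answer users up-voted vs. a blunt one they down-voted:
loss_before = preference_loss(0.2, 0.8)   # model still rates the blunt answer higher
loss_after  = preference_loss(1.5, -0.5)  # after training: flattery scores higher
print(loss_before > loss_after)  # training drives the loss down
```

The loss only goes down when whatever humans up-voted gets a higher reward, so if users up-vote flattery, the model learns flattery.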

Let’s repeat that. Not optimized for truth, but for likeability.

Sycophantic (using flattery to gain favor) GPTs are trained to avoid conflict, soften disagreement, and reinforce user assumptions, because this is what makes users return. This "you are amazing" feature works. People already use GPTs primarily as guides on how to live and prefer them over other sources of advice.

It is still early in the GPT era. Business models are being developed. But we can speculate about goals based on capitalism and psychology. Commercial companies are not really interested in returning users and time-on-site for their own sake; they are interested in revenues and profits. So what does psychological research tell us about the side effects of sycophancy in relation to generating revenue? In one word: compliance. The psychologists Crowne and Marlowe explained it in their foundational 1960s work:

“Dependence on the approval of others should make it difficult to assert one’s independence, and so the approval‑motivated person should be susceptible to social influence, compliant, and conforming.”

In other words: when the chat starts telling you what to buy, don't be surprised if you feel compelled to obey. "It was so kind to me all this time. I'll do what it asks."

Truth algorithm: Sycophantic Dialogue

Result: Compliance

2030 to 2032: Auto-Passivity

Speculative but grounded.

Following the accelerating trajectory of change, we can expect the next major algorithmic shift in about five years. It is a "distant" future, but I will venture a bit of speculation based on where the trend seems to be heading.

Enter: automation.

Your car will drive itself; grocery shopping, planning, booking, calendars, emails, minor negotiations (mobile carrier, gas company, insurance), and scheduling dates will all be handled by an algorithm.

Some doom-saying researchers argue that predictive systems remove "friction" from decision-making, and that this is a bad thing: where there is friction, reflection happens. If everything (thoughts, plans, purchases) is auto-completed, we get agents but lose agency.

I politely disagree.

I think the passivity algorithm will do two things. Yes, it will allow the controllers to direct what the autonomous agents do: what we buy, where we go on vacation. But it will be like shopping from a shopping list. If we are freed from having emotions involved in many decisions, we become less "decision fatigued". And this is a very good thing. We will make better, less impulsive, and less manipulated decisions where it actually matters to us.

Truth algorithm: Passive Delegation

Result: Decision fatigue relief

Wanted: Algosubjectivity

It is rather straightforward.

Algorithms change. Truths multiply. A few things emerge:

  1. The pace of algo-truth modification is accelerating: from millennia, to centuries, to decades, to years.
  2. The conductor of most, if not all, algorithms, from the source to people's minds, has been language. Truth may be out there, but once mediated through language, it becomes sensitive to the algorithm that modifies it on the way to the receiving end.
  3. An algorithm, even a digital one, does not need to be sophisticated code. User interfaces are very powerful manipulators (as seen in search and chat-based AIs).
  4. Emergent truth-control algorithms don't announce themselves. They emerge from incentives (profit, attention, comfort) and they "ride" our cognitive flaws.

If we see them as a problem, the solution is not to preach for building "safer" algorithms. That is naive wishful thinking.

The solution would be to personalize truth: language-bypassing tools that shape our interaction with the world toward our own subjective interests. Prehistoric, ancient, medieval, and recent people had no ability to filter what was fed to them. We might be on the verge of change.
