Why Wall Street’s AI Bet May Be Dead Wrong

News Room | Published 27 July 2025 | Last updated 11:19 AM

Everyone’s watching the wrong AI boom.

While Wall Street and Silicon Valley obsess over when ChatGPT-5 will launch—or how many exaflops xAI is hoarding—they’re missing the real earthquake.

It’s already rumbling beneath the surface. And it’s about to crack the foundations of the AI world and reorder the entire semiconductor supply chain.

That quake? The silent, seismic shift from Large Language Models (LLMs) to Small Language Models (SLMs).

And it’s not theoretical. It’s happening now.

AI is leaving the cloud. Crawling off the server racks. And stepping into the physical world.

Welcome to the Age of Physical AI

If the past five years of AI were about massive brains in the cloud that could pass the bar exam and write poetry, the next five will be about billions of tiny, embedded brains powering real-world machines.

Cleaning your house. Running your car. Cooking your dinner. Whispering insights through your glasses.

This is AI going physical. And when that happens, everything changes.

Because physical AI can’t rely on 500-watt datacenter GPUs.

It can’t wait 300 milliseconds for a round trip to a hyperscaler.

It needs to be:

  • Always on
  • Instantaneous
  • Battery-powered
  • Offline-capable
  • Private
  • Cheap

And that means it can’t run GPT-4.

It needs SLMs—compact, fine-tuned, hyper-efficient models built for mobile-class hardware.

SLMs aren’t backup singers to LLMs.

In the world of edge AI, they’re the headliners.

The new AI revolution won’t be televised.

It’ll be embedded. Everywhere.

The SLM Invasion Has Already Begun

You may not have noticed it yet—because the companies deploying small language models are not bragging about billions of parameters or trillion-token training sets.

They are shipping products.

Apple’s (AAPL) upgraded Siri? Runs on an on-device SLM.

Meta’s (META) Orion smart glasses? Powered by locally deployed SLMs.

Tesla’s (TSLA) Optimus robot? Almost certainly driven by an ensemble of SLMs trained on narrow tasks like folding laundry and opening doors.

This is not a niche trend.

It is the beginning of the great decentralization of artificial intelligence—from monolithic, cloud-based compute models to lightweight, distributed intelligence at the edge.

If large language models were the mainframe era of AI, SLMs are the smartphone revolution. And just like in 2007, most incumbents do not see the freight train coming.

To be clear: LLMs are remarkable, but they do not scale down to billions of edge devices.

You cannot put a 70-billion-parameter model in a toaster. You cannot run GPT-4 on a drone.

SLMs, by contrast, are purpose-built for the edge. They:

  • Operate at sub-100 millisecond latency on mobile-class chips
  • Fit into just a few gigabytes of RAM (see the sizing sketch after this list)
  • Deliver reliable performance for 90% of AI agent tasks (instruction following, tool use, commonsense reasoning)
  • Can be fine-tuned at low cost for narrow applications
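
To put rough numbers on the memory claim above, here is a minimal back-of-envelope sketch in Python. It estimates the RAM needed for model weights alone (ignoring activations and KV cache); the parameter counts and quantization levels are illustrative assumptions, not figures from this article.

```python
# Back-of-envelope sizing for language model weights in memory.
# The model sizes and bit widths below are illustrative assumptions.

def weight_memory_gb(params_billions: float, bits_per_weight: int) -> float:
    """Approximate RAM for weights alone (ignores KV cache and activations)."""
    bytes_total = params_billions * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9  # decimal gigabytes

models = {
    "70B LLM, fp16":   (70, 16),   # datacenter-class model
    "7B SLM, fp16":    (7, 16),
    "3B SLM, 4-bit":   (3, 4),     # quantized for edge hardware
    "0.5B SLM, 4-bit": (0.5, 4),
}

for name, (params, bits) in models.items():
    print(f"{name:16s} ~{weight_memory_gb(params, bits):6.1f} GB")
```

By that arithmetic, a 70-billion-parameter model at 16-bit precision needs on the order of 140 GB just for its weights, while a 4-bit, 3-billion-parameter SLM fits in roughly 1.5 GB, well within a phone-class memory budget.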

They are not omniscient.

They are the blue-collar AI that gets the job done.

And in a world that needs AI agents in cars, robots, glasses, appliances, manufacturing lines, kiosks, and wearables—reliability and cost will beat generality and elegance every single time.

The Investment Implications: GPU Utopia Cracks

Now here is where it gets interesting.

For the past two years, the core AI investment thesis has been simple:

“Buy Nvidia (NVDA) and anything tied to GPUs—because large language models are eating the world.”

That thesis held up. Until now.

If small language models begin to dominate AI deployment, the model starts to break down.

Why?

Because SLMs do not require data centers. They do not need $30,000 accelerators. They do not consume 50 megawatts of cooling. They do not even rely on OpenAI’s API.

All they need is efficient edge compute, a battery, and a purpose.

And that changes everything.

The center of gravity in AI shifts—from cloud-based GPUs and training infrastructure to edge silicon, local inference, and deployment tooling.

This does not mean Nvidia loses.

It means the next trillion dollars in value could accrue somewhere else.

The New Infrastructure Stack for Physical AI

Let’s get specific. The LLM world runs on one kind of infrastructure. The SLM world needs a completely different stack.

Critically, SLMs are inexpensive to replicate and do not require constant API calls to function.

That is a direct threat to the rent-seeking software-as-a-service (SaaS) AI model—but a powerful tailwind for device original equipment manufacturers (OEMs) and edge compute firms.
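
To make the "no constant API calls" point concrete, here is a minimal sketch of running a small instruction-tuned model entirely on local hardware with the open-source Hugging Face transformers library. The specific model name is an assumption for illustration; any similarly sized SLM already downloaded to the device would do, and no network request is made at inference time.

```python
# Minimal sketch: local SLM inference with no network round trip.
# Assumes transformers is installed and the (illustrative) model
# "Qwen/Qwen2.5-0.5B-Instruct" has been downloaded once in advance.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="Qwen/Qwen2.5-0.5B-Instruct",  # assumed example; swap in any small model
    device_map="auto",                    # CPU, integrated GPU, or NPU-backed runtime
)

prompt = "Summarize in one sentence: fold the towel and place it on the shelf."
result = generator(prompt, max_new_tokens=40, do_sample=False)
print(result[0]["generated_text"])
```

Once the weights are on the device, every call like this one runs locally, which is exactly the property that undercuts per-query API pricing and favors the device makers shipping the silicon.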

And you can already start to see how this tectonic shift may play out across public markets.

Qualcomm (QCOM) looks like a major winner. Its Snapdragon AI platform already runs many SLMs. It is the ARM of the edge AI world.

Lattice Semiconductor (LSCC) could also benefit. The company produces tiny FPGAs—ideal for AI logic in low-power robots and embedded sensors.

Ambarella (AMBA) is another potential standout, with its AI vision SoCs used in robotics, surveillance, and autonomous vehicles.

Among the Magnificent Seven, Apple (AAPL) appears especially well positioned. Its Neural Engine may be the most widely deployed small AI chip on the planet.

Vicor (VICR) also deserves mention. It produces power modules optimized for tight thermal and power envelopes—key to edge AI systems.

On the other side of the ledger, several beloved AI winners could find themselves on the wrong side of this transition.

Super Micro (SMCI) may be vulnerable if inference shifts away from data centers and server demand softens.

Arista Networks (ANET) could face pressure as data center networking becomes less critical.

Vertiv (VRT) might see growth flatten if hyperscale HVAC demand slows.

Generac (GNRC) may be exposed to declining demand for backup power if the SLM trend reduces reliance on centralized compute.

This is how paradigm shifts happen.

Not overnight—but faster than most incumbents expect. And with billions in capital rotation along the way.

Build a Portfolio for the SLM Age

If you believe AI is moving from “text prediction in the cloud” to physical intelligence in the world, then your portfolio needs to reflect that.

Instead of chasing the same three AI megacaps everyone owns, focus on:

  • Edge chipmakers
  • Embedded inference specialists
  • Optics and sensing providers
  • Power management innovators
  • Robotics component suppliers

The mega-cap GPU trade isn’t dead. But it’s not the only game in town anymore.

Final Word

The reason this story isn’t everywhere yet is simple: it does not sound flashy.

“Small models” do not make headlines. But they are what will drive profits—because they are what will scale artificial intelligence to a trillion devices and embed it into the everyday fabric of human life.

The next wave of AI will not be about a single god-model ruling from the cloud.

It will be powered by millions of specialized, local models—each performing narrow tasks quietly, reliably, and efficiently.

That is where the growth is. That is where the infrastructure buildout is headed. And that is where investors should be positioning now—before Wall Street catches up.

So, where does this all lead?

Straight into what renowned futurist Eric Fry calls the Age of Chaos—a high-stakes era defined by converging disruptions in tech, geopolitics, and the economy that could make—or break—fortunes.

Fry is no stranger to this kind of moment. He has picked more than 40 stocks that went on to soar 1,000% or more, successfully navigating both bull and bear cycles.

Now he is back with what may be his most urgent call of the decade.

Fry just released his “Sell This, Buy That” blueprint for navigating today’s AI-fueled mania. Inside, he names seven tickers—four he says to sell immediately, and three he believes could deliver life-changing upside:

  • A little-known robotics firm whose revenue has surged 15x since 2019—and one Fry says could outmaneuver Tesla in the race for physical AI
  • A stealth e-commerce company he believes could become the next Amazon—with 700% growth potential
  • A safer AI play that could outperform Nvidia (NVDA) while protecting capital

He is giving away all three names and the research behind them—free.

Click here to access Eric Fry’s “Sell This, Buy That” trades for the Age of Chaos.

The AI boom is real. Earnings are soaring. But not all tech stocks are built to survive the next phase.

Fry believes the next 12 to 24 months could be the most volatile of our lifetimes.

And this may be your best chance to get positioned before the new economic order takes shape.

On the date of publication, Luke Lango did not have (either directly or indirectly) any positions in the securities mentioned in this article.

P.S. You can stay up to speed with Luke’s latest market analysis by reading our Daily Notes! Check out the latest issue on your Innovation Investor or Early Stage Investor subscriber site. Questions or comments about this issue? Drop us a line at [email protected].
