Copyright © All Rights Reserved. World of Software.

Here’s why I ditched ChatGPT and moved to local AI

By News Room | Published 16 December 2025 | Last updated 16 December 2025 at 11:09 PM

I was one of the first people to jump on the ChatGPT bandwagon. The convenience of having an all-knowing research assistant available at the tap of a button has its appeal, and for a long time, I didn’t care much about the ramifications of using AI. Fast forward to today, and it’s a whole different world. There’s no getting around the fact that you are feeding an immense amount of deeply personal information, from journal entries to sensitive work emails, into a black box owned by trillion-dollar corporations. What could go wrong?

There’s no putting AI back in the bottle, and in the right use cases it genuinely works as a productivity multiplier. That’s why I’ve been going down the rabbit hole of local AI. If you’re not familiar with the concept, it’s fairly simple: it is entirely possible to run a large language model, the brains behind a tool like ChatGPT, right on your computer or even a phone. Of course, it won’t be as capable or all-knowing as ChatGPT, but depending on your use case, it might still be effective enough. Better still, no data leaves your device, and there’s no monthly subscription fee to consider. And if you’re worried that pulling this off requires an engineering degree, think again. In 2025, running a local LLM is shockingly easy, with tools like LM Studio and Ollama making it as simple as installing an app. After spending the last few months running my own local AI, I can safely say I’m never going back to being purely cloud-dependent. Here’s why.

Privacy


We’ve all pasted something into ChatGPT that we probably shouldn’t have. Perhaps it was a code snippet at work that you were trying to make sense of. Or a copy of a contract, maybe some embargoed information, or even just a really personal journal entry that you don’t feel comfortable exposing to our corporate overlords. Every time you hit send on a cloud-based AI, what you typed is processed on an entirely opaque server that will inevitably put your data to use for the greater good of AI-kind.

Grammar checking

[Image: AnythingLLM grammar check]

I write for a living, and as my managing editors would attest, a grammar check is a necessary evil for the job. But tools like Grammarly are, by definition, keyloggers. To get anything done, they have to read and process whatever you write. That might be fine for a quick email, but I’m not comfortable using it when working with privileged information. Moreover, to get proper utility out of these tools, you need to sign up for a subscription. The cons definitely outweigh the pros.

So my solution has been to run one of the many offline LLMs as a high-powered spellchecker. For example, you can install LM Studio with a model like Llama 3 or Mistral and give it a detailed system prompt telling it to fix grammar and spelling without changing the tone or phrasing. That’s all it takes. I can paste entire articles into the local LLM and get instant feedback and corrections on my writing. You can even feed it specific style rules, like, say, a preference for or against Oxford commas, and the local LLM will follow them. Effectively, you can trust a local LLM to work as a second set of eyes without any fear of information leaking out to the broader web.
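To make this concrete, LM Studio can serve whatever model you have loaded through an OpenAI-compatible API on localhost (port 1234 by default). Here is a minimal sketch of what a grammar-check call might look like; the model name and the exact system prompt are my own assumptions, not a fixed recipe:

```python
import json
from urllib import request

LM_STUDIO_URL = "http://localhost:1234/v1/chat/completions"  # LM Studio's default local server

SYSTEM_PROMPT = (
    "Fix grammar and spelling only. Do not change the tone, "
    "phrasing, or meaning. Return the corrected text and nothing else."
)

def build_grammar_request(text: str, model: str = "llama-3-8b-instruct") -> dict:
    """Build an OpenAI-style chat payload asking the local model to proofread `text`."""
    return {
        "model": model,  # model identifier as shown in LM Studio (assumption)
        "messages": [
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": text},
        ],
        "temperature": 0,  # deterministic edits, no creative rewording
    }

def proofread(text: str) -> str:
    """POST the payload to the local server and return the corrected text."""
    req = request.Request(
        LM_STUDIO_URL,
        data=json.dumps(build_grammar_request(text)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Nothing here leaves the machine: the request goes to localhost, where LM Studio is running the model.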

Coding

[Image: AnythingLLM code checking]

Much has been said about the spaghetti code generated by LLMs. However, it is hard to deny that it can be useful for generating boilerplate code to give you a head start. Look, I’m a fairly rusty engineer fifteen years out of university who still likes to dabble in code once in a while over the weekend. For someone like me, AI has been incredible at making sense of code snippets from open-source apps. Turns out, I don’t really need Copilot or Claude for this. For basic stuff, I can quite literally just dump a code fragment into Ollama and ask the AI to explain the logic to me. When it comes to writing code, I’ve dabbled in using extensions for VS Code that can plug Ollama right into the code editor.

Just download a coding-specific model, say, DeepSeek Coder. The LLM then offers autocomplete suggestions, refactoring help, and explanations for bugs when things inevitably fail to compile. And it does all of this within the IDE itself, without phoning home. It’s the GitHub Copilot experience, but running entirely on my computer. For developers, this means being able to work on private projects or client code with absolute certainty that no intellectual property is leaking to a public-facing LLM. Plus, you might just notice an improved experience when working with a developer-focused LLM versus a general-purpose model.
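For the "explain this snippet" workflow, Ollama exposes a small REST API on localhost (port 11434 by default). A minimal sketch follows; the model name and prompt wording are assumptions you would tune to taste:

```python
import json
from urllib import request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_explain_request(code: str, model: str = "deepseek-coder") -> dict:
    """Build an Ollama /api/generate payload asking the model to walk through `code`."""
    return {
        "model": model,  # any coding model you have pulled locally (assumption)
        "prompt": f"Explain what this code does, step by step:\n\n{code}",
        "stream": False,  # return one complete response instead of a token stream
    }

def explain(code: str) -> str:
    """Send the request to the local Ollama server and return the explanation."""
    req = request.Request(
        OLLAMA_URL,
        data=json.dumps(build_explain_request(code)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)["response"]
```

The VS Code extensions mentioned above are doing essentially this under the hood, just wired into the editor.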

Uncensored creativity

[Image: ChatGPT censorship]

Here’s where things get interesting. Public-facing AI models are built with guardrails in mind. Anything deemed too extreme or potentially dangerous is effectively blocked off from access. This can be problematic if you’re a naturally curious person, or if you want to do some research that goes a step beyond what ChatGPT can offer. In my case, one of my weekend hobbies is to write gritty thriller short stories, and when I want to do some research for a thriller or horror arc, ChatGPT just doesn’t go all that far.

However, if you are running a local LLM, you can download an uncensored model that is designed to follow instructions without moralizing. This unlocks creative freedom, letting you develop characters with whatever kind of personality you are aiming for. The same freedom is also useful if you dabble in board games and want to generate a gruesome new D&D campaign.

It works when ChatGPT won’t

[Image: Ollama models list]

I travel a lot and prefer to have most of my commonly used services be available offline. Predictably, reliance on cloud AI can become a major pain point the moment you step on a plane or visit a remote location with spotty internet. Even if you have WiFi on a plane, the latency can often lead to connection dropouts mid-query. It’s not a very good user experience.

My local AI setup, on the other hand, travels with me. I’ve got Ollama set up on my laptop, and I can spin it up whenever I need it. As geeky as that might sound, it’s been a revelation for productivity, as I can get pretty much all the things I use ChatGPT for running on my computer at 36,000 feet in the air. If you want to take it a step further, you could even use an app like SmolChat to run a smaller language model on your Android phone and use it for simpler tasks. There’s a lot of flexibility here.

Subscription fatigue

[Image: DeepSeek LM Studio setup]

In an era where everything from media consumption to reading the news is becoming a subscription, it should come as no surprise that getting a decent AI experience is also, you guessed it, locked behind a subscription. I understand that building and running an AI model is extremely expensive. However, no single model is good at every task, and if I start running the numbers, that $20-a-month fee for every LLM I sign up for starts looking very spendy very quickly.

Local AI is the buy-it-for-life alternative. The only real cost is the hardware that you run it on. If you’re on a PC, you’ll want a beefy graphics card. On the Apple side of things, even a MacBook Air is able to run reasonably sized models at a decent clip. The open-source models themselves are free. In fact, if you have a recent computer, you likely already have all the hardware you need to get your own local AI instance up and running. Moreover, while you have to wait for OpenAI to release its yearly upgrade to ChatGPT, you can test out new models and variations practically weekly via Hugging Face.
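If you want a quick sanity check on whether your hardware qualifies, a model’s memory footprint is roughly its parameter count times the bytes per weight, plus some runtime overhead for the KV cache and framework. A back-of-the-envelope sketch (the 20% overhead factor is a rough assumption, not a benchmark):

```python
def approx_model_memory_gb(params_billion: float, bits_per_weight: int,
                           overhead: float = 1.2) -> float:
    """Rough memory needed to load a model: params * bits/8, plus runtime overhead."""
    bytes_needed = params_billion * 1e9 * bits_per_weight / 8
    return round(bytes_needed * overhead / 1e9, 1)

# An 8B model quantized to 4 bits fits in roughly 5 GB of RAM/VRAM,
# while the same model at 16-bit precision needs around 19 GB.
print(approx_model_memory_gb(8, 4))   # 4.8
print(approx_model_memory_gb(8, 16))  # 19.2
```

This is why quantized 7B-8B models are the sweet spot for ordinary laptops, and why a 70B model stays out of reach without a workstation-class GPU or a high-memory Mac.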

Speed and latency

[Image: LM Studio DeepSeek custom Google Sheet]

While your computer is obviously not going to compete with the massively overpowered data centers running ChatGPT, depending on how you use these LLMs, you might find that the gap is smaller than you’d imagine. Running locally on reasonably powered hardware, things like text generation and summarisation can often happen just as quickly as with a commercial tool, and sometimes quicker. You also don’t have to contend with network latency or server load. If you’ve used ChatGPT with any frequency, you’ve inevitably come across a spinning processing wheel, or sometimes even the dreaded connection timeout. Neither is an issue when the AI model runs on your own computer.
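If you want to put a number on "fast enough", the metric that matters is tokens per second: count the tokens your local model generates and divide by wall-clock time. A trivial sketch, with hypothetical numbers:

```python
import time  # used in the commented timing sketch below

def tokens_per_second(token_count: int, elapsed_seconds: float) -> float:
    """Generation throughput: tokens produced divided by wall-clock time."""
    return round(token_count / elapsed_seconds, 1)

# To measure your own setup, time a real generation call, e.g.:
#   start = time.perf_counter()
#   reply = generate(prompt)  # hypothetical call into your Ollama/LM Studio client
#   rate = tokens_per_second(count_tokens(reply), time.perf_counter() - start)

# With hypothetical numbers: 512 tokens in 18 seconds is about 28 tok/s,
# comfortably faster than most people read.
print(tokens_per_second(512, 18))  # 28.4
```

Anything in the double digits feels conversational in practice; below that, you notice the wait.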

Is running a local AI model for everyone?

Look, I’m not gonna sugarcoat it. Local AI isn’t perfect, nor is it for everyone. If you are using Claude to solve complex mathematical problems, or want the absolute state-of-the-art reasoning capabilities, a massive cloud model like GPT-5.1 or Claude is going to be your best bet. Those models simply have a far larger parameter count than anything that can fit on, let alone run on, your computer. But you don’t necessarily need that power on a day-to-day basis. If my experience is anything to go by, 90% of my daily AI tasks, like summarisation, editing, code explanation, and brainstorming, work perfectly fine with local models.

Is it a nerdy activity? Yes. Is it fun to dabble in? Also yes. It’s a bit like the early wild, wild west days of the internet, except entirely in your control.
