
I built a local coding AI for VS Code and it’s shockingly good

By News Room · Published 25 September 2025 · Last updated 25 September 2025, 1:26 PM

AI models have made coding a far easier job, especially for beginners. Previously, you might have had to spend days or even weeks getting to grips with a programming language, let alone write functional code. Now, an entirely functional app is a single prompt away.

However, most online coding assistants are locked behind subscriptions, and their free tiers run out quickly. There are free AI tools that can save you money on subscriptions, but when it comes to coding, nothing beats a local AI coding assistant.

Local beats the cloud for coding AI

No latency, no limits, just pure coding speed

There are obvious benefits to using a local coding AI compared to online options like ChatGPT or GitHub Copilot. For starters, you don’t have to pay a single cent in subscription costs. Pretty much every AI tool offers some sort of free access, but if you’re serious about coding, you’ll exhaust those free limits rather quickly.


In most cases, the free plan also sticks you with an inferior model compared to what a paid subscription would get you. So not only is the amount of help you can get limited, the quality isn’t as good as it could be, either.

An experienced programmer who knows what they’re working with can perhaps get away with this, assuming they only use the AI to figure out parts of their code. However, if you’re learning how to code or building an app from scratch, you’re going to need the best help you can get, and a lot of it.

A local AI is also great for privacy. Online AI tools will almost always use your code to train their models, which is why many of them are banned in professional environments. A local AI, on the other hand, runs on your machine, is available around the clock, and never sends your code or queries to a server for processing.

[Image: Local AI model running on VS Code. Credit: Yadullah Abidi / MakeUseOf]

As you’ll soon discover, local models also aren’t very hard to set up. The only catch is that you’re limited by your hardware. You don’t need a top-of-the-line rig with a GPU that consumes more power than your house to run most of the open-source models available for free, but good hardware does help.

Essentially, the more parameters a model has, the more RAM, storage, and VRAM you need. There are smaller and more heavily quantized models that can run on weaker hardware as well, but the response and code quality might not be as good as you’d like.
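As a rough rule of thumb, a model’s weights need about (parameters × bits per parameter ÷ 8) bytes of memory, plus some headroom for the context window and runtime buffers. A quick back-of-the-envelope sketch (the 20% overhead factor here is my own ballpark assumption, not a precise figure):

```python
def estimate_model_memory_gb(params_billions: float,
                             bits_per_param: int = 4,
                             overhead: float = 1.2) -> float:
    """Rough memory footprint of a model's weights in GB.

    bits_per_param: 16 for full-precision fp16 weights, 4 for a
    typical Q4 quantization. overhead covers the KV cache and
    runtime buffers (a rough guess, not an exact number).
    """
    weight_gb = params_billions * (bits_per_param / 8)  # GB of raw weights
    return round(weight_gb * overhead, 1)

# A 7B model quantized to 4 bits fits comfortably in 8 GB of memory:
print(estimate_model_memory_gb(7, bits_per_param=4))   # → 4.2
# The same model at fp16 needs far more:
print(estimate_model_memory_gb(7, bits_per_param=16))  # → 16.8
```

This is why a quantized 7B or 8B model is a realistic target for a mid-range laptop, while full-precision weights of the same model are not.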

How I built my own AI coding sidekick

You don’t need a data center to make this work

The first step is to get an AI model running locally on your system. There are plenty of apps for running a local LLM, such as Ollama and LM Studio. I recommend LM Studio, as it has a graphical interface you can use to search for and download models, configure them, and even chat with a model, with support for file uploads.

Next is the AI model you’ll be running. LM Studio lets you download open-source AI models from Hugging Face. You can download models like DeepSeek R1, Qwen, gpt-oss, and more. I use DeepSeek, but I recommend experimenting with different models based on your specific requirements and PC hardware.

Installing and setting up LM Studio with an AI model is a rather easy process. Download LM Studio from the official website and run the installer. Follow these steps after running LM Studio for the first time:

  1. You may be prompted to go through a setup wizard. This can be skipped by clicking the grey Skip button on the top-right.
  2. Once the main interface loads up, LM Studio should automatically start downloading any drivers or updates it needs. Wait for these to finish before proceeding.
  3. Click the magnifying glass icon to open the Discover tab and search for the model you want to download. Click the green Download button at the bottom left to proceed.
  4. Once the model is done downloading, head over to the Chat section and click the dropdown at the top of the display to load your downloaded model.

At this point, you should be able to chat with your model. If everything works as expected, you can now start a local server for the model to make it accessible to other programs on your PC.

  1. Head over to the Developer section. Make sure your downloaded model is loaded. You’ll see a green READY tab if it is. If it isn’t, select the model from the drop-down menu at the top.
  2. Head over to the Load tab on the right and set a context length. I use 15,000, but your number can vary based on total available memory and requirements. Reload the model to apply changes.
  3. Enable the Status slider at the top to start the local server.

If everything goes well, you’re now hosting an accessible instance of the AI model, which can be used by other applications on your PC as well as devices on your local network.
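LM Studio’s local server speaks an OpenAI-compatible REST API, by default at http://localhost:1234/v1, which is what lets other tools plug into it. As a sketch of what that integration looks like under the hood, here’s a minimal Python client using only the standard library (the port is LM Studio’s default and the `"local-model"` name is a placeholder; LM Studio routes requests to whichever model is loaded):

```python
import json
import urllib.request

BASE_URL = "http://localhost:1234/v1"  # LM Studio's default server address


def build_chat_request(prompt: str, temperature: float = 0.2) -> dict:
    """Build an OpenAI-style chat-completion payload for the local server."""
    return {
        "model": "local-model",  # placeholder; the loaded model answers
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }


def ask_local_model(prompt: str) -> str:
    """Send the prompt to the local server and return the reply text."""
    req = urllib.request.Request(
        BASE_URL + "/chat/completions",
        data=json.dumps(build_chat_request(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read())
    return body["choices"][0]["message"]["content"]
```

With the server running, `ask_local_model("Reverse a string in Python")` returns the model’s reply as plain text, and because the API mirrors OpenAI’s, most tools that accept a custom base URL can point at it directly.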

AI, meet VS Code

Feels like Copilot, but it’s 100% yours

Once you’ve got LM Studio running with your model of choice, it’s time to integrate your AI model into VS Code. Follow these steps:

  1. Open VS Code and install the Continue extension. You can search for it in the Extension Marketplace within VS Code.
  2. Click the Continue icon in the VS Code sidebar and click the settings gear icon for the extension.
  3. Head over to the Models settings and click the plus icon in the top right to add a model.
  4. Select LM Studio from the Provider drop-down. The Model drop-down should be set to Autodetect. Click Connect to proceed.
  5. You’ll be redirected to the Models settings page. Under the Chat model drop-down, you should see your downloaded model.

And that’s it: you should now be able to access your downloaded model from within VS Code. (As with any extension, watch out for malicious lookalikes in the marketplace and check before downloading to ensure you’re getting the right one.) Continue has various modes that let you talk to the AI model in different contexts, so feel free to play around to find what works best for you. In most cases, Agent mode will get you the best results, and there’s a handy tutorial built into the extension that’ll help you get to grips with all the features.
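Continue also persists these settings in a configuration file, so you can edit them by hand if you prefer. A minimal entry might look roughly like this (a sketch based on Continue’s YAML config format; the name, `AUTODETECT` value, and `apiBase` are assumptions that should be checked against Continue’s own documentation and your LM Studio setup):

```yaml
# ~/.continue/config.yaml — hypothetical minimal LM Studio setup
models:
  - name: Local LM Studio model
    provider: lmstudio
    model: AUTODETECT                  # pick up whatever model is loaded
    apiBase: http://localhost:1234/v1  # LM Studio's default server address
```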

Performance largely depends on the AI model that you choose and your PC’s hardware. On my HP Omen Transcend 14 with a Core Ultra 7 155H, 16GB RAM, and an RTX 4060, most queries take less than 30 seconds to process. For larger queries or files, the response time can go into minutes. The generated code has been quite accurate, and I can mostly get it to work within a couple of prompts.

Whether this performance beats an online coding tool worth $20 a month or more is up to you to decide. For me, reasonably accurate code generation combined with privacy is what matters most. Building your own local coding AI also gives you the freedom to test multiple models and find what works best for you, without ever spending a dime.
