I’ve tested more AI tools than I can count, but only these four open-source gems made the cut because they’re easy to use, private, and free.
LocalAI is an open-source app that lets you run large language models on your own hardware. Its library of over 1,100 models gives you a wide variety of options to run locally on your device.
The only downside is that it doesn’t have a dedicated desktop app. Instead, you install and run models via the command line. After installation, you can add new models by running the local-ai models install [model identifier] command, then launch any downloaded model with the local-ai run [model identifier] command.
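That workflow looks something like the sketch below. This assumes LocalAI's one-line installer and uses a hypothetical model identifier purely as an example; pick a real one from the model gallery.

```shell
# Install LocalAI using the project's install script
# (always review a script before piping it to sh)
curl https://localai.io/install.sh | sh

# Install a model from the gallery (the identifier here is an example)
local-ai models install llama-3.2-1b-instruct

# Run the model; LocalAI then serves a local web UI you can chat in
local-ai run llama-3.2-1b-instruct
```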
Once a model is running, LocalAI gives you a user-friendly web interface to interact with it. From the web UI, you can configure custom prompts, browse available models, and install them.
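Alongside the web UI, a running LocalAI instance also exposes an OpenAI-compatible API, so tools built for OpenAI's endpoints can point at your machine instead. A minimal sketch, assuming the default port 8080 and an example model name:

```shell
# Send a chat request to a running LocalAI instance
# (assumes the default port 8080; the model name is an example)
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama-3.2-1b-instruct",
    "messages": [
      {"role": "user", "content": "Summarize what LocalAI does in one sentence."}
    ]
  }'
```

Because the endpoint mirrors OpenAI's chat completions format, existing client libraries usually work by just changing the base URL.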
AnythingLLM is an open-source app designed to make working with large language models (LLMs) flexible and accessible. Because of its intuitive interface, you can download and run LLMs locally without any technical skills. It supports multiple models, including popular options from OpenAI, Anthropic, Mistral, and local deployments through Ollama or LM Studio.
A key feature of AnythingLLM is its workspaces. These let you upload, organize, and manage documents, then interact with the AI to get context-aware responses directly from your data. For example, you can upload a lecture PDF and ask the model to extract key points. This makes it especially useful for tasks like research, knowledge management, customer support, or any scenario where you need fast, relevant answers based on custom data. It also integrates with vector databases to ensure quick and accurate retrieval of relevant content.
Since it’s open-source, you can self-host AnythingLLM, keeping sensitive data completely under your control. This privacy focus and customizability make AnythingLLM one of my go-to open-source AI apps. I use the app on macOS, but you can also download it on Windows and Linux. The team is also working on a mobile app that will enable you to run LLMs on your smartphone.
If you’re looking for an open-source ChatGPT alternative, Jan.ai is the answer. It’s a privacy-first AI chat app that runs entirely on your local machine. Instead of relying on cloud servers, it uses local LLMs, meaning your conversations and data never leave your computer. This makes it an excellent choice for privacy-conscious users who want AI capabilities without compromising sensitive information.
The app offers a clean, distraction-free interface where you can chat with AI models much like you would with ChatGPT. However, since it’s powered locally, performance depends on your computer’s hardware: a stronger GPU and more RAM allow for faster, more complex responses. Jan.ai supports a variety of models, letting you choose the one that suits the task at hand.
When you install the app, it downloads a model so you can start chatting instantly. However, you can head over to the app’s Hub section to download any open-source model you want; it lists models from a range of providers, including Hugging Face, my go-to site for discovering new AI models.
Although Jan.ai is offline-first, it gives you the option to connect to commercial models that run in the cloud, such as those from OpenAI, Google (Gemini), Groq, and Anthropic. Similar to AnythingLLM, you can also upload files and get insights from them, which is handy for research and studying.
Ollama is another open-source AI app that lets you run LLMs on your hardware. You can pick a model from Ollama’s diverse model library that includes popular choices like Llama 3, Mistral, Gemma, DeepSeek-R1, Phi, Code Llama, and even the best offline LLMs.
For convenience, Ollama has a desktop app (available on Windows, macOS, and Linux), which makes it easy for anyone to use. The interface will feel familiar if you’ve used apps like ChatGPT, with your chat history on the left and the main chat area on the right.
You can select a model to use from the interface, and Ollama also gives you the option to search the web, a rare feature among open-source AI apps. The only downside is that you must be signed in to use web search.
From the interface, you can upload files via drag-and-drop and ask the models about them, as long as the selected model supports multimodal input. For large documents, you can increase the context length in the app’s settings to up to 128,000 tokens.
If you’re a developer, you can install Ollama and interact with it entirely from the command line, running models with the ollama run [model identifier] command.
Open-source AI apps like AnythingLLM, Jan.ai, LocalAI, and Ollama prove you don’t have to sacrifice privacy or control to harness the power of AI. These apps make it easy to run AI models locally on a wide range of hardware. Most importantly, they offer sleek interfaces that rival proprietary apps like Gemini, ChatGPT, and Copilot.