Newelle, a virtual AI assistant for the GNOME desktop with API integration for Google Gemini, OpenAI, Groq, and local LLMs, is out with a new release. Newelle has been steadily expanding its AI integration and capabilities, and the new Newelle 1.2 brings yet more capabilities for those wanting AI on the GNOME desktop.
The Newelle 1.2 release introduces Llama.cpp integration along with support for its different back-ends, from CPU execution to device-specific GPU back-ends and, notably, Vulkan. There is also a new model library in place for Ollama and Llama.cpp usage.
For those wanting AI integration with documents on their file-system, there is a new hybrid search feature that better handles document reading.
Sure to be controversial as well is a new command execution tool for those trusting AI to run commands on their local system.
Some of the other work in this release includes tool groups, improved MCP server handling, a semantic memory handler, chat import/export, and other features.
More details on this updated optional AI assistant for GNOME can be found via This Week in GNOME. Newelle 1.2 can be downloaded from Flathub.
