Following last week’s updated Intel LLM-Scaler-vLLM release for advancing vLLM usage on Intel Arc Graphics, a new LLM-Scaler-Omni release is out today. LLM-Scaler-Omni is the component of the LLM-Scaler environment focused on image / voice / video generation via its Omni Studio and Omni Serving modes.
LLM-Scaler-Omni 0.1.0-b5 is the new release, adding support for Python 3.12 and PyTorch 2.9 for delivering additional performance benefits. The updated LLM-Scaler-Omni brings a number of ComfyUI upgrades, including support for new models and workflows such as Qwen-Image-Layered, Qwen-Image-Edit-2511, Qwen-Image-2512, and HY-Motion. The ComfyUI upgrades also include ComfyUI-GGUF support for running GGUF models.
LLM-Scaler-Omni 0.1.0-b5 also brings SGLang Diffusion updates, including support for CacheDiT, Tensor Parallelism for multi-XPU inference, and a SGLang Diffusion (SGLD) ComfyUI custom node.
There are also updated code samples and other improvements with the Docker image of LLM-Scaler-Omni 0.1.0-b5. Downloads and more details on the new LLM-Scaler-Omni release for further advancing AI capabilities on Intel Arc Graphics "Battlemage" hardware are available via GitHub.
