Docker Inc. today announced the launch of major new capabilities designed to make it dramatically easier to build, run and deploy agentic AI applications.
The company, which eases the building, testing and deployment of applications by packaging them in lightweight, portable software containers, said it's extending its Docker Compose tool to support AI agents and AI models so that developers can deploy them at scale with ease. The company is also introducing Docker Offload, which lets developers run AI models and GPU-heavy workloads in the cloud, and is collaborating with integration partners such as Google Cloud, Microsoft Azure and numerous AI software development kit providers.
“Agentic applications are rapidly evolving, but building production-grade agentic systems is still too hard,” said Tushar Jain, executive vice president of engineering at Docker. “We’re now making agentic apps accessible to every developer by making agent-based development as easy, secure, and repeatable as container-based app development has always been.”
Agentic AI is part of a new wave of AI software that uses large language models to power tools that act autonomously and achieve complex goals with minimal human oversight. Unlike traditional AI chatbots, which rely on direct interaction such as question and answer, AI agents can make decisions, plan actions and adapt to changing circumstances to work through objectives step by step.
Docker Compose has been one of Docker’s go-to tools for developers running applications that span multiple containers — standardized software packages that include everything needed to run an application, including code, runtime, system tools, libraries and configuration.
The company said it’s extending Compose to address the challenges of the agentic era by letting developers define agentic architectures, together with the AI models and tools they depend on, and take them into production. Agents, models and tools can all be declared in a single Compose file, and developers can run agentic workloads locally or deploy them seamlessly to cloud services.
Compose also lets developers connect securely to Docker’s Model Context Protocol Gateway, which handles communication with and discovery of other AI tools and data services. The protocol enables developers to integrate large language models and AI applications with data services without rewriting code or building complex application interfaces.
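Docker hasn’t published the exact file described above, but a single Compose file that declares an agent service, a model and an MCP gateway could look roughly like the following sketch. The service names, the ai/gemma3 model reference, the docker/mcp-gateway image and its options are illustrative assumptions rather than confirmed details from the announcement.

```yaml
# compose.yaml — a minimal sketch, assuming Compose's new top-level "models"
# element and Docker's MCP Gateway image; names, tags and flags are illustrative.
services:
  agent:
    build: .                                    # the agent application itself (hypothetical)
    models:
      - llm                                     # attach the model declared below
    depends_on:
      - mcp-gateway
    environment:
      MCP_GATEWAY_URL: http://mcp-gateway:8811  # assumed variable the agent reads to find its tools

  mcp-gateway:
    image: docker/mcp-gateway                   # Docker's MCP Gateway (assumed image reference)
    command: --servers=duckduckgo               # expose a single MCP tool server (assumed flag)

models:
  llm:
    model: ai/gemma3                            # a model from Docker Hub's ai/ namespace (assumed)
```

With a file along these lines, running `docker compose up` would start the agent, its model and its tool gateway together, and the same definition could then be handed to a cloud deployment target.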
“Expanding Docker Compose to give developers the same familiar, simple experience for AI deployments as they have for traditional apps is exactly what we need,” said Torsten Volk, principal analyst at Enterprise Strategy Group. “Plus, the new capability to run AI models directly in the cloud — without clogging up your laptop — is another major step forward. This should make a real difference in how quickly enterprises can start adopting AI at scale.”
Docker introduces Offload
Agentic AI applications demand far more graphics processing unit power than standard AI model usage because they work through complex, multistep tasks. Local machines often lack the necessary capacity, leading to sluggish results.
To address this challenge, Docker today unveiled Docker Offload in beta. The new service lets developers offload AI and GPU-intensive workloads to the cloud whenever they need to.
According to Docker, the new service allows developers to maintain local speed and access the cloud only when needed; large models and multi-agent systems can be offloaded to high-performance cloud environments. Developers can choose where and when to offload workloads based on privacy, cost and performance needs.
The new Offload capability integrates directly into Docker Desktop, making it easy to enable with ready-made configuration options.
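Docker didn’t detail the exact commands in the announcement, but the workflow is meant to be a simple toggle between local and cloud execution. The following is a hedged sketch of what that could look like from the command line; the exact CLI surface of the beta may differ.

```sh
# Hedged sketch of toggling Docker Offload from the CLI; exact beta commands may differ.
docker offload start      # start an offload session backed by a GPU-equipped cloud engine
docker compose up         # the same Compose file now runs against the remote engine
docker offload status     # check whether workloads are currently offloaded
docker offload stop       # end the session and fall back to the local engine
```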
Integration partners for this cloud capability include Google Cloud, via its serverless environments, with Microsoft Azure support coming soon. Compose integrations also support popular agentic AI frameworks, including CrewAI, Embabel, Google’s Agent Development Kit, LangGraph, Spring AI and Vercel AI SDK.