Microsoft recently announced the preview release of Model Context Protocol (MCP) support within its Azure AI Foundry Agent Service. The service, which became publicly available in May, aims to significantly boost interoperability for Artificial Intelligence (AI) agents.
Generative AI agents are increasingly powerful, but their utility is often bottlenecked by the complexity of connecting them to enterprise data sources, workflows, and specialized knowledge bases. Traditionally, this has involved cumbersome processes, such as hand-rolling Azure Functions, managing OpenAPI specifications, or developing custom plugins for each backend system.
MCP, initially proposed by Anthropic, addresses this challenge. It is an open, JSON-RPC-based protocol that lets a “server” publish tools (functions) and resources (context) once; any compliant “client” (such as the Foundry Agent Service) can then automatically discover and invoke those capabilities. Kapil Dhanger, a Principal Cloud Solutions Architect at Microsoft, described it in a LinkedIn post as “USB-C for AI integrations,” promising a “connect once, integrate anywhere” experience. Abbas Ali Aloc, a Solution Architect, echoed this sentiment, commenting on LinkedIn that “MCP sounds like the Rosetta Stone for AI agents – excellent work in promoting interoperability!”
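To make the model concrete, here is a minimal sketch of an MCP server built with the open-source `mcp` Python SDK; the server name, tool, and backing data are hypothetical examples, not taken from Microsoft's announcement.

```python
# Minimal MCP server sketch using the open-source `mcp` Python SDK (FastMCP).
# The server name, tool, and backing data are hypothetical.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("order-lookup")  # publish this server's tools/resources once

@mcp.tool()
def get_order_status(order_id: str) -> str:
    """Look up an order's status in a (hypothetical) system of record."""
    # A real server would query a database or internal API here.
    return f"Order {order_id}: shipped"

if __name__ == "__main__":
    # Serve the tool over MCP's JSON-RPC transport so any compliant client
    # can discover and invoke it.
    mcp.run()
```

Any MCP-compliant client that connects to this server can list `get_order_status` and call it, with no bespoke integration code on either side.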
With this preview, Azure AI Foundry Agent Service becomes, according to the company’s documentation, a first-class MCP client: developers can bring any remote MCP server, whether self-hosted or SaaS, and Azure AI Foundry will import its capabilities within seconds, keep them updated automatically, and route calls through the service’s enterprise-grade security envelope. This drastically simplifies building and maintaining AI agents that can query systems of record, trigger workflows, or access specialized knowledge.
Key benefits of MCP support in Azure AI Foundry Agent Service include:
- Easy Integration, which simplifies connecting to internal services and external APIs without writing and managing custom functions.
- Enhanced Enterprise Features, which leverage Foundry Agent Service’s enterprise-ready capabilities, such as Bring Your Own thread storage.
- Streamlined Agent Development, which means actions and knowledge from connected MCP servers are automatically added to agents and kept up to date as functionality evolves, reducing development and maintenance time.
(Source: Microsoft Devblogs)
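Concretely, attaching a remote MCP server to a Foundry agent would look roughly like the sketch below, which uses the `azure-ai-projects` Python SDK; the endpoint, model deployment, server label, and server URL are placeholders, and the exact preview tool-payload fields may differ from what is shown here.

```python
# Rough sketch: create a Foundry agent that uses a remote MCP server.
# Endpoint, model deployment, and MCP server URL are placeholders.
from azure.ai.projects import AIProjectClient
from azure.identity import DefaultAzureCredential

project_client = AIProjectClient(
    endpoint="https://<your-foundry-project-endpoint>",  # placeholder
    credential=DefaultAzureCredential(),
)

# Declare the remote MCP server once; the service discovers its tools and
# keeps them in sync as the server's capabilities evolve.
agent = project_client.agents.create_agent(
    model="gpt-4o",  # placeholder model deployment name
    name="mcp-demo-agent",
    instructions="Answer questions using the connected MCP tools.",
    tools=[
        {
            "type": "mcp",                           # MCP tool type (preview)
            "server_label": "order_lookup",          # friendly label
            "server_url": "https://example.com/mcp"  # any remote MCP server
        }
    ],
)
print(f"Created agent {agent.id} with MCP tools attached")
```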
While Azure AI Foundry Agent Service offers a unified platform for AI agent development with built-in MCP support, Google Cloud and AWS also integrate MCP across their respective service portfolios. Google Cloud’s Vertex AI Agent Builder, along with its Agent Development Kit and “MCP Toolbox for Databases,” provides a structured environment for connecting agents to data sources via MCP. Similarly, AWS uses Amazon Bedrock Agents and a growing suite of open-source MCP servers for specific AWS services (such as ECS and EKS) to let developers integrate real-time contextual information into tools like Amazon Q Developer.
Lastly, the announcement follows Microsoft Build 2025, where Satya Nadella emphasized an “open-by-design” AI ecosystem and the company’s partnership with Anthropic to establish MCP as a standard across Windows 11, GitHub, Copilot Studio, and Azure AI Foundry.