Microsoft’s DevRel team is excited to introduce AI Travel Agents, a sample application with enterprise functionality that demonstrates how developers can coordinate multiple AI agents and MCP servers (written in Java, .NET, Python, and TypeScript) to explore travel planning scenarios. It’s built with LlamaIndex.TS for agent orchestration, the Model Context Protocol (MCP) for structured tool interactions, and Azure AI Foundry, GitHub Models, and Azure Container Apps for scalable deployment.
TL;DR: Experience the power of MCP and Azure with The AI Travel Agents! Try the live demo locally on your computer to see real-time agent collaboration in action. Share your feedback on our community forum. We’re already planning enhancements, like new MCP-integrated agents, secure communication between the AI agents and MCP servers, and support for Agent2Agent over MCP. This is still a work in progress, and we welcome all kinds of contributions. Please fork and star the repo to stay tuned for updates!
This sample application uses mock data and is intended for demonstration purposes rather than production use.
The Challenge: Scaling Personalized Travel Planning
Travel agencies grapple with complex tasks: analyzing diverse customer needs, recommending destinations, and crafting itineraries, all while integrating real-time data like trending spots or logistics. Traditional systems falter with latency, scalability, and coordination, leading to delays and frustrated clients. The AI Travel Agents tackles these issues with a technical trifecta:
- LlamaIndex.TS orchestrates six AI agents for efficient task handling.
- MCP equips agents with travel-specific data and tools.
- Azure Container Apps ensures scalable, serverless deployment.
This architecture delivers operational efficiency and personalized service at scale, transforming chaos into opportunity.
LlamaIndex.TS: Orchestrating AI Agents
The heart of The AI Travel Agents is LlamaIndex.TS, a powerful agentic framework that orchestrates multiple AI agents to handle travel planning tasks. Built on a Node.js backend, LlamaIndex.TS manages agent interactions in a seamless and intelligent manner:
- Task Delegation: The Triage Agent analyzes queries and routes them to specialized agents, like the Itinerary Planning Agent, ensuring efficient workflows.
- Agent Coordination: LlamaIndex.TS maintains context across interactions, enabling coherent responses for complex queries, such as multi-city trip plans.
- LLM Integration: Connects to Azure OpenAI, GitHub Models, or any local LLM using Foundry Local for advanced AI capabilities.
LlamaIndex.TS’s modular design supports extensibility, allowing new agents to be added with ease. LlamaIndex.TS is the conductor, ensuring agents work in sync to deliver accurate, timely results. Its lightweight orchestration minimizes latency, making it ideal for real-time applications.
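To make the task-delegation idea concrete, here is a minimal, self-contained sketch of the triage pattern described above. It is illustrative only: it does not use the actual LlamaIndex.TS API, and the keyword table stands in for the LLM-driven routing that the real Triage Agent performs.

```typescript
// Illustrative sketch of triage-style task delegation (not the actual
// LlamaIndex.TS API). A triage step inspects the query and routes it to
// one of the specialized agents before any work is done.

type AgentName =
  | "customer-query"
  | "destination-recommendation"
  | "itinerary-planning"
  | "web-search";

// Hypothetical keyword table standing in for the LLM-driven triage step.
const routes: Record<AgentName, RegExp> = {
  "customer-query": /complaint|feedback|refund/i,
  "destination-recommendation": /where|destination|recommend/i,
  "itinerary-planning": /itinerary|schedule|plan/i,
  "web-search": /trending|news|current/i,
};

function triage(query: string): AgentName {
  for (const [agent, pattern] of Object.entries(routes) as [AgentName, RegExp][]) {
    if (pattern.test(query)) return agent;
  }
  // Fall back to the recommendation agent for open-ended questions.
  return "destination-recommendation";
}

console.log(triage("Plan me a 5-day itinerary for Tokyo")); // → itinerary-planning
```

In the real application, the routing decision is made by a language model with access to each agent’s description, and the selected agent receives the full conversation context rather than just the raw query.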
Model Context Protocol (MCP): The Data and Tool Hub
The Model Context Protocol (MCP) empowers AI agents by providing travel-specific data and tools, enhancing their functionality. MCP acts as a data and tool hub:
- Real-Time Data: Supplies up-to-date travel information, such as trending destinations or seasonal events, via the Web Search Agent using Bing Search.
- Tool Access: Connects agents to external tools, like the .NET-based customer query analyzer for sentiment analysis, the Python-based itinerary planner for trip schedules, or the Java-based destination recommendation tools.
For example, when the Destination Recommendation Agent needs current travel trends, MCP delivers them via the Web Search Agent. This modularity allows new tools to be integrated seamlessly, future-proofing the platform. MCP’s role is to enrich agent capabilities, leaving orchestration to LlamaIndex.TS.
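Under the hood, MCP tool invocations travel as JSON-RPC 2.0 messages. The sketch below shows the shape of a `tools/call` request as defined by the MCP specification; the tool name `plan_itinerary` and its arguments are hypothetical, stand-ins for what an itinerary-planning MCP server might expose.

```typescript
// Shape of an MCP tool invocation: a JSON-RPC 2.0 request with method
// "tools/call" (per the MCP specification). The tool name and arguments
// below are hypothetical examples.

interface McpToolCall {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: {
    name: string;
    arguments: Record<string, unknown>;
  };
}

function buildToolCall(
  id: number,
  name: string,
  args: Record<string, unknown>
): McpToolCall {
  return { jsonrpc: "2.0", id, method: "tools/call", params: { name, arguments: args } };
}

// e.g. asking a (hypothetical) itinerary-planning tool for a 3-day trip
const request = buildToolCall(1, "plan_itinerary", { city: "Lisbon", days: 3 });
console.log(JSON.stringify(request));
```

Because every tool, in every language, speaks this same wire format, the TypeScript orchestration layer can call the .NET, Python, and Java servers without any language-specific glue.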
Azure Container Apps: Scalability and Resilience
Azure Container Apps powers The AI Travel Agents sample application with a serverless, scalable platform for deploying microservices. It ensures the application handles varying workloads with ease:
- Dynamic Scaling: Automatically adjusts container instances based on demand, managing booking surges without downtime.
- Polyglot Microservices: Supports .NET (Customer Query), Python (Itinerary Planning), Java (Destination Recommendation), and Node.js services in isolated containers.
- Observability: Integrates tracing, metrics, and logging, enabling real-time monitoring.
- Serverless Efficiency: Abstracts infrastructure, reducing costs and accelerating deployment.
Azure Container Apps’ global infrastructure delivers low-latency performance, critical for travel agencies serving clients worldwide.
The AI Agents: A Quick Look
While MCP and Azure Container Apps are the stars, they support a team of AI agents that drives the application’s functionality. Built and orchestrated with LlamaIndex.TS, and connected to their tools via MCP, these agents collaborate to handle travel planning tasks:
- Triage Agent: Directs queries to the right agent, leveraging MCP for task delegation.
- Customer Query Agent: Analyzes customer needs (emotions, intents), using .NET tools.
- Destination Recommendation Agent: Suggests tailored destinations, using Java.
- Itinerary Planning Agent: Crafts efficient itineraries, powered by Python.
- Web Search Agent: Fetches real-time data via Bing Search.
These agents rely on MCP’s real-time communication and Azure Container Apps’ scalability to deliver responsive, accurate results.
It’s worth noting, though, that this sample application uses mock data for demonstration purposes. In a real-world scenario, the application would communicate with an MCP server that is plugged into a real production travel API.
Try It Out
Try the live demo locally on your computer for free using Docker Model Runner or Ollama, or use Azure AI Foundry for more capable LLMs, and see real-time agent collaboration in action.
Conclusion
You can explore the open-source project on GitHub today, complete with setup and deployment instructions. Share your feedback on our community forum. We’re already planning enhancements, like new MCP-integrated agents, secure communication between the AI agents and MCP servers, and support for Agent2Agent over MCP.
This is still a work in progress, and we welcome all kinds of contributions. Please fork and star the repo to stay tuned for updates!
We would love to hear your feedback and to continue the discussion in the Azure AI Discord: https://aka.ms/AI/discord