In the Uber Engineering Blog, Ian Guisard described uSpec, an agentic system designed to automate the creation of component design specifications. By leveraging AI agents and the open-source Figma Console Model Context Protocol (MCP), Uber has reduced the time required for detailed documentation from weeks to minutes.
The system functions as a “Visual-to-Technical Spec” compiler. An AI agent running in the Cursor IDE connects to a local Figma Desktop session via a WebSocket bridge, then “crawls” the component tree, extracting data such as design tokens and variant axes. A particularly time-consuming part of Uber’s design process had been ensuring that designs carried complete accessibility descriptions.
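The crawl step can be pictured as a recursive walk that aggregates tokens and variant axes into a single spec payload. The sketch below is a simplified illustration, not Uber's implementation: the `FigmaNode` shape, field names, and the `crawl` function are all assumptions, and the real Figma Console MCP exposes a much richer schema over its WebSocket bridge.

```typescript
// Hypothetical, simplified shape of a Figma component node.
// The real Figma Console MCP schema is richer than this.
interface FigmaNode {
  name: string;
  boundTokens?: Record<string, string>;       // e.g. { fill: "color.primary" }
  variantProperties?: Record<string, string>; // e.g. { Size: "Large" }
  children?: FigmaNode[];
}

interface SpecData {
  tokens: Set<string>;
  variantAxes: Map<string, Set<string>>;
}

// Walk the component tree, collecting every design token and every
// value seen on each variant axis, as a spec-writing agent might.
function crawl(
  node: FigmaNode,
  spec: SpecData = { tokens: new Set(), variantAxes: new Map() },
): SpecData {
  for (const token of Object.values(node.boundTokens ?? {})) {
    spec.tokens.add(token);
  }
  for (const [axis, value] of Object.entries(node.variantProperties ?? {})) {
    if (!spec.variantAxes.has(axis)) spec.variantAxes.set(axis, new Set());
    spec.variantAxes.get(axis)!.add(value);
  }
  for (const child of node.children ?? []) {
    crawl(child, spec);
  }
  return spec;
}
```

The aggregated `SpecData` is what a downstream step would render into per-platform documentation.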
Uber maintains seven platform stacks and three accessibility frameworks across hundreds of pages, creating a combinatorial workload for every design change. Previously, Figma documents were manually translated into detailed technical specifications for each stack. uSpec automates this translation from visual design to technical contract.
The intelligence of the system resides in Agent Skills—structured Markdown files that encode Uber’s internal domain expertise. These skills include:
- Platform-Specific Accessibility: mapping a single visual button to its corresponding VoiceOver (iOS), TalkBack (Android), and ARIA (Web) semantic properties.
- Density Logic: calculating how padding and typography scale across Uber’s implementation stacks, including SwiftUI, React, and Android Compose.
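The platform-specific accessibility skill amounts to a deterministic mapping from one visual description to three semantic contracts. The sketch below illustrates the idea under assumed types; the `VisualButton` and `PlatformSemantics` shapes and the `mapAccessibility` function are invented for illustration and do not come from the uSpec skill files.

```typescript
// A single visual button as a designer might describe it (assumed shape).
interface VisualButton {
  label: string;
  disabled: boolean;
}

// One spec entry per platform framework: VoiceOver, TalkBack, and ARIA.
interface PlatformSemantics {
  ios: { accessibilityLabel: string; traits: string[] };                  // VoiceOver
  android: { contentDescription: string; enabled: boolean };              // TalkBack
  web: { role: string; "aria-label": string; "aria-disabled": boolean };  // ARIA
}

// Expand one visual button into its three platform-specific contracts,
// the kind of fan-out an Agent Skill would encode.
function mapAccessibility(button: VisualButton): PlatformSemantics {
  return {
    ios: {
      accessibilityLabel: button.label,
      traits: button.disabled ? ["button", "notEnabled"] : ["button"],
    },
    android: {
      contentDescription: button.label,
      enabled: !button.disabled,
    },
    web: {
      role: "button",
      "aria-label": button.label,
      "aria-disabled": button.disabled,
    },
  };
}
```

Encoding the mapping as data rather than prose is what lets an agent apply it consistently across hundreds of pages.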
Governance of proprietary design information is a priority for Uber. uSpec addresses this by keeping the pipeline local. Guisard noted,
“No cloud API, no proprietary design data leaving your network. At Uber, this is what makes AI-assisted documentation possible in the first place: nothing leaves your machine.”
By using the Figma Console MCP, proprietary design data remains on the local network. The agent reads and writes to the local Figma Desktop app rather than calling cloud-based design APIs.
uSpec is integrated into Michelangelo, Uber’s centralized AI platform. To maintain security, all agentic requests pass through the GenAI Gateway, a Go-based proxy that mirrors the OpenAI API. The gateway provides functions such as PII Redaction, scrubbing internal identifiers before requests reach external models like Claude 3.5 or GPT-4o.
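A PII-redaction step like the gateway's can be sketched as a list of pattern/replacement rules applied to the prompt before it is forwarded. This is a minimal illustration, not Uber's Go implementation: the rule patterns, placeholder strings, and the `redact` function are all invented for the example.

```typescript
// Hypothetical redaction rules; the identifier formats below are
// invented for illustration and are not Uber's actual patterns.
const REDACTION_RULES: Array<[RegExp, string]> = [
  [/\b[\w.+-]+@uber\.com\b/g, "[REDACTED_EMAIL]"],  // internal e-mail addresses
  [/\bEMP-\d{6}\b/g, "[REDACTED_EMPLOYEE_ID]"],     // assumed employee-ID format
  [/\b\d{3}-\d{2}-\d{4}\b/g, "[REDACTED_SSN]"],     // US social-security numbers
];

// Scrub internal identifiers from a prompt before it leaves the network.
function redact(prompt: string): string {
  return REDACTION_RULES.reduce(
    (text, [pattern, replacement]) => text.replace(pattern, replacement),
    prompt,
  );
}
```

Because the gateway mirrors the OpenAI API, a step like this can sit transparently between internal callers and any external model.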
Uber’s choice to maintain Figma as the primary artifact represents one side of an architectural divide in 2026. Discussions on platforms like Reddit (r/UXDesign) suggest that while enterprise scale requires a visual “source of truth” for cross-functional alignment, individual developers are increasingly using agentic workflows to jump from requirements directly to production-ready code.
User u/Bandos-AI commented on the shift:
“I am a hybrid between developer and UX designer. I leverage AI a ton when coding and lately I rarely use Figma anymore. It’s super easy to prompt your way into reusable UI components. React and similar front-end frameworks are especially good with AI as they are based on collections of isolated components.”
