Dev Proxy has reached a significant milestone with the release of version 1.0, introducing a range of new features aimed at helping developers build more reliable AI-powered applications. As reported in the announcement, the update focuses on realistic simulation of language model behavior, advanced resource tracking, and improvements to integration tools.
This marks the first major version of Dev Proxy, which the team described as:
We’re excited to announce the first major version of Dev Proxy! Over the last few years, we shipped functionality that we believe helps developers build more robust apps. After the recent refactorings, we’ve reached what we believe is a solid foundation for our future work. That said, we keep improving our code base and are open to any changes. Moving forward, we’re going to use SemVer to communicate the scope of changes in each release. We’ll keep publishing regular releases and should we ship some breaking changes, we’ll clearly communicate what’s changed and how it affects you.
One of the most notable additions is the LanguageModelFailurePlugin, which enables developers to test how their applications respond to unpredictable AI output. As stated in the announcement, the plugin can simulate 15 common failure types, including hallucinations, bias, misinterpretations, contradictory statements, and ambiguous responses. Developers can also define custom failure scenarios, helping ensure their systems remain robust when handling unreliable AI-generated content.
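Dev Proxy plugins are enabled through its JSON/JSONC configuration file, so registering the plugin might look like the following sketch. The schema URL, plugin path, config section, and failure names shown here are illustrative assumptions based on Dev Proxy's general plugin registration pattern, not confirmed values from the announcement.

```jsonc
{
  "$schema": "https://raw.githubusercontent.com/dotnet/dev-proxy/main/schemas/v1.0.0/rc.schema.json",
  "plugins": [
    {
      "name": "LanguageModelFailurePlugin",
      "enabled": true,
      "pluginPath": "~appFolder/plugins/DevProxy.Plugins.dll",
      "configSection": "languageModelFailure"
    }
  ],
  // Intercept requests to the LLM endpoint under test (URL is an example)
  "urlsToWatch": ["https://api.openai.com/*"],
  "languageModelFailure": {
    // Illustrative names for a subset of the 15 documented failure types
    "failures": ["Hallucination", "Bias", "AmbiguousResponse"]
  }
}
```

Leaving the failure list empty or omitting it would presumably let the proxy pick from all supported failure types at random, which is the more realistic test for resilience work.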
Another key enhancement is the LanguageModelRateLimitingPlugin, which introduces token-based rate limiting simulation. This feature reflects how large language model providers enforce limits by allowing different thresholds for input and output tokens within configurable timeframes. According to the development team, this capability helps simulate realistic performance boundaries and budget constraints for AI integrations.
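Following the same registration pattern, a token-based rate limit simulation could be sketched as below. The property names for the token thresholds and reset window are assumptions chosen to mirror the described behavior (separate input and output budgets over a configurable timeframe), not the plugin's documented schema.

```jsonc
{
  "plugins": [
    {
      "name": "LanguageModelRateLimitingPlugin",
      "enabled": true,
      "pluginPath": "~appFolder/plugins/DevProxy.Plugins.dll",
      "configSection": "languageModelRateLimiting"
    }
  ],
  "urlsToWatch": ["https://api.openai.com/*"],
  "languageModelRateLimiting": {
    // Illustrative: separate budgets for input (prompt) and output (completion) tokens
    "promptTokenLimit": 5000,
    "completionTokenLimit": 5000,
    // Illustrative: window after which the token budgets reset, in seconds
    "resetTimeWindowSeconds": 60
  }
}
```

Once the configured budget is exhausted within the window, the proxy would answer with a rate-limit error (for example, HTTP 429) instead of forwarding the request, letting developers verify their retry and backoff logic without spending real tokens.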
(Simulating exceeding a token limit for an LLM request, Source: Official Microsoft announcement)
The OpenAITelemetryPlugin has also been improved, now supporting token usage tracking from streamed responses. The plugin can generate detailed cost and usage summaries in Markdown, JSON, or plain text formats, enabling developers to better monitor testing activity and forecast production expenses.
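A telemetry setup along these lines could be sketched as follows; the config section and its properties (summary format, pricing input for cost estimates) are illustrative assumptions that mirror the capabilities described above rather than the plugin's documented options.

```jsonc
{
  "plugins": [
    {
      "name": "OpenAITelemetryPlugin",
      "enabled": true,
      "pluginPath": "~appFolder/plugins/DevProxy.Plugins.dll",
      "configSection": "openAITelemetry"
    }
  ],
  "urlsToWatch": ["https://api.openai.com/*"],
  "openAITelemetry": {
    // Illustrative: one of the supported summary formats (Markdown, JSON, plain text)
    "summaryFormat": "markdown",
    // Illustrative: enable cost estimation using per-model token prices
    "includeCosts": true,
    "pricesFile": "prices.json"
  }
}
```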
OpenAPI specification generation has been refined through updates to the OpenApiSpecGeneratorPlugin. The enhancements include the ability to exclude response types—particularly useful for AI agents that ignore such metadata—and to capture default parameter values, improving the accuracy of automated API usage by AI tools.
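The two new behaviors could be toggled in configuration roughly as follows. The flag names below are hypothetical placeholders for the described options (excluding response types and capturing default parameter values), not the plugin's actual schema.

```jsonc
{
  "plugins": [
    {
      "name": "OpenApiSpecGeneratorPlugin",
      "enabled": true,
      "pluginPath": "~appFolder/plugins/DevProxy.Plugins.dll",
      "configSection": "openApiSpecGenerator"
    }
  ],
  // Example API to record; the generated spec is built from intercepted traffic
  "urlsToWatch": ["https://api.contoso.com/*"],
  "openApiSpecGenerator": {
    "specFormat": "json",
    // Illustrative: omit response types, which AI agents typically ignore
    "includeResponseTypes": false,
    // Illustrative: record observed default values for parameters
    "captureDefaultParameterValues": true
  }
}
```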
(Generates OpenAPI spec in JSON format from the intercepted requests and responses, Source: Microsoft Documentation)
Alongside the main application, several Dev Proxy tools have been updated. The Dev Proxy Toolkit, a Visual Studio Code extension, now supports v1.0.0 schemas, includes new configuration snippets for language model plugins, adds JSONC support for diagnostics, and offers quick actions for setting essential configuration flags.
The Visual Studio Code Tasks integration has been enhanced to automatically start and stop Dev Proxy during debug sessions. GitHub Actions integration has been simplified for easier CI/CD workflows, and .NET Aspire extensions have been updated with .NET 8 support.
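The start/stop-on-debug behavior maps onto standard VS Code task mechanics: a background task launches the proxy, and a `preLaunchTask` in `launch.json` ties it to the debug session. The sketch below uses generic VS Code task configuration; the task label and the problem-matcher patterns are illustrative, not the configuration the extension actually generates.

```jsonc
// .vscode/tasks.json (sketch)
{
  "version": "2.0.0",
  "tasks": [
    {
      "label": "start Dev Proxy",
      "type": "shell",
      "command": "devproxy",
      "isBackground": true,
      // Background tasks need a matcher so VS Code knows when the proxy is ready;
      // the regex patterns here are placeholders
      "problemMatcher": {
        "pattern": [{ "regexp": ".", "file": 1, "line": 2, "message": 3 }],
        "background": {
          "activeOnStart": true,
          "beginsPattern": ".",
          "endsPattern": "."
        }
      }
    }
  ]
}
```

In `launch.json`, referencing the task via `"preLaunchTask": "start Dev Proxy"` starts the proxy before debugging begins; the integration described in the announcement also stops it when the debug session ends.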
The release also features an improved Dev Proxy MCP Server, which provides coding agents with direct access to updated documentation, schemas, and best practice guidance for building configurations. According to the team, these additions have already resulted in improved outcomes when using AI-assisted configuration editing.
Additional changes include enhanced streaming response handling in Chrome DevTools, expanded authentication compatibility through custom OIDC metadata URLs, streamlined Linux installation defaults, improved configuration validation, and more resilient error handling.
For interested readers, full release notes with a complete list of features, improvements, and bug fixes are available in the official announcement.