The .NET team has released Dev Proxy version 0.28, introducing new capabilities to improve observability, plugin extensibility, and integration with AI models. A central feature of this release is the OpenAITelemetryPlugin, which, as reported, allows developers to track usage and estimated costs of OpenAI and Azure OpenAI language model requests within their applications.
The plugin intercepts requests and records details such as the model used, token counts (prompt, completion, and total), per-request cost estimates, and grouped summaries per model.
According to the announcement, the plugin gives developers deeper visibility into how their applications interact with LLMs; the recorded telemetry can be visualized in external tools such as OpenLIT to understand usage patterns and optimize AI-related expenses.
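As a rough illustration, enabling the plugin in a Dev Proxy configuration file could look like the sketch below. The pluginPath, the watched URLs, and the configSection options (includeCosts, pricesFile) are assumptions based on Dev Proxy's general plugin model, not confirmed settings from the release:

```jsonc
{
  "plugins": [
    {
      // records model, token counts, and per-request cost estimates
      // for intercepted OpenAI and Azure OpenAI requests
      "name": "OpenAITelemetryPlugin",
      "enabled": true,
      "pluginPath": "~appFolder/plugins/DevProxy.Plugins.dll",
      "configSection": "openAITelemetry"
    }
  ],
  "urlsToWatch": [
    "https://api.openai.com/*"
  ],
  "openAITelemetry": {
    // hypothetical options: include cost estimates computed from a local price list
    "includeCosts": true,
    "pricesFile": "prices.json"
  }
}
```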
The update also supports Microsoft’s Foundry Local, a high-performance local AI runtime stack introduced at the Build conference last month. Foundry Local lets developers redirect cloud-based LLM calls to a local environment, reducing cost and enabling offline development.
As stated, Dev Proxy can now be configured to use local models. The team writes:
Our initial tests show significant improvements using Phi-4 mini on Foundry Local compared to other models we’ve used in the past. We’re planning to integrate with Foundry Local by default in future versions of Dev Proxy.
To configure Dev Proxy with Foundry Local, developers can specify the local model and endpoint in the languageModel section of the proxy’s configuration file. This integration offers a cost-effective alternative for developers working with LLMs during local development.
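A minimal sketch of such a configuration follows; the model identifier and endpoint are illustrative, since Foundry Local assigns model ids and a local port of its own:

```jsonc
{
  "languageModel": {
    "enabled": true,
    // illustrative model id; use the id Foundry Local reports for the model you loaded
    "model": "phi-4-mini",
    // illustrative endpoint; Foundry Local prints its actual local URL on startup
    "url": "http://localhost:5273/v1"
  }
}
```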
For .NET Aspire users, a preview version of Dev Proxy extensions is now available. These extensions simplify integration with Aspire applications, allowing Dev Proxy to run either locally or via Docker with minimal setup. As reported, this enhancement improves portability and simplifies configuration for distributed development teams.
In addition, support for OpenAI payloads has been expanded. As stated, Dev Proxy was previously limited to text completions; it now supports a wider range of completion types, increasing compatibility with OpenAI APIs.
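For instance, assuming chat completions are among the newly supported payload types, Dev Proxy would now recognize a request body of the familiar OpenAI chat shape:

```jsonc
// an OpenAI-style chat completion request body; payloads of this shape
// are handled in addition to classic text completions
{
  "model": "gpt-4o-mini",
  "messages": [
    { "role": "system", "content": "You are a helpful assistant." },
    { "role": "user", "content": "Summarize this order history." }
  ],
  "temperature": 0.2
}
```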
The release also brings enhancements to TypeSpec generation. In line with the TypeSpec 1.0 release, the TypeSpecGeneratorPlugin now produces improved PATCH operations, using the MergePatchUpdate template to express JSON Merge Patch semantics explicitly.
As noted in the release, Dev Proxy now supports JSONC (JSON with comments) across all configuration files. This addition enables developers to add inline documentation and annotations, which can aid in team collaboration and long-term maintenance.
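For example, a configuration file can now carry inline notes directly:

```jsonc
{
  // watch only the APIs this application calls
  "urlsToWatch": [
    "https://api.example.com/*" // illustrative endpoint
  ],
  /* chance, in percent, that a matched request is failed by the proxy */
  "rate": 50
}
```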
Concurrency improvements have also been made in logging and mocking. These changes ensure that logs for parallel requests are grouped accurately, helping developers trace request behavior more effectively.
Two breaking changes are included in this release. First, the GraphConnectorNotificationPlugin has been removed, following the deprecation of Graph connector deployment via Microsoft Teams.
Furthermore, the --audience flag in the devproxy jwt create command has been renamed to --audiences, while the shorthand alias -a remains unchanged.
The CRUD API plugin has been updated with improved CORS handling and consistent JSON responses, enhancing its reliability in client-side applications.
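Registering the plugin follows Dev Proxy’s usual configuration pattern; in the sketch below, the section and file names are illustrative:

```jsonc
{
  "plugins": [
    {
      // serves a mock CRUD API with CORS handling and consistent JSON responses
      "name": "CrudApiPlugin",
      "enabled": true,
      "pluginPath": "~appFolder/plugins/DevProxy.Plugins.dll",
      "configSection": "customersApi"
    }
  ],
  "customersApi": {
    // the API definition file describes the base URL, backing data file, and CRUD actions
    "apiFile": "customers-api.json"
  }
}
```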
Finally, the Dev Proxy Toolkit for Visual Studio Code has been updated to version 0.24.0. This release introduces new snippets and commands, including support for the aforementioned OpenAITelemetryPlugin, along with improved Dev Proxy Beta compatibility and better process detection.
For interested readers, full release notes are available in the official repository, providing a complete overview of the features, changes, and guidance for this version.