Open Responses Specification Enables Unified Agentic LLM Workflows

By News Room | Published 2 February 2026 (last updated 3:57 PM)

OpenAI has released Open Responses, an open specification intended to standardize agentic AI workflows and reduce API fragmentation. Backed by partners such as Hugging Face and Vercel, as well as local inference providers, the spec introduces unified standards for agentic loops, reasoning visibility, and internal versus external tool execution. It aims to let developers switch between proprietary and open-source models without rewriting integration code.
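
The portability claim is easiest to see in code. Below is a minimal sketch, assuming the OpenAI Python SDK's Responses interface and a hypothetical Open Responses-compatible local endpoint at http://localhost:8000/v1 with a placeholder model name; swapping providers changes only the base URL and the model identifier.

from openai import OpenAI

# Hosted frontier model via OpenAI's Responses API (reads OPENAI_API_KEY from the environment).
frontier = OpenAI()
result = frontier.responses.create(
    model="gpt-4.1",
    input="Summarize the Open Responses spec in one sentence.",
)
print(result.output_text)

# Hypothetical open model served locally by an Open Responses-compatible runtime.
local = OpenAI(base_url="http://localhost:8000/v1", api_key="local-placeholder")
result = local.responses.create(
    model="example-org/open-model",
    input="Summarize the Open Responses spec in one sentence.",
)
print(result.output_text)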

The specification formalizes concepts such as items, reasoning visibility, and tool execution models, allowing model providers to manage multi-step agentic workflows (repeating cycles of reasoning, tool invocation, and reflection) within their own infrastructure and return the final result in a single API request. Native support for multimodal inputs, streaming events, and cross-provider tool calling also reduces the translation work required when switching between frontier models and open-source alternatives.
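
As a hedged illustration of the streaming support mentioned above, the sketch below consumes incremental text events with the OpenAI Python SDK; the event type names (response.output_text.delta, response.completed) follow OpenAI's Responses API, and other implementations of the spec may emit additional event types.

from openai import OpenAI

client = OpenAI()
stream = client.responses.create(
    model="gpt-4.1",
    input="Explain the agentic loop in two sentences.",
    stream=True,
)
for event in stream:
    if event.type == "response.output_text.delta":
        print(event.delta, end="", flush=True)  # incremental text chunks
    elif event.type == "response.completed":
        print()  # final event; the full response object is available as event.response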

The core concepts introduced in the specification are items, tool use, and the agentic loop. An item is an atomic unit representing model input, output, tool invocations, or reasoning state; examples include the message, function_call, and reasoning types. Items are extensible, allowing providers to emit custom types beyond the specification. One notable item type is reasoning, which exposes the model's thought process in a provider-controlled manner: the payload can include raw reasoning content, protected content, or summaries, giving developers visibility into how models reach conclusions while allowing providers to control disclosure.
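
A minimal sketch of how these item types surface to a developer, assuming the OpenAI Python SDK's object shapes (response.output as a list of typed items); providers implementing the spec may add custom item types alongside these.

from openai import OpenAI

client = OpenAI()
response = client.responses.create(
    model="gpt-4.1",
    input="What is 17 * 24? Think it through, then answer.",
)

for item in response.output:              # each entry is one item
    if item.type == "reasoning":
        # Visibility is provider-controlled: raw content, protected content, or a summary.
        print("reasoning:", getattr(item, "summary", None))
    elif item.type == "message":
        for part in item.content:         # message content parts, e.g. output_text
            if part.type == "output_text":
                print("message:", part.text)
    elif item.type == "function_call":
        print("function_call:", item.name, item.arguments)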

Open Responses distinguishes between internal and external tools to define where orchestration logic resides. Internal tools are executed directly within the provider's infrastructure, allowing the model to manage the agentic loop autonomously; in this scenario, a provider can search documents and summarize findings before returning the final result in a single API round-trip. External tools, by contrast, are executed in the developer's application code: the provider pauses to request a tool call, and the developer executes the tool and returns its output so the model can continue the loop, as sketched below.
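
The external-tool flow can be sketched as follows, assuming the OpenAI Python SDK and a made-up get_weather function tool; the function_call / function_call_output item types and the previous_response_id parameter follow OpenAI's Responses API, which the spec builds on, so other providers' wire formats should look similar.

import json
from openai import OpenAI

client = OpenAI()
tools = [{
    "type": "function",
    "name": "get_weather",
    "description": "Return the current temperature for a city.",
    "parameters": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}]

# Step 1: the model pauses and emits a function_call item for the external tool.
first = client.responses.create(
    model="gpt-4.1",
    input="What's the weather in Paris right now?",
    tools=tools,
)

# Step 2: the application executes the tool and sends back a function_call_output item.
tool_outputs = []
for item in first.output:
    if item.type == "function_call" and item.name == "get_weather":
        args = json.loads(item.arguments)
        result = {"city": args["city"], "temp_c": 18}  # stand-in for a real weather lookup
        tool_outputs.append({
            "type": "function_call_output",
            "call_id": item.call_id,
            "output": json.dumps(result),
        })

# Step 3: the loop continues and the model produces its final answer.
final = client.responses.create(
    model="gpt-4.1",
    previous_response_id=first.id,
    input=tool_outputs,
    tools=tools,
)
print(final.output_text)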

(Image source: openresponses.org)

The specification has seen early adoption from partners including Hugging Face, OpenRouter, and Vercel, as well as local inference providers such as LM Studio, Ollama, and vLLM, enabling standardized agentic workflows on local machines.

The announcement has prompted discussion regarding vendor lock-in and ecosystem maturity. Rituraj Pramanik noted:

Building an “open” standard on top of OpenAI’s API is slightly ironic, but practical. The real nightmare is fragmentation; we waste so much time gluing different schemas together. If this spec stops me from writing another “wrapper for a wrapper” and makes model swapping painless, you are solving the single biggest headache in agentic development.

Other developers view the move as a signal of growing maturity in the LLM landscape. AI developer and educator Sam Witteveen predicts:

Expect frontier open model labs (Qwen, Kimi, DeepSeek) to train models compatible with BOTH Open Responses AND the Anthropic API. Ollama just announced Anthropic API compatibility too, meaning high-quality local models running with the ability to use Claude Code tools is not too far away. This could be a huge win for developers wanting to switch between proprietary and open models without rewriting their stack.

The Open Responses specification, schema, and compliance test tool are now available at the project’s official website, and Hugging Face has released a demo application for developers to see the spec in action.
