
For Best Results with LLMs, Use JSON Prompt Outputs | HackerNoon

News Room · Published 22 April 2025

This is the fourth part of an ongoing series. See parts 1, 2, and 3.

AI Principle IV: Use Structured Prompt Outputs

There was a time, a long, long time ago, when LLM APIs had just come out and no one yet knew for sure how to interact with them properly. One of the top problems was extracting multiple outputs from a single prompt response. LLMs didn’t return JSON consistently (they failed often), so you tried to persuade the model to cooperate with your best prompt-engineering oratory.

Those were ancient times. Back then, we traveled on horseback and wrote prompts by candlelight, as electricity hadn’t yet been invented. Debugging prompts meant long nights spent squinting at parchment scrolls, hoping the model would return a list instead of a haiku. And if it failed, you had no choice but to sigh deeply, dip your quill in ink, and try again.

Ok, I made that last part up. But LLM APIs that couldn’t consistently return a JSON response were a real thing and caused loads of issues. Things began to change in November 2023, when OpenAI added JSON mode: you could now ask the API to return well-formed JSON. In 2024, OpenAI added strict Structured Outputs, which fully guarantee that the response conforms to your JSON schema. Anthropic and Google have since added similar API enhancements. The time for unstructured prompt outputs has passed, and we are never going back.
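Here is a minimal sketch of what this looks like with the OpenAI Python SDK. The model name, prompt, and output fields below are illustrative assumptions, not a prescribed setup:

import json
from openai import OpenAI

client = OpenAI()

# JSON mode: the API guarantees a syntactically valid JSON response.
# For strict, schema-conforming outputs you would instead pass a JSON schema
# (or a Pydantic model via the SDK's parse helper).
response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative; any JSON-mode-capable model works
    messages=[
        {"role": "system",
         "content": "Reply in JSON with keys 'summary' and 'sentiment'."},
        {"role": "user",
         "content": "The new headphones sound great, but the battery is weak."},
    ],
    response_format={"type": "json_object"},
)

data = json.loads(response.choices[0].message.content)
print(data["summary"], data["sentiment"])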

Benefits

Why is it better to use JSON-structured prompt outputs than to use another format or invent a custom one?

Reduced Error Rate

Modern LLMs are fine-tuned to output valid JSON when requested; it is rare for them to fail even with very complex responses. In addition, many platforms have software-level protections against incorrectly formatted outputs. For example, in strict structured output mode, the OpenAI API raises an exception instead of silently handing you a non-JSON response.

If you use a custom format to return multiple output variables, you will not benefit from this fine-tuning, and the error rate will be much higher. Time will be spent re-engineering the prompt and adding retries.

Decoupled Prompts and Code

With a JSON output, it’s trivial to add another output field, and doing so shouldn’t break your existing code. This decouples adding fields to the prompt from changes to the code processing logic. Decoupling can save you time and effort, particularly in cases where prompts are loaded from outside Git; see Principle II: Load LLM Prompts Safely (If You Really Have To).
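A quick sketch of what that decoupling looks like in practice (the field names here are made up for illustration): the processing code reads only the fields it relies on, so a prompt author can add new output fields without touching it.

import json

def handle_reply(raw: str) -> None:
    data = json.loads(raw)
    print("Category:", data["category"])  # the field this code relies on
    # Fields added to the prompt later are ignored, or picked up opportunistically:
    if "confidence" in data:
        print("Confidence:", data["confidence"])

# Works both before and after the prompt starts emitting "confidence":
handle_reply('{"category": "billing"}')
handle_reply('{"category": "billing", "confidence": 0.92}')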

Simplified System

Is there a practical reason to use an output format that has no built-in platform support? Unless there is, it is easier for both you and subsequent contributors to format responses as JSON. Don’t reinvent the wheel unless you have to.

When NOT to Use Structured Output

Single Field Output

If your prompt returns a single field, there seem to be no benefits to outputting JSON. Or are there?

Single-variable responses today may become complex responses tomorrow. After spending hours converting single-field prompts into multi-field ones, I now use JSON by default even when only a single field is returned. This saves time later while adding minimal extra complexity upfront.

Even when the program logic doesn’t need multiple outputs, there are prompt engineering and debugging benefits to adding additional fields. Adding a field that provides an explanation for a response (or cites a source in the documentation) can often significantly improve prompt performance (1). It can also be logged as an explanation for the model’s decisions. Having the response be JSON from the start makes adding such a field far easier.
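For instance, a spam-classification prompt (a hypothetical example; the field names are mine) might return both the decision and the reasoning behind it, with the explanation going straight to the logs:

import json
import logging

logging.basicConfig(level=logging.INFO)

# A response from a prompt instructed to return {"explanation": ..., "is_spam": ...}.
raw = '{"explanation": "Unsolicited prize claim with a suspicious link.", "is_spam": true}'

data = json.loads(raw)
logging.info("Model reasoning: %s", data["explanation"])  # kept for debugging and audits
is_spam = data["is_spam"]  # the only field the program logic actually consumes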

So even if your prompt has a single output variable, consider JSON format as an option.

Streaming Response

For applications in which latency is critical, streaming LLM endpoints are often used. These allow parts of the response to be acted on before the entire response has been received. This pattern doesn’t work well with a single JSON object, which can’t be reliably parsed until it is complete, so use a simple, stream-friendly format instead.

For example, if your prompt decides both the action a video game character takes and the words the character says, you can encode the output as “ACTION|SPEECH_TO_READ” and stream the response with a streaming API, such as the OpenAI streaming API. This gives you far better latency.

Example Output:

WAVE_AT_HERO|Hello, Adventurer! Welcome to my shop.

As soon as the action is received, the character begins waving, and text is output as it streams in.
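A rough sketch of the consuming side, again with the OpenAI Python SDK (the model name, prompt, and the trigger_animation helper are illustrative stand-ins, not part of any real game):

from openai import OpenAI

client = OpenAI()

def trigger_animation(action: str) -> None:
    print(f"[character performs: {action}]")

stream = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative
    messages=[{"role": "user",
               "content": "Respond as the shopkeeper greeting the hero. Format: ACTION|SPEECH"}],
    stream=True,
)

buffer = ""
action_done = False
for chunk in stream:
    if not chunk.choices:
        continue
    buffer += chunk.choices[0].delta.content or ""
    if not action_done and "|" in buffer:
        action, buffer = buffer.split("|", 1)  # everything before '|' is the action
        trigger_animation(action.strip())      # start acting before the speech arrives
        action_done = True
    if action_done and buffer:
        print(buffer, end="", flush=True)      # render speech text as it streams in
        buffer = ""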

JSON lines and other stream-friendly formats can also be used effectively.
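For example, a hypothetical event stream in which every line is a self-contained JSON object can be parsed and acted on line by line; the chunks list below stands in for fragments arriving from a streaming API:

import json

def handle_event(event: dict) -> None:
    print("acting on:", event)

buffer = ""
chunks = ['{"action": "WAVE_AT_HERO"}\n{"speech": "Hello, Adv', 'enturer!"}\n']
for fragment in chunks:
    buffer += fragment
    while "\n" in buffer:
        line, buffer = buffer.split("\n", 1)  # a complete line is a complete JSON object
        if line.strip():
            handle_event(json.loads(line))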

Conclusion

Don’t reject the benefits of civilization: use JSON-structured prompt outputs. There are hardly any downsides, and it will make your life much easier, since LLMs are heavily optimized to return valid JSON responses. Consider using JSON output even if the extracted data is currently a single field. For streaming endpoints, use JSON lines or a simple custom format.

If you’ve enjoyed this post, subscribe to the series for more.
