Anthropic has upgraded Claude Sonnet 4 to support a context length of up to 1 million tokens, a fivefold increase over the previous 200,000-token limit. The feature, now in public beta, is accessible through the Anthropic API and Amazon Bedrock, with support on Google Cloud's Vertex AI expected soon.
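For developers who want to try the beta, opting in is a matter of passing a beta flag on each request. The sketch below shows one way to do this with Anthropic's Python SDK; the beta flag name and model identifier are assumptions based on Anthropic's public beta naming and should be verified against the current documentation.

```python
# Minimal sketch: opting into the long-context beta via the Python SDK.
# The beta flag name and model ID below are assumptions; check Anthropic's
# documentation for the values that apply to your account.
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

response = client.beta.messages.create(
    model="claude-sonnet-4-20250514",     # assumed Sonnet 4 model ID
    betas=["context-1m-2025-08-07"],      # assumed long-context beta flag
    max_tokens=4096,
    messages=[{"role": "user", "content": "Summarize the attached material."}],
)
print(response.content[0].text)
```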
The extended context window enables users to submit far larger datasets within a single request. For developers, this means the ability to load entire codebases, spanning tens of thousands of lines of code along with tests and documentation, while maintaining cross-file awareness. For researchers, it enables synthesis across dozens of lengthy documents, such as academic papers or legal contracts, without losing track of references along the way.
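In practice, "loading an entire codebase" is largely a packing problem: walk the repository, concatenate the relevant files, and stay under the window. The sketch below is one illustrative way to do that; the file filter and the rough four-characters-per-token estimate are assumptions, not an official recipe.

```python
# Illustrative sketch: pack a repository into one large prompt string.
# The suffix filter and the ~4 characters-per-token estimate are rough
# assumptions, not an official packing strategy.
from pathlib import Path

INCLUDE_SUFFIXES = {".py", ".md", ".toml", ".txt"}

def pack_repo(root: str, token_budget: int = 900_000) -> str:
    parts: list[str] = []
    used = 0
    for path in sorted(Path(root).rglob("*")):
        if not path.is_file() or path.suffix not in INCLUDE_SUFFIXES:
            continue
        text = path.read_text(encoding="utf-8", errors="ignore")
        est_tokens = len(text) // 4  # crude estimate: ~4 characters per token
        if used + est_tokens > token_budget:
            break  # stop before overflowing the context window
        parts.append(f"### FILE: {path}\n{text}")
        used += est_tokens
    return "\n\n".join(parts)

# The packed string is then sent as (part of) a single user message.
corpus = pack_repo("./my_project")
```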
Anthropic says the change is particularly valuable for building context-aware agents that must operate across hundreds of tool calls and multi-step workflows. In practice, this allows developers to supply full API documentation, configuration files, and system histories while keeping interactions coherent over long sessions.
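To make the agent scenario concrete, the sketch below keeps the full conversation, including every tool call and its result, in one growing message list instead of trimming it, which is the pattern the larger window makes viable. The model ID, beta flag, and the lookup_docs tool are illustrative assumptions rather than part of Anthropic's announcement.

```python
# Illustrative agent loop that keeps the entire interaction history in context.
# The model ID, beta flag, and lookup_docs tool are hypothetical placeholders.
import anthropic

client = anthropic.Anthropic()

TOOLS = [{
    "name": "lookup_docs",
    "description": "Return the documentation section for a given API endpoint.",
    "input_schema": {
        "type": "object",
        "properties": {"endpoint": {"type": "string"}},
        "required": ["endpoint"],
    },
}]

def lookup_docs(endpoint: str) -> str:
    return f"(documentation text for {endpoint})"  # placeholder implementation

messages = [{"role": "user", "content": "Migrate our client code to the v2 endpoints."}]

for _ in range(100):  # cap the number of turns for safety
    reply = client.beta.messages.create(
        model="claude-sonnet-4-20250514",   # assumed model ID
        betas=["context-1m-2025-08-07"],    # assumed long-context beta flag
        max_tokens=2048,
        tools=TOOLS,
        messages=messages,
    )
    messages.append({"role": "assistant", "content": reply.content})
    if reply.stop_reason != "tool_use":
        break  # the model produced a final answer instead of a tool request
    # Run every requested tool and feed the results back into the history.
    results = [
        {"type": "tool_result", "tool_use_id": block.id,
         "content": lookup_docs(**block.input)}
        for block in reply.content if block.type == "tool_use"
    ]
    messages.append({"role": "user", "content": results})
```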
The technical implications are significant: larger windows reduce the need to repeatedly retrieve or re-embed content, making multi-day tasks more feasible. However, they also increase the computational load and can make responses less focused if not managed carefully. Some developers argue that scaling context alone is not enough.
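One way to frame that tradeoff is as a simple routing decision: if the material fits comfortably in the window, send it in one pass; otherwise fall back to retrieval. The headroom threshold and token estimator below are arbitrary assumptions chosen for illustration.

```python
# Illustrative sketch of the tradeoff: full-context when the material fits,
# retrieval otherwise. The 80% headroom and the ~4 chars/token estimate are
# assumptions made for this example.
def choose_strategy(documents: list[str], window_tokens: int = 1_000_000) -> str:
    est_tokens = sum(len(doc) // 4 for doc in documents)
    if est_tokens <= int(window_tokens * 0.8):  # leave headroom for prompt and reply
        return "full-context"   # one pass, no chunking or re-embedding needed
    return "retrieval"          # chunk, embed, and fetch only what is relevant

print(choose_strategy(["lorem ipsum " * 500] * 40))
```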
On Hacker News, user aliljet noted:
Allowing me to flood the context window with my code base is great, but short of evals that look into how effective Sonnet stays on track, it is not clear if the value actually exists here.
Another commenter added:
A context window that contains everything makes focus harder. They have to get better at understanding the repo by asking the right questions.
A separate concern involves cost and adoption. On Reddit, one user shared:
I have seen hardly anyone test it out because of the high cost. Anthropic has the highest compute costs known to the AI industry.
Others took a more pragmatic stance:
Even if it is expensive, it is still nice to have the option. Most people will continue to use the same strategies we have been using to manage context, but I am sure there will be instances (or people with money to burn) where having the larger context will be worth the cost.
The 1M-token context window is currently available to Anthropic API customers on higher usage tiers (Tier 4 and custom rate limits), with broader rollout planned over the coming weeks. While the upgrade broadens the scope of what Claude can process in a single pass, developers remain divided on whether scale alone addresses the practical challenges of context management.