Anthropic is increasing Claude Code usage limits — here’s everything you need to know

The new deal will help Anthropic increase Claude Code usage limits and API rate limits for Claude Opus models


Anthropic is joining forces with SpaceX to increase the AI developer's compute capacity – and that means Claude Code usage limits are set for a huge boost.

As part of the deal, SpaceX will permit Anthropic to use all of the compute capacity at its Colossus 1 data center in Tennessee, originally built for Musk's own AI firm, xAI.

"This gives us access to more than 300 megawatts of new capacity (over 220,000 NVIDIA GPUs) within the month," Anthropic said in a blog post.

That increase allowed Anthropic to raise usage limits, remove some peak-time restrictions, and boost API rate limits.

"This, along with our other recent compute deals, means that we’ve been able to increase our usage limits for Claude Code and the Claude API," the statement added.

Claude Code usage limits explained

Anthropic is making three key changes to usage limits, all targeted at making life easier for its “most dedicated customers".

To start, the company is doubling Claude Code's five-hour rate limits, meaning users will be able to make more prompts and write more code in each five-hour rolling session.

That applies to Pro, Max, Team and seat-based Enterprise plans, the company confirmed.

Next, Anthropic is removing its peak-hours limit reduction on Claude Code, giving Pro and Max users the same limits during peak and off-peak times.

Elsewhere, Anthropic is raising its API rate limits for Claude Opus models by more than an order of magnitude. For example, tier 1 users previously had 30,000 maximum input tokens per minute, and will now have 500,000.
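The only concrete figures given are the tier 1 input-token limits, but they are enough to check the "order of magnitude" claim. A minimal sketch (the variable names are illustrative, not Anthropic's own terminology):

```python
# Tier 1 input-token-per-minute limits for Claude Opus models,
# as stated in the article.
old_limit = 30_000   # previous limit (input tokens per minute)
new_limit = 500_000  # new limit (input tokens per minute)

# An "order of magnitude" is a factor of 10; this change is larger.
increase_factor = new_limit / old_limit
print(f"Increase: {increase_factor:.1f}x")  # ~16.7x
```

At roughly 16.7x, the jump is indeed more than an order of magnitude, though short of two.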

The move by Anthropic follows other AI companies tightening up on token usage. Last week, for example, GitHub changed its pricing model for Copilot to focus on consumption rather than credits.

GitHub said the move comes in direct response to skyrocketing compute and inference demands over the last year.

Rapid compute expansion

Anthropic pointed to a range of recent announcements designed to expand compute capacity. The AI developer has signed deals with Amazon, Google, Broadcom, Microsoft, and Fluidstack, all aimed at upping capacity to contend with skyrocketing user demands.

"We train and run Claude on a range of AI hardware — AWS Trainium, Google TPUs, and Nvidia GPUs — and continue to explore opportunities to bring additional capacity online," Anthropic added.

Anthropic revealed it is also working to ensure customers outside the US have access to local infrastructure, saying some of its capacity expansion will be international. In particular, the deal with Amazon will include "additional inference" in Asia and Europe.

"Our enterprise customers — particularly those in regulated industries like financial services, healthcare, and government — increasingly need in-region infrastructure to meet compliance and data residency requirements," Anthropic said.

In the US, Anthropic and other AI companies agreed to cover increases in consumer electricity prices caused by their data centre rollouts. Anthropic said it was looking for ways to "extend that commitment to new jurisdictions".

"We’re very intentional about where we’ll add capacity — partnering with democratic countries whose legal and regulatory frameworks support investments of this scale, and where the supply chain on which our compute depends — hardware, networking, and facilities — will be secure.”

Freelance journalist Nicole Kobie first started writing for ITPro in 2007, with bylines in New Scientist, Wired, PC Pro and many more.

Nicole is the author of a book about the history of technology, The Long History of the Future.