If you’ve been using OpenClaw or OpenCode with a Claude Pro or Google AI Ultra subscription, there’s a good chance you’ve already hit the wall — or worse, you’ve already been banned.
Anthropic moved first, on January 9, 2026, deploying a quiet server-side change that blocked OAuth tokens from Free, Pro, and Max plans everywhere except Claude.ai and the Claude Code CLI. The error message is blunt:
“This credential is only authorized for use with Claude Code and cannot be used for other API requests.”
Google followed in mid-February, starting a ban wave targeting AI Ultra subscribers — the $250-a-month tier — who had been routing requests through OpenClaw’s Gemini OAuth integration. Accounts were suspended without warning, without refunds, and often without explanation. OpenClaw creator Peter Steinberger, writing on X, called it “pretty draconian from Google” and said he’d likely pull support for Gemini OAuth entirely. “Be careful out there if you use Antigravity,” he added.
For now, OpenAI is the holdout. ChatGPT OAuth still works with OpenClaw — which might have something to do with the fact that Steinberger joined OpenAI on February 14, and that the project is now backed by an open-source foundation with OpenAI involvement.
What Does OAuth Have to Do With It?
If you’ve ever clicked “Log in with Google” on a third-party site, you’ve used OAuth. It’s a standard way to delegate authentication: you don’t share your password, you grant a limited access token. Clean, secure, widely used.
The problem here isn’t OAuth itself. It’s that those tokens carry the permissions of a flat-rate consumer plan. A Claude Max subscription at $200/month was priced for human-paced use — the web interface, back-and-forth conversations, a few hundred queries a day. Tools like OpenClaw are something else entirely: autonomous agents that can burn through millions of tokens in a single afternoon. Some users reported that just saying “how are you?” to their OpenClaw agent chewed through 30,000 tokens or more, depending on session context. At that rate, one active user on a $200/month plan can generate costs that would normally require thousands of dollars in API credits.
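The gap between flat-rate and per-token pricing is easy to sketch. A back-of-envelope calculation in Python, using the Opus API rates cited in this article and an assumed (illustrative) daily token volume for a busy agent:

```python
# Back-of-envelope cost comparison: flat-rate plan vs. per-token API pricing.
# Rates are the Opus API prices cited in this article; the daily token volume
# is an illustrative assumption, not a measurement.

IN_RATE, OUT_RATE = 15.0, 75.0  # USD per million input / output tokens

def api_cost_usd(input_tokens: int, output_tokens: int) -> float:
    """Cost of one day's traffic at per-million-token rates."""
    return input_tokens / 1e6 * IN_RATE + output_tokens / 1e6 * OUT_RATE

daily = api_cost_usd(2_000_000, 500_000)  # one agent, one busy day (assumed)
print(f"per day: ${daily:.2f}, per month: ${30 * daily:,.2f} vs. $200 flat")
# per day: $67.50, per month: $2,025.00 vs. $200 flat
```

At that assumed volume, a single user consumes roughly ten subscriptions' worth of compute, which is the mismatch the providers are now closing.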
Anthropic’s terms of service have technically prohibited this kind of use for about two years. Google’s have too. What changed in early 2026 is that the tools became too popular to ignore — and too expensive to tolerate.
A Turbulent Few Weeks for OpenClaw
For those not following the story: OpenClaw is an open-source autonomous AI agent built by Austrian developer Peter Steinberger. It runs locally on your machine and connects to LLMs through messaging apps — Signal, Telegram, Discord, WhatsApp. No new subscription, no new web app; your AI assistant lives in the chat you already use.
The project went from obscurity to viral in a matter of weeks. And with the attention came friction. In late January, Anthropic sent a trademark complaint over the original name “Clawdbot,” forcing a rename to “Moltbot” on January 27 — and then to “OpenClaw” three days later after further issues. Then the OAuth blocks hit. Then came the Google bans. Then Steinberger announced he was joining OpenAI on February 14. A trademark fight, two renames, OAuth blocks, ban waves, and an OpenAI hire, all compressed into about three weeks.
OpenCode — a developer-focused CLI with roughly 111,000 GitHub stars and an estimated 2.5 million monthly active developers — was the other major tool caught in the crossfire. Both tools still work. They just need a different AI engine now.
Where to Go From Here
None of this means OpenClaw, OpenCode, or tools like them are finished. It just means the era of powering autonomous agents on flat-rate subscription tokens is over. Here are the four realistic paths forward.
OpenAI Codex OAuth is the fastest swap if you already have a ChatGPT account. OpenAI explicitly permits third-party tools to use OAuth tokens, and OpenCode shipped Codex support within hours of the Anthropic ban. Sign in, select your API org, let the tool generate a key, and you’re back up. Under the hood it’s a PKCE flow with automatic token refresh — not a loophole but the documented policy. OpenAI actively wants this to work, and right now it does.
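The PKCE step mentioned above is small enough to sketch. This shows only the verifier/challenge generation from RFC 7636; the authorization URLs, client IDs, and token exchange are specific to each tool’s flow and are omitted here:

```python
import base64
import hashlib
import secrets

# Sketch of a PKCE verifier/challenge pair (RFC 7636, "S256" method) — the
# mechanism underlying OAuth flows like the one described above. Only the
# crypto step is shown; endpoints and client IDs are tool-specific.

def make_pkce_pair() -> tuple[str, str]:
    # Verifier: 43-char URL-safe random string (32 random bytes, unpadded base64url).
    verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()
    # Challenge: unpadded base64url of SHA-256(verifier).
    digest = hashlib.sha256(verifier.encode("ascii")).digest()
    challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode()
    return verifier, challenge

verifier, challenge = make_pkce_pair()
# The challenge goes out with the initial authorization request; the verifier
# is revealed only later, when exchanging the returned code for tokens, so an
# intercepted code is useless on its own.
```

This is why the article can call it a documented policy rather than a loophole: the flow is a standard, auditable OAuth extension, not credential sharing.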
Anthropic’s pay-as-you-go API is the right path if you specifically need Claude. Keys come from console.anthropic.com; Claude Opus runs $15 per million input tokens and $75 per million output tokens, with Sonnet and Haiku considerably cheaper. For modest workloads, the per-token cost can actually come in below a Max subscription. For heavy agentic use, it gets expensive quickly — which is the original problem, now priced honestly. At least you can set hard spending limits in the console.
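Beyond the console’s limits, you can also enforce a budget client-side. A hypothetical guard, sketched SDK-agnostically: `client` here is any object exposing `messages.create(...)` (in practice, `anthropic.Anthropic()` from the official SDK), and the model name and rates are assumptions taken from this article’s figures, not current published pricing:

```python
# Hypothetical client-side budget guard for pay-as-you-go use.
# Rates below are the Opus figures from this article; verify current pricing.

IN_RATE, OUT_RATE = 15.0, 75.0  # USD per million input / output tokens

class Budget:
    """Tracks cumulative spend and raises once a hard limit is crossed."""
    def __init__(self, limit_usd: float):
        self.limit_usd = limit_usd
        self.spent_usd = 0.0

    def charge(self, input_tokens: int, output_tokens: int) -> float:
        cost = input_tokens / 1e6 * IN_RATE + output_tokens / 1e6 * OUT_RATE
        self.spent_usd += cost
        if self.spent_usd > self.limit_usd:
            raise RuntimeError(f"budget exceeded: ${self.spent_usd:.2f} spent")
        return cost

def ask(client, budget: Budget, prompt: str, model: str = "claude-opus-4") -> str:
    # model name is an assumption; pick one from the console's model list
    resp = client.messages.create(model=model, max_tokens=1024,
                                  messages=[{"role": "user", "content": prompt}])
    budget.charge(resp.usage.input_tokens, resp.usage.output_tokens)
    return resp.content[0].text
```

An agent loop that calls `ask` with a shared `Budget` stops itself the moment spend crosses the line, instead of discovering the bill later.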
Ollama with local models is the option for anyone who wants to cut cloud costs entirely. Ollama runs an OpenAI-compatible API at localhost:11434, which means OpenClaw and OpenCode slot in with a single config change and no code modifications. The best local options right now:

- Qwen2.5-Coder-32B rivals GPT-4o on coding benchmarks and offers a 128K context window.
- DeepSeek Coder V2 uses a MoE architecture that’s fast and strong on multi-file tasks.
- DeepSeek V3 handles complex reasoning and open-ended problem solving well.

Hardware reality: 16GB RAM for 7B models, 32GB or more for larger ones, and a decent GPU (NVIDIA RTX 3080 or better) or Apple Silicon to get useful speed. It takes more effort than pasting an API key, and local models still trail frontier cloud models on complex multi-step reasoning — but for high-volume or privacy-sensitive workloads, the economics are hard to beat.
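To show how little glue the OpenAI-compatible endpoint needs, here is a stdlib-only sketch of a chat call against a local Ollama instance. The model tag assumes you have already pulled a model (e.g. `ollama pull qwen2.5-coder:32b`); swap in whatever you run locally:

```python
# Minimal sketch: talking to Ollama's OpenAI-compatible chat endpoint with
# only the Python standard library. Assumes Ollama is running locally and
# the named model has been pulled.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/v1/chat/completions"

def build_request(prompt: str, model: str = "qwen2.5-coder:32b") -> urllib.request.Request:
    """Build the POST request; same JSON shape an OpenAI client would send."""
    payload = {"model": model, "messages": [{"role": "user", "content": prompt}]}
    return urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

def chat(prompt: str) -> str:
    """Send the request and return the assistant's reply text."""
    with urllib.request.urlopen(build_request(prompt)) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Because the request shape matches the OpenAI API, tools that already speak that protocol only need their base URL pointed at `localhost:11434/v1` — which is the “single config change” mentioned above.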
Gemini’s free API tier is worth knowing about, but manage expectations. It’s a different product from the paid subscription OAuth that Google is now banning — the free tier isn’t affected. However, Google cut free-tier quotas by 50–80% in December 2025 after admitting they’d “inadvertently” left generous limits running too long. You’re looking at roughly 5–15 requests per minute and 100–1,000 requests per day. Fine for experimenting, not for autonomous agents running all day.
Side-by-Side
| Option | Cost | Privacy | Hardware | Best for |
|---|---|---|---|---|
| OpenAI Codex OAuth | ChatGPT subscription | Sent to OpenAI | None | Fastest drop-in swap |
| Anthropic API | Pay-per-token | Sent to Anthropic | None | Must-have-Claude workloads |
| Ollama + local LLMs | Free after hardware | Fully local | 16GB+ RAM, GPU recommended | High-volume or privacy-sensitive use |
| Gemini free tier | Free | Sent to Google | None | Light experimentation only |
The Bottom Line
OpenClaw isn’t dead — it just needs a different engine. The same goes for OpenCode and every other tool that was running on subscription credentials. Two of the three big providers have drawn the line, and the third has a clear business interest in keeping the party going for now.
Pick the option that matches your actual workload and budget. If the volume is low, Codex OAuth is the smoothest transition. If you need Claude specifically, pay-as-you-go is the honest answer. If the volume is high and privacy matters, running local models on your own hardware is worth the setup time.
If you’re evaluating AI tooling for your organization — LLM integrations, developer toolchains, internal automation — get in touch. We help IT teams cut through the noise and build something that actually fits.