
What 1 Billion Daily 402 Responses Tell Us About AI Crawling

Cloudflare is sending over a billion "Payment Required" responses to AI bots every day. Most publishers have no idea what that number means for them.

Presenc AI Research Team

February 20, 2026
5 min read

One billion. That is the number of HTTP 402 "Payment Required" responses Cloudflare sends to AI crawlers every single day. I keep staring at it.

To put that in perspective, there are about 200 million active websites on the internet. Cloudflare powers roughly 20% of them. And from that slice alone, AI bots are hitting paywalls a billion times a day. Every day.

If you publish content on the web and you have not thought about what this number means for you, now would be a good time.

A quick primer on HTTP 402

HTTP status code 402 has been in the HTTP specification since 1997. It was marked "reserved for future use," which in standards-committee language means "we think someone will need this eventually but we have no idea who." For nearly three decades, nobody built a widely adopted implementation. Payment infrastructure was too slow, too expensive, and too clunky to work at HTTP speed.

Cloudflare changed that with their Pay Per Crawl system, which launched in private beta in July 2025. The idea is simple: when an AI bot requests a page, the server can return a 402 instead of the content. That 402 says: "This costs money. Here is the price. Pay and you get the content."
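To make that exchange concrete, here is a minimal sketch of the handshake as a Cloudflare Worker. The bot list, the price, and the header names (`x-crawl-price`, `x-payment`) are placeholders for illustration, not the actual Pay Per Crawl implementation.

```typescript
// Minimal sketch of the 402 handshake as a Cloudflare Worker.
// The bot check, price, and header names are illustrative placeholders.

const AI_BOTS = ["GPTBot", "ClaudeBot", "PerplexityBot"];
const PRICE_PER_PAGE_USD = "0.05";

export default {
  async fetch(request: Request): Promise<Response> {
    const ua = request.headers.get("user-agent") ?? "";
    const isAiBot = AI_BOTS.some((bot) => ua.includes(bot));

    // Humans and paying bots get the content as usual.
    if (!isAiBot || request.headers.has("x-payment")) {
      return fetch(request); // pass through to the origin
    }

    // AI bots without payment get the price instead of the page.
    return new Response("Payment required to crawl this page.", {
      status: 402,
      headers: { "x-crawl-price": PRICE_PER_PAGE_USD },
    });
  },
};
```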

A billion of these responses per day means AI bots are asking for content a billion times and getting told to pay. Whether they actually pay is a different question. Most do not. But the demand is there, and it is enormous.

What the bots are actually doing

We already know from crawl ratio data that AI bots take far more than they give back. Anthropic's ClaudeBot crawls about 38,000 pages for every single referral visitor it sends to the publisher. GPTBot's ratio is better but still extreme: about 1,091 pages crawled for every visitor it sends back.

Old-school search engines crawled about 14 pages per visitor. That was the deal. Crawl my content, index it, send me traffic. Not a perfect exchange, but workable. The new math does not work. The bots take the content, summarize it inside their own interfaces, and the user never visits the source.

Those billion daily 402 responses are the first real attempt to put a gate on that flow. Before Pay Per Crawl, publishers had two choices: let the bots in or block them entirely with robots.txt. Blocking means your content disappears from AI responses altogether. Not great if AI is becoming how people find information.

The gap between demand and revenue

Here is the part that gets me. A billion requests per day should translate into real money for publishers. If even 10% of those requests converted to payments at $0.05 per page, that would be $5 million a day flowing to publishers. $1.8 billion a year.
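Spelled out, the back-of-the-envelope math looks like this. The 10% conversion rate and $0.05 price are assumptions for illustration, not observed numbers.

```typescript
// Back-of-the-envelope revenue math from the paragraph above.
const requestsPerDay = 1_000_000_000; // ~1B daily 402 responses
const conversionRate = 0.10;          // assume 10% of requests get paid
const pricePerPage = 0.05;            // assume $0.05 per crawled page

const dailyRevenue = requestsPerDay * conversionRate * pricePerPage;
console.log(dailyRevenue);       // 5,000,000  -> $5M per day
console.log(dailyRevenue * 365); // 1,825,000,000 -> ~$1.8B per year
```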

The actual number is nowhere close to that. Pay Per Crawl is still in private beta. Most publishers have not enabled it. Most AI bots do not support payments yet. The infrastructure exists but adoption has barely started.

Meanwhile, AI content licensing deals are happening at the top end. Big publishers are cutting deals worth hundreds of millions with OpenAI, Google, and others. Over $2.9 billion has been committed in licensing fees by early 2025. But those deals only reach a handful of major publishers. The long tail, the thousands of mid-size and small publishers whose content AI bots crawl constantly, gets nothing.

What publishers are getting wrong

The publishers who have enabled crawl pricing are mostly doing it blind. They pick a flat rate, maybe $0.05 or $0.10 per page, and apply it across their entire domain. Same price for a breaking news article that AI engines cite constantly and a press release from 2019 that nobody references. Same price for GPTBot, which sends almost no referral traffic, and PerplexityBot, which actually links back to sources.

That is like charging the same price for a Super Bowl ad and a classified ad in a community newsletter. Both are "ads." They are not the same thing.

The problem is data. Publishers do not know which of their pages AI engines actually cite. They do not know which bots convert crawls into citations and which just absorb content silently. Without that information, pricing is guesswork.

What would change the math

The billion-402 number tells us the demand side is real. Bots want the content. The supply side, publishers making that content available at a price, is where things break down.

Three things need to happen for publishers to actually capture value from this demand:

First, per-page pricing. Not every page on your site has the same value to AI. A comprehensive medical guide that gets cited in ChatGPT responses weekly is worth more than a generic about page. Pricing should reflect that.

Second, per-bot pricing. Different bots behave differently. Some send referral traffic. Some cite your content with a link. Some just absorb it and never mention you. A bot that sends value back should pay less than one that only takes.

Third, crawl-to-citation intelligence. Publishers need to know: when GPTBot crawls my health section 50 times, how often does that result in a ChatGPT citation? If the answer is frequently, that content is worth more. If the answer is never, maybe charge less or block that specific bot.
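Put those three inputs together and pricing stops being a single flat number. Here is a rough sketch of what that logic might look like. The base rate, multipliers, and bot behavior assumptions are invented for illustration, not recommendations.

```typescript
// Illustrative per-page, per-bot pricing based on observed behavior.
// All rates, multipliers, and thresholds here are invented for this sketch.

interface PageStats {
  citationsPerWeek: number; // how often AI answers cite this page
  crawlsPerWeek: number;    // how often bots request it
}

const BOT_MULTIPLIERS: Record<string, number> = {
  PerplexityBot: 0.5, // links back to sources -> discount
  GPTBot: 1.5,        // little referral traffic -> premium
  ClaudeBot: 2.0,     // extreme crawl-to-referral ratio -> higher premium
};

function priceForCrawl(bot: string, page: PageStats): number {
  const base = 0.05; // flat baseline in USD
  // Pages that actually show up in AI answers are worth more.
  const citationRate =
    page.crawlsPerWeek > 0 ? page.citationsPerWeek / page.crawlsPerWeek : 0;
  const pageFactor = 1 + Math.min(citationRate * 10, 4); // cap at 5x base
  const botFactor = BOT_MULTIPLIERS[bot] ?? 1;
  return base * pageFactor * botFactor;
}

// A frequently cited health guide costs GPTBot more than a page nobody cites.
console.log(priceForCrawl("GPTBot", { citationsPerWeek: 12, crawlsPerWeek: 50 })); // ~0.26
console.log(priceForCrawl("GPTBot", { citationsPerWeek: 0, crawlsPerWeek: 3 }));   // ~0.08
```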

Where micropayments come in

Traditional payment processors choke on transactions below about $0.50. The fees eat the payment. When you are charging pennies per page crawl across millions of requests, credit card rails do not work.
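To see why, assume a typical card fee of around $0.30 plus 2.9% (an assumption about card pricing, not a quoted rate) against a $0.05 crawl payment:

```typescript
// Why card rails break down for per-page pricing: the fee dwarfs the payment.
const payment = 0.05;                    // $0.05 per crawled page
const cardFee = 0.30 + payment * 0.029;  // assumed fee: ~$0.30
console.log(cardFee / payment);          // ~6 -> the fee is roughly 6x the payment
```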

The x402 protocol, built by Coinbase and Cloudflare, solves this with stablecoin micropayments. Fractions of a cent per transaction. Over 100 million payments processed already, with an annualized run rate approaching $600 million. Settlement happens in seconds. No accounts, no invoices, no human involvement.

This is what makes per-page, per-bot, dynamic pricing technically possible. Without micropayment rails, you are stuck with flat monthly licensing deals or per-domain pricing. With them, every single crawl request can carry a different price.
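From the bot's side, the flow is: request the page, get a 402 with a price, settle the payment, retry with proof attached. Here is a minimal sketch, with the payment step and header names stubbed out since they depend on the specific x402 implementation.

```typescript
// Sketch of the crawler side of a pay-per-crawl exchange.
// payForCrawl() and the header names are stand-ins; a real x402 flow
// settles a stablecoin payment and attaches a proof of payment.

declare function payForCrawl(priceUsd: string): Promise<string>; // returns payment proof

async function fetchWithPayment(url: string): Promise<string> {
  const first = await fetch(url);
  if (first.status !== 402) return first.text();

  // The 402 carries the price; pay it and retry with proof attached.
  const price = first.headers.get("x-crawl-price") ?? "0";
  const proof = await payForCrawl(price);

  const retry = await fetch(url, { headers: { "x-payment": proof } });
  if (!retry.ok) throw new Error(`Still blocked: ${retry.status}`);
  return retry.text();
}
```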

What this means right now

If you run a publishing operation, even a modest one, the billion-402 number should be on your radar. Your content is being crawled by AI bots. Probably a lot. Probably more than you think. And right now, you are almost certainly getting paid nothing for it.

The infrastructure to change that is being built. Cloudflare has the access control layer. x402 has the payment rails. What is still missing is the intelligence layer that tells you what your content is worth to AI and helps you price it accordingly. That is what we are working on at Presenc AI.

A billion daily 402 responses is a billion daily proof points that AI wants publisher content. The question is whether publishers will capture any of that value, or just watch it flow past.

The short version

AI bots are hitting publisher paywalls over a billion times a day. Most publishers have not turned on crawl pricing. Those who have are pricing blind: the same flat rate for every page and every bot. Per-page intelligence and micropayment rails exist now to change that. The publishers who figure this out first will have a real advantage.

Find out what AI crawlers are doing on your site.

Presenc AI tracks crawl behavior, citation outcomes, and content value across AI platforms. Start with a free audit.
