What is AI traffic & how can you track it?

Multiple Authors

Sep 29, 2025

TL;DR

  • AI traffic refers to bots & crawlers like GPTBot or ClaudeBot consuming publisher content to fuel AI assistants.

  • Google Analytics is blind to these bots. Their visits don’t behave like human sessions, or like the traditional bots that try to pass themselves off as human.

  • It’s already costing publishers over $240M in bandwidth & close to $47 billion in missed advertising revenue.

  • Tracking it is step one to protecting margins and unlocking new revenue opportunities in the agentic web.

The Invisible Audience Problem

Talk to a publisher (website owner) today and you’ll hear the same story: traffic from Google & Facebook is softening, revenues from advertising are falling, but the bills for hosting and bandwidth are rising.

Something doesn’t add up.

Less traffic from key sources.

Less advertising being delivered on site.

But higher costs for hosting traffic.

The answer: there are more bots on the internet than ever before.

Not just any bots, but AI bots. GPTBot, ClaudeBot, PerplexityBot, Google-Extended; they’re all sweeping across publisher sites every day. They’re pulling down articles, indexing content, and feeding it into models that millions of people now use as their default source of information.

The problem is that this “invisible audience” never shows up in your dashboards. Google Analytics doesn’t count them, because crawlers never execute the client-side tag that analytics relies on. Adobe won’t show you their journeys. And yet, you’re paying to serve them, and in many cases, they’re influencing consumer decisions in ways you can’t measure.

Why AI Traffic Matters

The instinct might be to shrug this off as a technical footnote, something for your infrastructure team to worry about. But AI traffic is a strategic issue & opportunity for your business.

Every request from an AI bot is a page load you’re paying for without the benefit of an ad impression. When Cloudflare says that more than 50% of global traffic is automated, that translates into millions of “invisible visits” for even a mid-sized publisher.
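To see how fast that adds up, here’s a rough back-of-envelope sketch in Python. Every input below (request volume, page weight, egress price) is an illustrative assumption rather than a figure from this article; swap in the numbers from your own logs and CDN invoice.

```python
# Back-of-envelope estimate of what "invisible" AI bot traffic costs in bandwidth.
# All three inputs are illustrative assumptions -- replace them with figures
# from your own logs and CDN invoice.
bot_requests_per_month = 5_000_000   # assumed AI-bot page loads per month
avg_page_size_mb = 2.5               # assumed average page weight in MB
bandwidth_cost_per_gb = 0.08         # assumed egress price in USD per GB

bandwidth_gb = bot_requests_per_month * avg_page_size_mb / 1000
monthly_cost = bandwidth_gb * bandwidth_cost_per_gb

print(f"~{bandwidth_gb:,.0f} GB served to AI bots, roughly ${monthly_cost:,.0f} per month")
```

And none of it earns an ad impression back.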

That’s not background noise.

It’s eating away at your margin,

at your advertising business,

and it shapes how you get seen by a new wave of consumers.

Your content is being surfaced in AI answers, sometimes prominently, sometimes not at all, and you have no way of knowing when or why. For years, publishers built entire businesses around optimising for Google search visibility. Now, a similar shift is happening inside AI assistants, only this time the feedback loop is broken. You can’t improve what you can’t measure.

And then there’s the bigger question of value. If your content is shaping what people hear from AI platforms but the click never comes back, who benefits? The audience is real, but the economics are flowing elsewhere.

Making the Invisible Visible

Right now, most publishers only have blunt tools to work with:

  1. Server logs will show you a trail of requests from user agents like GPTBot or ClaudeBot; see the sketch after this list. Sometimes the bots identify themselves, sometimes they don’t.

  2. CDNs and security providers can segment out automated traffic, but they rarely distinguish between AI crawlers and other types of bots. You can spot unusual spikes, hundreds of hits in a short window that bounce within a millisecond, but that’s more guesswork than strategy.
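To make the first option concrete, here’s a minimal sketch that scans a combined-format access log and counts requests per known AI crawler. The log path, the regex, and the signature list are assumptions to adapt to your own stack; a production setup would more likely lean on your CDN or log pipeline than a one-off script.

```python
import re
from collections import Counter

# Substrings that identify the AI crawlers named in this article; extend the
# list with whatever user agents show up in your own logs.
AI_BOT_SIGNATURES = ["GPTBot", "ClaudeBot", "PerplexityBot", "Google-Extended"]

# In the common "combined" log format the user agent is the last quoted field.
# Real log formats vary, so treat this pattern as an assumption.
LOG_PATTERN = re.compile(r'"([^"]*)"\s*$')

def count_ai_hits(log_path: str) -> Counter:
    """Count requests per AI bot in a combined-format access log."""
    hits = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            match = LOG_PATTERN.search(line)
            if not match:
                continue
            user_agent = match.group(1)
            for bot in AI_BOT_SIGNATURES:
                if bot in user_agent:
                    hits[bot] += 1
                    break
    return hits

if __name__ == "__main__":
    # "access.log" is a placeholder path; point it at your real server logs.
    for bot, count in count_ai_hits("access.log").most_common():
        print(f"{bot}: {count:,} requests")
```

Even a crude count like this is enough to show whether AI crawlers are a rounding error or a material share of the requests you pay to serve.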

The point isn’t that these methods are perfect. It’s that even the imperfect signals already tell a clear story: AI bots are here, they’re active, and they’re large enough to move the numbers in your business. The real challenge is to get beyond spotting patterns and into proper analytics. Publishers need to see, in plain terms, who’s consuming their content, at what scale, and to what effect.

Why This Can’t Be Ignored

We’ve seen this movie before.

When SEO first emerged, many publishers dismissed it as a distraction. Within a few years, it became one of the most important levers in digital media. The same shift is happening now with AI. The difference is that this time, publishers don’t have the luxury of waiting for years to adapt.

The “agentic web” is already here. More people are starting their journey with AI assistants instead of search engines. The traffic that used to drive your advertising business is being displaced. And the first step toward adapting isn’t a commercial negotiation, or a licensing deal, or a product rethink.

It’s measurement.

Because if you can’t see this audience, you can’t protect your margins, and you certainly can’t capture the opportunity.

Final Thought

AI bots are already one of your biggest audiences.

The tragedy is that for most publishers, they’re invisible. But invisibility doesn’t make them harmless.

It makes them dangerous.

The publishers who act now, who start measuring AI traffic with the same energy they once applied to search, will be the ones in control of their future. Everyone else will simply be feeding the machine.