AI Runs on Trusted Data — And Publishers Hold the Keys to Its Future
Artificial intelligence is reshaping how people seek and consume information. But AI runs on trusted data, and trusted data originates with publishers.
Why the Interface Must Come Before the Economics
Gary Newcomb
CTO & Co-Founder, FetchRight
There is growing urgency among publishers to answer a very reasonable question:
How do we get paid for AI?
Licensing deals. Blocking crawlers. Paywalls for model access. The conversation is quickly converging on monetization.
And yet, that focus may be misplaced.
Not because monetization doesn't matter. It absolutely does. But because we haven't yet solved the more fundamental problem: how AI systems interact with content in the first place.
Right now, we are trying to price something that has not been properly adapted to its new consumer.
A useful analogy is the shift to mobile.
When mobile traffic began to dominate, many companies initially resisted. They preserved desktop experiences, layered in ads, and tried to maintain existing monetization strategies. But those experiences did not translate. Pages were slow. Interfaces broke. Conversion dropped.
The companies that won were not the ones that protected pricing.
They were the ones that rebuilt the experience for the new form factor. Faster, simpler, purpose-built for how users actually interacted.
The lesson was not about monetization.
It was about fit.
AI systems are not just another traffic source. They are not users in any traditional sense.
All of the mechanisms publishers have spent years refining, including ad placement, engagement loops, behavioral targeting, and conversion funnels, are optimized for human attention.
AI does not have attention.
It has input requirements.
It consumes structured data, explicit context, and efficient representations of meaning. When we present it with a full web page, we are not delivering a product. We are delivering a container full of artifacts that were designed for an entirely different audience.
To compensate for this mismatch, we have built an entire layer of infrastructure.
AI systems scrape pages, strip out layout, attempt to isolate content, chunk it, embed it, retrieve it, and reconstruct meaning. This process is now so normalized that it is rarely questioned.
But it should be.
Because every step in that pipeline exists to recover structure and intent that were lost at the source.
This has real consequences. It increases token usage, drives up compute costs, adds latency, and introduces failure points. More importantly, it degrades fidelity. The further meaning is reconstructed downstream, the more it drifts from how it was originally expressed.
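To make the reconstruction pipeline concrete, here is a minimal sketch of its first two steps, scraping and fixed-size chunking, using only the Python standard library. The page content and the 40-character window are illustrative assumptions; the point is how much structure the flattening step discards.

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Naively strips markup, discarding headings, emphasis, and layout cues."""
    def __init__(self):
        super().__init__()
        self.parts = []

    def handle_data(self, data):
        self.parts.append(data)

def scrape_and_chunk(html, chunk_size=40):
    """Flatten a page to raw text, then split it into fixed-size windows.

    Chunk boundaries ignore sentences and sections, so downstream
    retrieval must guess at structure the source once made explicit.
    """
    parser = TextExtractor()
    parser.feed(html)
    text = " ".join(" ".join(parser.parts).split())
    return [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]

page = "<html><body><h1>Q3 Results</h1><p>Revenue rose 12% on subscriptions.</p></body></html>"
chunks = scrape_and_chunk(page)
# The <h1>/<p> distinction is gone; headline and body text blur together,
# and every later stage (embedding, retrieval) pays to guess it back.
```

Everything after this point in a real pipeline, embedding, retrieval, and synthesis, is spent recovering what the first two lines of output already lost.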
And yet, on top of all of this, we are trying to layer monetization.
Blocking AI crawlers or restricting access can feel like a way to preserve value.
In reality, it often accelerates irrelevance.
AI systems are already becoming a primary layer of discovery. They determine which sources are referenced, how brands are represented, and which content is surfaced in answers. If publishers are not present in that ecosystem, not because their content lacks value but because it is not accessible in a usable form, they are not being protected.
They are being excluded.
This is not a theoretical risk. It is already happening.
The core issue is not willingness to pay. It is that the product itself has not evolved.
Publishers are still offering web pages designed for human consumption, while AI systems require something fundamentally different: structured, context-aware, machine-readable representations of content.
Trying to monetize without addressing that gap is like insisting on selling a desktop experience in a mobile-first world.
Or more simply, trying to sell outdated goods to a new type of customer.
AI does not need a better price.
It needs a better interface.
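What such an interface might look like can be sketched briefly. The field names and shape below are purely illustrative assumptions, not an existing standard, but they show the difference: intent, context, and licensing terms are carried explicitly rather than reconstructed downstream.

```python
import json

# A hypothetical machine-facing representation of an article. Every field
# name here is an illustrative assumption, not a published schema.
article = {
    "headline": "Q3 Results",
    "summary": "Revenue rose 12% on subscription growth.",
    "body": [
        # Roles make intent explicit instead of leaving it to be inferred.
        {"role": "claim", "text": "Revenue rose 12% in Q3."},
        {"role": "context", "text": "Growth was driven by subscriptions."},
    ],
    "published": "2025-01-15",
    "license": "reference-with-attribution",
}

payload = json.dumps(article)
# An AI consumer parses structure directly: no scraping, no chunking,
# no guessing which sentence was the headline.
restored = json.loads(payload)
```

The specific schema matters less than the principle: meaning is delivered in the form the consumer needs, and the license travels with the content.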
There is a deeper strategic shift underway, one that mirrors the early days of search.
In that era, the winners were not the ones who resisted indexing. They were the ones who understood how to structure content so it could be discovered, interpreted, and ranked effectively.
We are entering a similar phase with generative systems.
Call it AEO, GEO, or something else entirely. The underlying dynamic is the same. AI systems are deciding which sources to trust and how information is synthesized. They are shaping visibility in ways that are both more abstract and more powerful than traditional search.
If your content is not optimized for that layer (not just accessible, but usable) then it is not participating in the system that is increasingly driving discovery.
And if you are not visible, there is nothing left to monetize.
Before monetization can be effective, the interaction model has to evolve.
Content needs to move closer to its source of truth in how it is structured and delivered. It needs to preserve intent, context, and meaning in ways that machines can consume directly, without reconstructing it downstream.
When that happens, several things change: token usage falls, latency drops, failure points disappear, and fidelity is preserved at the source rather than reconstructed downstream.
At that point, monetization is no longer speculative.
It becomes a negotiation grounded in utility.
The industry is right to focus on economics.
But economics follow structure.
Until we address how AI systems access and consume content, we will continue to optimize around inefficiency and struggle to capture the value we know exists.
We do not yet have a pricing problem.
We have an interface problem.
Fix that first.
Everything else follows.