213: WHY GOOGLE & CHATGPT ARE IGNORING YOUR "DEAD" CONTENT WITH JONO ALDERSON

JONO ALDERSON
Independent Consultant
Technical SEO specialist and former Head of SEO at Yoast, now an independent consultant focused on how AI systems evaluate web content.
The threshold is gone. For two decades, digital marketing operated on a single premise: interrupt humans, drag them to your URL, and control the message. But in 2026, AI agents have fundamentally broken this model. They've already absorbed every generic blog post, every commodity piece of content, leaving most websites as functional corpses, technically online but invisible to the systems that now intermediate the web.
Jono Alderson, a technical SEO specialist who has been mapping this shift, argues that the situation is even more dire than it appears. These AI systems have developed what functions as an immune response, treating persuasive copy and marketing fluff as noise to be filtered out. The gap between what brands claim and what Reddit reviews reveal becomes an instant credibility penalty. The path forward isn't better meta tags or more content. It's upstream engineering: fixing your return policy, your customer service, your actual product, because machines can now see through the performance.
KEY TAKEAWAYS
- Stop producing commodity content. If an LLM has already memorized the facts you're publishing, your content has zero value to these systems.
- Audit for incoherence. The gap between your marketing claims and your Reddit reviews, Glassdoor ratings, and customer complaints is now a measurable credibility penalty.
- Invest in upstream engineering. Fixing your return policy, shipping speed, and customer service will impact your AI visibility more than any technical SEO work.
- Your brand is now an aggregation of everything said about you across the entire web, including that 2013 microsite you forgot about and YouTube videos of your billboards.
- Technical bloat is becoming fatal. AI agents won't rebuild Google's tolerant rendering infrastructure, so React-heavy single-page apps that only assemble their content client-side will increasingly be invisible to agents that don't execute JavaScript.
SHOW NOTES
The Threshold Has Collapsed
Every marketing strategy for the past century operated on one assumption: get humans across a threshold, whether a physical store entrance or a digital URL, and you control the message. Once they're on your site, you own the experience. You can shape the narrative, run CRO experiments, squeeze value. That era ended quietly, and most marketers haven't noticed yet.
AI agents now intermediate this relationship. They make decisions about what information reaches users without ever sending those users to your website. The whole discipline of conversion optimization assumed you'd have access to the audience. What happens when you don't?
Your Brand Is Everywhere You Forgot About
The concept of a website as a neat bundle of pages on a domain made sense for human consumption. But when ChatGPT trains on the entire web corpus, it doesn't understand that your 2013 Christmas microsite on a separate URL is no longer relevant. It doesn't know your blog subdomain is a separate entity you barely maintain.
Your brand, to these systems, is an aggregation of every statement ever made about you. That includes Reddit threads, YouTube videos where someone drove past your billboard, Glassdoor reviews from disgruntled employees, and the product listing you forgot to update. The smallest unit of consumption is no longer a URL. The boundaries we drew around our digital presence have dissolved.
This creates an attribution nightmare. The measurement frameworks marketers relied on, tying impressions to clicks to conversions, were always somewhat fictional. Now they're completely inadequate. Old-school advertising metrics like salience studies and panel surveys might be all that's left.
The Zombie Web
Consider every dentist website on the planet. They all have the same twelve pages, the same blog posts about teeth whitening, the same generic content. None of it ever said anything new. LLMs have already extracted every piece of value from this commodity content. It's been memorized, synthesized, and compressed into the model's knowledge.
What's left? A husk that functions as a digital brochure for the rare human who types in a URL directly. But as an input to AI systems, these sites are worthless. Why would Google or OpenAI send traffic to content that solves no new problems?
The Machine Immune System
Marketing copy is designed to persuade. It says "we have the best product" when the product is average. It claims "we love our customers" while support tickets go unanswered for weeks. This worked when humans were the primary audience, easily distracted and susceptible to emotional appeals.
AI systems process this differently. When a model encounters persuasive language that conflicts with signals from reviews, forums, and other sources, it treats that marketing copy as noise, potentially even as a hallucination to be filtered out. The immune response activates.
The incoherence penalty is real. Machines can spot the gap between your homepage claims and your three-star average on consumer review sites. They aggregate sentiment across sources you don't control and probably don't monitor. That dissonance becomes a credibility penalty applied to everything you publish.
Why llms.txt Won't Save You
Some have proposed creating separate, agent-friendly versions of websites, served as clean markdown files that give AI systems exactly what they need. The problem is obvious once stated: why would a machine trust a special file that differs from your public-facing content?
If your website says one thing and your llms.txt says something cleaner, that's another incoherence signal. You're essentially asking machines to trust your curated summary over the messy reality they can observe elsewhere. They won't.
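For reference, here is a minimal sketch of the kind of file the llms.txt proposal describes: a plain markdown index served from the site root. The practice name, URLs, and link descriptions below are illustrative assumptions, not taken from the episode or from any real implementation.

```markdown
# Example Dental Practice

> Family dental clinic in Springfield offering checkups, whitening, and
> emergency appointments. Transparent pricing and a 30-day satisfaction policy.

## Services

- [Teeth whitening](https://example.com/whitening.md): procedure, pricing, and aftercare
- [Emergency appointments](https://example.com/emergency.md): same-day booking details

## Policies

- [Returns and refunds](https://example.com/refunds.md): full policy text
```

The format is deliberately simple, an index plus links to markdown versions of key pages, which is exactly why the trust question above matters more than the syntax.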
Upstream Engineering Is the New SEO
The optimization that matters now happens far from your website. It's your return policy, your shipping speed, your customer service response times, your product quality. These operational realities generate the signals that AI systems actually trust.
Title tags and meta descriptions still exist, but they're cosmetic compared to fixing the things customers complain about on Reddit. The engineers and operations teams who handle logistics, fulfillment, and support are now doing more for your search visibility than your SEO team. That's a difficult organizational conversation, but it's the reality of 2026. The brands that survive will be the ones that stopped optimizing their marketing and started optimizing their actual business.
QUESTIONS ANSWERED
What is the threshold in digital marketing and why is it disappearing?
The threshold refers to the traditional marketing approach of dragging humans from external channels onto your website where you can control the message and influence their behavior. This era is ending because AI agents now intermediate between users and websites, making decisions about what content to surface without allowing direct access to audiences. The concept of getting people to cross over to your controlled environment is becoming obsolete.
Why are AI systems developing an immune system against marketing content?
AI agents and LLMs are learning to filter out marketing fluff, sales copy, and persuasive language as noise because these elements don't provide unique value. When most websites say the same things using marketing speak rather than providing distinct insights, AI systems treat this repetitive promotional content like an allergen to be rejected. The machines have already extracted valuable information from commodity blog posts, leaving marketing-heavy sites with nothing new to offer.
What is upstream engineering and how does it help websites survive AI filtering?
Upstream engineering means fixing the actual customer experience, return policies, logistics, and core business operations before optimizing how you describe them online. Instead of tweaking meta tags or writing better copy, businesses need to address the gap between their marketing promises and reality. AI systems can detect inconsistencies between what companies claim and what customers actually experience in reviews and social media.
How has brand identity changed in the age of AI and LLMs?
Your brand is no longer just what appears on your website or domain. AI systems understand your brand as an aggregation of everything said about you across Reddit, YouTube, old microsites, social media, and every corner of the web that has been ingested into training data. This means companies must consider their entire digital footprint rather than focusing solely on their owned properties.
What elements of traditional website optimization still matter for AI agents?
Clarity is the primary element that remains important when AI agents browse websites. Fabricated urgency, psychological nudges, and conversion rate optimization tactics designed for humans become irrelevant to machines. AI systems need clear, straightforward information rather than persuasive language or manipulative design elements that were effective for human visitors.
Why are most websites becoming zombies in the AI era?
Many websites are technically online but functionally dead to AI systems because they never provided distinct value in the first place. They contain the same generic content as thousands of other sites in their industry, and AI models have already extracted all useful information from these commodity blog posts and templated pages. Without unique insights or value, these sites become hollow husks that AI systems have no reason to reference or recommend.
RELATED ARTICLES
SELLING TO AI: THE COMPLETE GUIDE TO AGENTIC COMMERCE
Checkout is becoming a protocol, not a page. Here's how the Agentic Commerce Protocol, the Universal Commerce Protocol, and Shared Payment Tokens are turning AI agents into buyers, and what it means for your business.
CLOUDFLARE NOW SERVES YOUR WEBSITE AS MARKDOWN TO AI AGENTS
Cloudflare's new Markdown for Agents feature converts HTML to markdown on the fly when AI agents request it. An 80% token reduction, built into the CDN layer. Here's what it means for your website.
HOW AI AGENTS SEE YOUR WEBSITE (AND HOW TO BUILD FOR THEM)
AI agents don't see your website the way humans do. They read the accessibility tree. Here's how the major platforms actually perceive web pages, what the research says, and how to make your website agent-ready.
ENJOYING THIS EPISODE?
No Hacks explores how to optimize websites for AI agents, with weekly episodes featuring SEOs, developers, and AI researchers. Subscribe on your favorite platform.