213: WHY GOOGLE & CHATGPT ARE IGNORING YOUR "DEAD" CONTENT WITH JONO ALDERSON

JONO ALDERSON
Independent Consultant
Technical SEO specialist and former Head of SEO at Yoast, now an independent consultant focused on how AI systems evaluate web content.
The threshold is gone. For two decades, digital marketing operated on a single premise: interrupt humans, drag them to your URL, and control the message. But in 2026, AI agents have fundamentally broken this model. They've already absorbed every generic blog post, every commodity piece of content, leaving most websites as functional corpses, technically online but invisible to the systems that now intermediate the web.
Jono Alderson, a technical SEO specialist who has been mapping this shift, argues that the situation is even more dire than it appears. These AI systems have developed what functions as an immune response, treating persuasive copy and marketing fluff as noise to be filtered out. The gap between what brands claim and what Reddit reviews reveal becomes an instant credibility penalty. The path forward isn't better meta tags or more content. It's upstream engineering: fixing your return policy, your customer service, your actual product, because machines can now see through the performance.
KEY TAKEAWAYS
- Stop producing commodity content. If an LLM has already memorized the facts you're publishing, your content has zero value to these systems.
- Audit for incoherence. The gap between your marketing claims and your Reddit reviews, Glassdoor ratings, and customer complaints is now a measurable credibility penalty.
- Invest in upstream engineering. Fixing your return policy, shipping speed, and customer service will impact your AI visibility more than any technical SEO work.
- Your brand is now an aggregation of everything said about you across the entire web, including that 2013 microsite you forgot about and YouTube videos of your billboards.
- Technical bloat is becoming fatal. AI agents won't rebuild Google's tolerant rendering infrastructure, so content locked inside React-heavy single-page apps will increasingly go unparsed.
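The rendering point above is easy to demonstrate. A crawler that doesn't execute JavaScript sees only the raw HTML, so a server-rendered page and an SPA shell look completely different to it. A minimal sketch (the page snippets and extractor are illustrative, not any particular agent's parser):

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text the way a non-rendering crawler would:
    raw HTML only, no JavaScript execution."""
    def __init__(self):
        super().__init__()
        self.skip = False   # True while inside <script>/<style>
        self.chunks = []
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip = True
    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self.skip = False
    def handle_data(self, data):
        if not self.skip and data.strip():
            self.chunks.append(data.strip())

def visible_text(html):
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.chunks)

# Server-rendered page: the content is in the HTML itself.
ssr_html = "<html><body><h1>Return policy</h1><p>Free returns within 60 days.</p></body></html>"

# SPA shell: the same content only appears after bundle.js runs.
spa_html = "<html><body><div id='root'></div><script src='/bundle.js'></script></body></html>"

print(visible_text(ssr_html))  # the policy text is visible to the agent
print(visible_text(spa_html))  # empty string: nothing to parse
```

Google spent years building headless rendering to tolerate exactly this; a lightweight agent fetching your page once has no reason to.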
SHOW NOTES
The Threshold Has Collapsed
Every marketing strategy for the past century operated on one assumption: get humans across a threshold, whether a physical store entrance or a digital URL, and you control the message. Once they're on your site, you own the experience. You can shape the narrative, run CRO experiments, squeeze value. That era ended quietly, and most marketers haven't noticed yet.
AI agents now intermediate this relationship. They make decisions about what information reaches users without ever sending those users to your website. The whole discipline of conversion optimization assumed you'd have access to the audience. What happens when you don't?
Your Brand Is Everywhere You Forgot About
The concept of a website as a neat bundle of pages on a domain made sense for human consumption. But when ChatGPT trains on the entire web corpus, it doesn't understand that your 2013 Christmas microsite on a separate URL is no longer relevant. It doesn't know your blog subdomain is a separate entity you barely maintain.
Your brand, to these systems, is an aggregation of every statement ever made about you. That includes Reddit threads, YouTube videos where someone drove past your billboard, Glassdoor reviews from disgruntled employees, and the product listing you forgot to update. The smallest unit of consumption is no longer a URL. The boundaries we drew around our digital presence have dissolved.
This creates an attribution nightmare. The measurement frameworks marketers relied on, tying impressions to clicks to conversions, were always somewhat fictional. Now they're completely inadequate. Old-school advertising metrics like salience studies and panel surveys might be all that's left.
The Zombie Web
Consider every dentist website on the planet: the same twelve pages, the same blog posts about teeth whitening, the same generic content. None of it ever says anything new. LLMs have already extracted every piece of value from this commodity content. It's been memorized, synthesized, and compressed into the model's knowledge.
What's left? A husk that functions as a digital brochure for the rare human who types in a URL directly. But as an input to AI systems, these sites are worthless. Why would Google or OpenAI send traffic to content that solves no new problems?
The Machine Immune System
Marketing copy is designed to persuade. It says "we have the best product" when the product is average. It claims "we love our customers" while support tickets go unanswered for weeks. This worked when humans were the primary audience, easily distracted and susceptible to emotional appeals.
AI systems process this differently. When a model encounters persuasive language that conflicts with signals from reviews, forums, and other sources, it treats that marketing copy as noise, potentially even as a hallucination to be filtered out. The immune response activates.
The incoherence penalty is real. Machines can spot the gap between your homepage claims and your three-star average on consumer review sites. They aggregate sentiment across sources you don't control and probably don't monitor. That dissonance becomes a credibility penalty applied to everything you publish.
Why llms.txt Won't Save You
Some have proposed creating separate, agent-friendly versions of websites. Clean markdown files that give AI systems exactly what they need. The problem is obvious once stated: why would a machine trust a special file that differs from your public-facing content?
If your website says one thing and your llms.txt says something cleaner, that's another incoherence signal. You're essentially asking machines to trust your curated summary over the messy reality they can observe elsewhere. They won't.
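For reference, the proposal under discussion (the llmstxt.org convention) is a markdown file served at the site root: an H1 with the site name, a blockquote summary, and sections of annotated links. A minimal hypothetical example:

```markdown
# Acme Dental

> Family dental practice. Same-day appointments, transparent pricing.

## Policies

- [Cancellation policy](https://example.com/cancellations.md): 24-hour notice, no fee
- [Pricing](https://example.com/pricing.md): full fee schedule
```

The format itself is harmless. The problem Jono identifies is independent of it: if this curated summary diverges from what reviews and forums say about you, it becomes one more incoherence signal rather than a fix.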
Upstream Engineering Is the New SEO
The optimization that matters now happens far from your website. It's your return policy, your shipping speed, your customer service response times, your product quality. These operational realities generate the signals that AI systems actually trust.
Title tags and meta descriptions still exist, but they're cosmetic compared to fixing why customers complain on Reddit. The engineers and operations teams who handle logistics, fulfillment, and support are now doing more for your search visibility than your SEO team. That's a difficult organizational conversation, but it's the reality of 2026. The brands that survive will be the ones that stopped optimizing their marketing and started optimizing their actual business.
WATCH ON YOUTUBE
RELATED ARTICLES
ANSWER ENGINE OPTIMIZATION: HOW TO GET YOUR CONTENT INTO AI RESPONSES
A practical guide to Answer Engine Optimization (AEO). How AI search engines parse content, what gets cited, and what Google, Microsoft, and OpenAI actually recommend.
FROM SEO AND CRO TO AGENTIC AI OPTIMIZATION (AAIO): WHY YOUR WEBSITE NEEDS TO SPEAK TO MACHINES
The evolution from SEO to AEO to AAIO, and why December 2025 marks the turning point for optimizing websites for AI agents.
THE AGENTIC BROWSER LANDSCAPE IN 2026: A COMPLETE GUIDE
The complete guide to AI-powered agentic browsers in 2026. Every browser, automation framework, and enterprise API from Chrome's auto browse to Claude for Chrome, and what they mean for your website.
ENJOYING THIS EPISODE?
No Hacks explores how to optimize websites for AI agents, with weekly episodes featuring SEOs, developers, and AI researchers. Subscribe on your favorite platform.
Subscribe Now