
THE FULLY NON-HUMAN WEB: NO ONE BUILDS THE PAGE, NO ONE VISITS IT

AUTHOR
Slobodan "Sani" Manic

No Hacks

CXL-certified conversion specialist and WordPress Core Contributor helping companies optimise websites for both humans and AI agents.

In January 2026, Google was granted patent US12536233B1. Six engineers are named on it, and it describes a system that scores a landing page on conversion rate, bounce rate, and design quality. If the page falls below a threshold, the system generates an AI replacement personalized to the searcher. The advertiser never sees it. Never approves it. Might not even know it happened.

As we discussed in episode 220, the debate around this patent has centered on scope: is it limited to shopping ads, or does it signal something broader? That's the wrong question.

The right question: what happens when you combine AI-generated pages with AI agents that browse, shop, and transact on behalf of humans?

For the first time, we have the infrastructure for a web where no human creates the page and no human visits it. Both sides can be non-human. That changes everything.


The Supply Side: AI-Generated Pages

The supply side of the web has always been human. Someone designs a page, writes copy, publishes it. Three developments are changing that.

Google's patent US12536233B1 is the most direct: score a landing page on conversion rate, bounce rate, and design quality, then replace underperforming pages with AI-generated versions. The replacement pages draw on the searcher's full search history, previous queries, click behavior, location, and device data. Google builds personalized landing pages no advertiser can match, because no advertiser has access to cross-query behavioral data at that scale. Barry Schwartz covered the patent on Search Engine Land, describing a system where Google could automatically create custom landing pages replacing organic results. Glenn Gabe called Google's AI landing page patent potentially more controversial than AI Overviews. Roger Montti at Search Engine Journal argued the patent's scope is limited to shopping and ads. Both camps agree: the technology to score and replace landing pages with AI exists and works.

NLWeb, Microsoft's open project, takes a different approach. NLWeb turns any website into a natural language interface using existing Schema.org markup and RSS feeds. An AI agent querying an NLWeb-enabled site doesn't load a page at all. The agent asks a structured question, NLWeb returns a structured answer. The rendered page becomes optional.
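The shape of that exchange can be sketched as follows. This is illustrative Python, not the official NLWeb API: the function, field names, and sample product are assumptions; the point is that the agent gets Schema.org-typed data back, never a rendered page.

```python
import json

def ask_site(question: str) -> dict:
    """Simulate an NLWeb-style structured answer: Schema.org items, no page load.

    A real agent would send the question to the site's NLWeb endpoint; this
    stand-in returns a canned response with the expected shape.
    """
    return {
        "query": question,
        "results": [
            {
                "@type": "Product",  # Schema.org type pulled from the site's markup
                "name": "Trail Runner 3",
                "offers": {"@type": "Offer", "price": "129.00", "priceCurrency": "USD"},
            }
        ],
    }

answer = ask_site("running shoes under $150")
print(json.dumps(answer, indent=2))
```

The agent parses typed objects; the site's visual layer never enters the exchange.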

WebMCP goes further still. With WebMCP, a website registers tools with defined input/output schemas that AI agents discover and call as functions. A product search becomes a function call. A checkout becomes an API request. WebMCP eliminates the "page" concept entirely, dissolving the web page as a unit of content into a set of callable capabilities.
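A tool definition of that kind might look like the sketch below. The schema style follows general MCP conventions (a name, a description, a JSON Schema for inputs); the tool name, fields, and stand-in search are hypothetical, not the WebMCP specification.

```python
# Hypothetical WebMCP-style tool: product search exposed as a callable function.
search_tool = {
    "name": "search_products",
    "description": "Search the catalog and return matching products.",
    "inputSchema": {
        "type": "object",
        "properties": {
            "query": {"type": "string"},
            "max_price": {"type": "number"},
        },
        "required": ["query"],
    },
}

def call_tool(tool: dict, args: dict) -> list:
    """Stand-in for the agent-side function call; a real site would run the search."""
    missing = set(tool["inputSchema"]["required"]) - set(args)
    if missing:
        raise ValueError(f"missing required args: {missing}")
    return [{"name": "Trail Runner 3", "price": 129.0}]

results = call_tool(search_tool, {"query": "running shoes", "max_price": 150})
```

The "page" here is gone entirely: the site's capability is the schema plus the function behind it.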

Each mechanism works differently, but the direction is the same: the page is becoming something generated, queried, or bypassed entirely. The human-designed, human-published web page is no longer the only way content reaches an audience.

The Demand Side: AI Agents as Visitors

The demand side shifted faster. In 2024, bots surpassed human traffic for the first time in a decade, accounting for 51% of all web activity. Cloudflare's data shows AI "user action" crawling (agents actively doing things, not just indexing) grew 15x during 2025. Gartner predicts 40% of enterprise applications will feature task-specific AI agents by end of 2026, up from less than 5% in 2025. The scale is hard to overstate.

Agentic browsers are the most visible shift. Chrome's auto browse turned 3 billion Chrome installations into potential AI agent launchpads. Google's Gemini scrolls, clicks, fills forms, and completes multi-step tasks autonomously inside Chrome. Perplexity's Comet browser conducts deep research across multiple sites simultaneously. Microsoft's Edge Copilot Mode handles multi-step workflows from within the browser sidebar. The full agentic browser landscape now includes over a dozen consumer and developer tools, all browsing on behalf of humans.

Commerce agents have moved past browsing into buying. OpenAI launched Instant Checkout to let users purchase products directly inside ChatGPT, powered by Stripe's Agentic Commerce Protocol (ACP). OpenAI killed the feature in March 2026 after near-zero purchase conversions and only a dozen merchant integrations out of over a million promised. The failure was execution, not concept: Alibaba's Qwen app processed 120 million orders in six days in February 2026 because Alibaba owns the AI model, the marketplace, the payment rails (Alipay), and the logistics. OpenAI tried to replicate agentic commerce without owning the stack. Google and Shopify's Universal Commerce Protocol (UCP) connects over 20 companies, including Walmart, Target, and Mastercard, in a framework designed for AI agents to handle commerce from product discovery through checkout. Shopify auto-opted over a million merchants into agentic shopping experiences with ChatGPT, Copilot, and Perplexity. The transaction happens in an AI conversation. No checkout page loads.

Agent-to-agent communication removes the human from both ends. Google's Agent-to-Agent (A2A) protocol lets AI agents from different vendors discover each other's capabilities and collaborate on tasks without human mediation. A travel planning agent negotiates directly with a booking agent. A procurement agent evaluates supplier agents across vendors. Over 150 organizations support A2A, including Salesforce, SAP, and PayPal, making agent-to-agent commerce and coordination a production reality.

When Both Sides Go Non-Human

Until now, one side of the web was always human. A person built the page, or a person visited it. Usually both.

Google's patent closes the circuit.

Here's what a complete non-human flow might look like. A user tells their AI assistant they need running shoes. The assistant queries product data through NLWeb or WebMCP, no page load needed. The assistant evaluates options by checking inventory across retailers via A2A. If the user needs to review a comparison, Google generates a landing page personalized to that specific user's search history and preferences. The assistant completes checkout through ACP or UCP using Shared Payment Tokens. The user receives a confirmation.
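The flow above can be compressed into a sketch. Every function here is a hypothetical stand-in for the protocol step named in its comment; nothing below is a real API, and the data is illustrative.

```python
def query_structured_data(intent):   # NLWeb / WebMCP: structured query, no page load
    return [{"sku": "TR3", "price": 129.0}, {"sku": "RX9", "price": 159.0}]

def check_inventory(item):           # A2A: ask a retailer's agent for stock
    return {**item, "in_stock": True}

def checkout(item):                  # ACP / UCP: pay via a shared payment token
    return {"sku": item["sku"], "status": "confirmed"}

def fulfill_intent(intent, budget):
    offers = [check_inventory(i) for i in query_structured_data(intent)]
    choice = min(
        (o for o in offers if o["in_stock"] and o["price"] <= budget),
        key=lambda o: o["price"],
    )
    return checkout(choice)          # the human only states intent and sees this receipt

receipt = fulfill_intent("running shoes", budget=150)
```

Everything between stating the intent and receiving the receipt runs machine-to-machine.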

The human's role in that entire flow: stating intent and approving the purchase. Discovery, page generation, product evaluation, and transaction completion are all handled by AI systems. The human touches only the two endpoints of the chain.

Every piece of technology in that chain exists in production today. Chrome auto browse is live for 3 billion Chrome users. A2A has 150+ organizational supporters. ACP underpins Stripe's agentic commerce infrastructure (ChatGPT's Instant Checkout failed on execution, not protocol). UCP connects Shopify, Google, Walmart, and Target. Patent US12536233B1 is granted. No single company has assembled the full loop yet, but every component is operational.

Who's Building the Non-Human Web

Here's where it gets interesting. Map out who's building what, and a pattern emerges:

Layer                | What                   | Who
Page generation      | AI landing pages       | Google
Content-as-API       | WebMCP, NLWeb          | Google, Microsoft
Agent infrastructure | MCP, A2A               | Anthropic, Google
Agent browsers       | Chrome, Comet, Copilot | Google, Perplexity, Microsoft
Agent commerce       | ACP, UCP               | Stripe + OpenAI, Shopify + Google
Edge delivery        | Markdown for Agents    | Cloudflare

Google appears in five of six layers: page generation (patent US12536233B1), content-as-API (WebMCP), agent infrastructure (A2A), agent browsers (Chrome auto browse), and commerce (UCP). Google is positioning itself to mediate the non-human web the same way Google mediates the human one through Search.

The Agentic AI Foundation (AAIF), formed under the Linux Foundation with Anthropic, OpenAI, Google, and Microsoft as platinum members, provides the governance layer. The AAIF functions as the W3C for the agentic web: the vendor-neutral body that decides which protocols become standards for agent interoperability.

What Website Owners Need to Know

This isn't an optimization checklist. It's three structural shifts in what your website is for.

Your data layer is your website

Google's patent generates landing pages from product feed data, making product feeds the most important asset an e-commerce business maintains. NLWeb queries Schema.org markup instead of rendering pages, making structured markup the front door to your content. WebMCP exposes site capabilities as function calls, making tool definitions the user interface agents interact with.

Structured data, product feeds, JSON-LD, and API surfaces have traditionally been treated as backend infrastructure. In the non-human web, these data layers become the primary way a business reaches customers. Product feed accuracy (specs, pricing, stock levels, images) matters more than homepage design when AI systems generate the page from that feed.
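As a concrete example of that data layer, here is a minimal Schema.org Product in JSON-LD, built in Python. The product values are illustrative; the @type, offers, and availability fields are standard Schema.org vocabulary, and this is the kind of object NLWeb-style systems query directly.

```python
import json

product_jsonld = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Trail Runner 3",
    "sku": "TR3-W-10",
    "image": "https://example.com/tr3.jpg",
    "offers": {
        "@type": "Offer",
        "price": "129.00",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
}

# This is what would ship inside a <script type="application/ld+json"> block.
print(json.dumps(product_jsonld, indent=2))
```

If the price, stock, or image here is wrong, every AI-generated page and agent answer built from it is wrong too, which is why feed accuracy outranks homepage polish.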

Trust is the moat

AI can generate a page. It cannot generate a reason to seek you out by name.

Direct traffic, email subscribers, community members, and brand reputation persist when the page itself becomes replaceable. An AI agent can build a product page, but no AI agent can build the trust that makes a consumer (or their agent) request a specific brand by name.

The brands that matter in the non-human web are the ones people tell their agents to find. "Get me a fleece jacket" is a commodity query. "Get me a fleece jacket from Patagonia" is a brand moat.

The measurement problem

How do you measure a page you didn't build? How do you A/B test against something Google generates dynamically? How do you attribute a conversion that happened inside ChatGPT, initiated by an agent acting on behalf of a user who never saw your website?

Traditional web analytics (page views, sessions, bounce rate, time on site) assume two things: a human visitor and a page you control. On the non-human web, neither assumption holds. A Google-generated landing page isn't yours. A ChatGPT checkout session doesn't register in your analytics.
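One first-pass fix teams can apply today is splitting sessions by self-identifying agent user-agents. A hedged sketch follows: the token list is illustrative, not exhaustive, and many agents don't identify themselves at all, so this undercounts agent traffic by design.

```python
# Example tokens from agents that publicly identify themselves in their UA string.
AGENT_TOKENS = ("GPTBot", "ChatGPT-User", "PerplexityBot", "ClaudeBot", "Google-Extended")

def classify_session(user_agent: str) -> str:
    """Label a session 'agent' if its user-agent contains a known agent token."""
    ua = (user_agent or "").lower()
    return "agent" if any(t.lower() in ua for t in AGENT_TOKENS) else "human"

sessions = [
    "Mozilla/5.0 (Windows NT 10.0) Chrome/124.0",
    "Mozilla/5.0 AppleWebKit/537.36 (compatible; GPTBot/1.2; +https://openai.com/gptbot)",
]
counts = {}
for ua in sessions:
    kind = classify_session(ua)
    counts[kind] = counts.get(kind, 0) + 1
```

It says nothing about Google-generated pages or in-chat checkouts, which is exactly the gap the next paragraph describes.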

I don't have a clean answer here, and neither does anyone else. Measurement is the genuinely unsolved problem of the non-human web. New metrics will need to track agent discoverability, agent conversion rate, and data feed quality. But as of March 2026, the measurement infrastructure hasn't caught up to the technology it needs to measure.

Four Predictions for 2026-2027

Four things to watch over the next 12-18 months.

Google ships patent US12536233B1, or something like it. The technology for scoring and replacing landing pages exists. The business incentive exists. Google has a history of introducing features in ads first, then expanding (Google Shopping went from free to paid to essential). AI-generated landing pages will likely appear in shopping ads first, then broaden to other verticals. Landing page quality scores in Google Ads serve as the early warning system for which pages Google considers replaceable.

Agent traffic becomes measurable. Analytics platforms will need to distinguish human sessions from agent sessions. BrightEdge reports AI agents account for roughly 33% of organic search activity as of early 2026. WP Engine's traffic data shows 1 AI bot visit for every 31 human visits by Q4 2025, up from 1 per 200 at the start of that year. Agent traffic ratios will accelerate further as Chrome auto browse rolls out globally beyond the US. New metrics around agent conversion rate and agent discoverability will emerge from necessity.

The protocol stack consolidates. MCP, A2A, NLWeb, and WebMCP form a coherent stack covering tool access, agent communication, content querying, and browser-level integration. Expect more interoperability between these protocols and fewer competing standards. The Agentic AI Foundation (AAIF) accelerates consolidation. Within 18 months, "does your site support MCP?" will be as standard a question as "is your site mobile-friendly?"

Brand differentiation gets harder and more important. When AI generates pages and agents do the shopping, the only defensible position is being the brand people (and their agents) seek out by name. Direct relationships, owned audiences, trust signals. Everything else is a commodity.

The Web Splits in Two

When Shopify auto-opted merchants into agentic shopping, I asked whether your website just became optional. The answer is more nuanced than optional or essential. It's becoming something different.

The web isn't dying. It's splitting.

The transactional web (product listings, checkout flows, information retrieval, comparison shopping) is going non-human first. AI generates the landing pages. AI agents visit and transact on those pages. Humans approve decisions at the endpoints. Google's patent lives in the transactional web, and the economics of conversion optimization push hardest toward automation in this layer.

The experiential web (brand storytelling, community, content that rewards sustained attention, design that creates emotional response) stays human. Not because AI can't generate brand experiences, but because the value of those experiences comes from the human connection behind them. Nobody tells their agent to "go enjoy a brand experience on my behalf."

Your website's new job description: data source for the agents, trust anchor for the humans, brand home for both. The companies that treat their structured data, product feeds, and API surfaces with the same care they give their homepage design are the ones that show up in both worlds.

The non-human web isn't replacing the human web. It's growing alongside it. Your job is to show up in both.

QUESTIONS ANSWERED

What is the fully non-human web?

The fully non-human web describes a scenario where AI generates the page (supply side) and AI agents visit and transact on that page (demand side), with no human directly involved in either creating or consuming the content. Google's AI landing page patent, agentic browsers, and commerce protocols like ACP and UCP have created the infrastructure for this to happen.

What is Google's AI landing page patent?

Patent US12536233B1, granted to Google in January 2026, describes a system that scores landing pages on conversion rate, bounce rate, and design quality. Pages scoring below a threshold get replaced with AI-generated versions personalized to the individual searcher using their full search history, click behavior, location, and device data. The advertiser never approves or necessarily sees the replacement.

Which companies are building toward the non-human web?

Google appears across nearly every layer: page generation (AI landing pages), content-as-API (WebMCP, NLWeb co-development), agent browsers (Chrome auto browse), and commerce (UCP with Shopify). Other key players include Anthropic (MCP), Microsoft (NLWeb, Edge Copilot), Stripe and OpenAI (Agentic Commerce Protocol), Cloudflare (Markdown for Agents), and Perplexity (Comet browser).

Will my website become irrelevant?

Not irrelevant, but its job is changing. The transactional web (product listings, checkout flows, basic information retrieval) is going non-human first. Your website's enduring roles are as a data source (structured data, product feeds), a trust anchor (brand, reputation, direct relationships), and a brand home (the experience AI can't replicate). Sites that serve both humans and agents will thrive.

What should website owners do to prepare?

Three priorities: treat your structured data and product feeds with the same care as your homepage design (they're becoming your primary interface). Build direct audience relationships that don't depend on intermediaries (email, community, brand trust). Accept that your website's audience is splitting into humans who want experiences and agents who want data, and serve both.

NEW TO NO HACKS?

AI agents are becoming your next visitors. No Hacks is a weekly podcast exploring how to optimize websites for this new reality, with practical strategies from SEOs, developers, and AI researchers.
