Your Website’s Next Visitor Isn’t Human

TL;DR: AI agents are already browsing websites for human users, surfacing, recommending, and acting on what they find. Most sites are designed exclusively for human eyes and become invisible to agents. Agent Experience (AX) is the practice of designing for both. This post defines AX, distinguishes it from SEO and GEO, and gives a six-step practical checklist any site owner can apply now.

Who is this visitor you did not design for?

When I first optimised this site for AI engines, I realised the visitor parsing my pages wasn’t a person at all. It was a model, reading my HTML for structure and meaning, not mood or visuals. That moment challenged a core assumption I’d held for years: most websites are built for people, but AI agents are already browsing on behalf of users.

Every layout choice, every hero image, every hover state I’ve designed assumed a person at a screen. But agents don’t see any of that. They read structure. They judge clarity. They decide whether your site is even worth surfacing. If your site only speaks to humans, you’re already invisible to a growing share of your audience. This shift isn’t coming someday; it’s happening right now.

What does Agent Experience (AX) mean?

Agent Experience is the practice of designing digital experiences for AI agents alongside human visitors. The term was coined by Mathias Biilmann, CEO of Netlify, on 28 January 2025, and the canonical resource for the discipline is agentexperience.ax. AX is the next layer above SEO and GEO. It treats agents as a real audience, not a side effect of search.

AX goes further than Generative Engine Optimisation. GEO is about being cited in AI search answers—so your content appears when someone asks ChatGPT, Perplexity, or Google’s AI Overviews a question. But AX covers every AI agent that visits, reads, evaluates, or acts on your site. That includes agents booking appointments, comparing services, or building shortlists for users.

The design patterns we set now will shape the next decade. This isn’t a passing trend; it’s an evidence-based discipline. In my experience, the organisations that measure how agents interact with their sites consistently outperform those that rely on guesswork.

Why is AX not just SEO with a new name?

Every few years, someone rebrands SEO and calls it revolutionary. But AX is genuinely different, for three concrete reasons. Search crawlers read and rank. AI agents read, evaluate, and act. That changes the design brief entirely: a site that’s findable isn’t the same as one that’s usable by a non-human visitor.

  1. Agents act; they don’t just index. Search crawlers might list your page in a results set, but AI agents actually do something with it. They book, purchase, shortlist, and recommend. Your site has to be usable by machines, not just findable.

  2. Structure trumps style. A beautiful hero image means nothing to an agent. Semantic HTML, schema markup, and machine-readable content are the new design currency. Cloudflare now blocks AI scrapers and crawlers by default for new domains, a policy effective 1 July 2025. If you have not checked your settings, your AI traffic may already be shut off at the door.

  3. The layering is SEO, then GEO, then AX. SEO gets you found by search engines. GEO gets you cited by AI engines. AX gets you used by AI agents. Each layer builds on the last. The Agentic AI Optimisation (AAIO) framework, formalised by Floridi et al. and published in Minds and Machines in January 2026, captures this shift.

What needs to change on your site?

Here are the six actions you can take, in order. None requires a full redesign, but each is a deliberate step toward clarity. The goal isn’t to optimise for agents at the expense of people. It’s to make your business clear enough that both audiences can understand it.

  1. Audit your structured data. Schema markup is no longer optional. For personal brands and consultants, Person schema outperforms Organisation schema. For products and services, implement Product, Service, and FAQ schemas. A minimal Person example is sketched just after this list.

  2. Check your AI crawler access. Many sites block AI bots without realising it. Review robots.txt and your Cloudflare or CDN settings; a sample robots.txt is sketched just after this list. If agents cannot read your site, nothing else on this list matters.

  3. Create an llms.txt file. This emerging standard helps AI systems understand your site structure. Think of it as a sitemap for agents, the rails that tell them where to look first.

  4. Prioritise server-side rendering. AI crawlers rarely execute JavaScript. Analysis of GPTBot, ClaudeBot, and PerplexityBot fetches confirms they see only the initial HTML. If your key content loads client-side, agents see an empty page where a human sees a product grid.

  5. Write for two audiences. Every piece of content should work for a human reading naturally and an agent parsing for structure. This is not dumbing down. It is being deliberately clear, the kind of clarity a good editor has always demanded.

  6. Measure what agents see. Do not assume. Test how AI models actually interpret your pages. Ask the models, check the citations, compare against a human read. Evidence over instinct applies here as much as anywhere else.

The principle behind all six steps: design for agents on purpose, not by accident.
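
To make the first step concrete, here is a minimal sketch of Person schema in JSON-LD, the format most crawlers parse. Every value below is a placeholder; swap in your own details and check the result with a validator such as validator.schema.org.

```html
<!-- Minimal Person schema sketch. Every value is a placeholder. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Person",
  "name": "Jane Example",
  "jobTitle": "Experimentation Consultant",
  "url": "https://www.example.com/",
  "sameAs": [
    "https://www.linkedin.com/in/jane-example"
  ]
}
</script>
```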
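
For the second step, a sketch of robots.txt rules that explicitly admit the major AI crawlers. The user-agent tokens below are the ones the vendors publish, but verify them against current documentation, and remember that a CDN-level block, such as Cloudflare’s default for new domains, overrides anything written here.

```
# robots.txt: explicitly allow the major AI crawlers
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /
```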

How does experimentation thinking apply to AX?

This is the same instinct experimentation programmes have always run on. Test what visitors actually do, not what you assume they’ll do. AX is that same discipline, now applied to a new kind of visitor. The skills transfer cleanly: hypothesis framing, controlled comparisons, and measurement before commitment.

Over a decade running experimentation programmes for organisations like Vitality, NatWest, RNLI, and Farrow & Ball, the discipline that mattered most was not statistical. It was the habit of measuring assumptions and acting on what the evidence showed. At the RNLI, a redesigned donation journey lifted donations by 28%. That result came from testing hypotheses, not redesigning by taste.

That same instinct catches where agents aren’t getting what they need. I test how they read the site, ask the models what they see, and compare that against the version a human reads. The instinct to test, measure, and iterate doesn’t switch off just because the visitor has changed. The context is new, but the discipline is the same. My About page tells more of that story.

What should you do before your next redesign?

Before you commit budget, I recommend three diagnostic questions. Are we designing for the visitor we have, or the one we assume? Can we prove agents can use the site before we approve the brief? What’s the fastest way to learn what agents actually see? If the answers to those are thin, that’s the gap to fill first.

The next time you brief a redesign, a content strategy, or a site audit, ask if the work stands up for both audiences. The organisations that lay the rails now, building structured, semantically clear, agent-accessible sites by design, are the ones agents will recommend, reference, and route users toward. The rest will learn the hard way: invisible to AI means invisible, full stop.

If those diagnostic questions reveal gaps you can’t answer, I built the AI Visibility Scorecard to find them. It runs three live AI-engine tests against three named competitors, audits 10 to 20 pages, and scores across five buyer-facing dimensions: Findability, Understandability, Trustworthiness, Agent-Readiness, and Citability. The result is a prioritised next-action list, not just another generic checklist.

Frequently Asked Questions

What is the difference between SEO, GEO, and AX?

SEO optimises for ranking in search engine results. GEO optimises for being cited inside AI-generated answers from ChatGPT, Perplexity, Google AI Overviews, and Gemini. AX optimises for AI agents that browse, evaluate, and act on a site. The three layers stack: search engines find you, AI engines cite you, AI agents use you.

Are AI agents actually visiting websites yet?

Yes. AI agents from search assistants, shopping assistants, and personal AI tools are already crawling and parsing websites on behalf of users. Many sites unknowingly block them at the CDN or robots.txt layer. Even the ones that allow access often present content the agents cannot read because it is locked behind client-side JavaScript.

Do I need to choose between human users and AI agents?

No. The point of AX is to design for both. Clear semantic HTML, named entities, structured data, and answer-first content help humans and agents at the same time. The two audiences want different things from the same page, and the same disciplines satisfy both.

What is llms.txt and do I need one?

llms.txt is an emerging plain-text/Markdown file at the root of a site that gives AI systems a structured map of your most important content, similar to robots.txt or sitemap.xml. The standard was proposed by Jeremy Howard of Answer.AI in 2024. For most sites, drafting one takes less than an hour, and it is one of the highest-leverage AX tasks available right now.
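
As a sketch of the format, assuming the structure proposed at llmstxt.org: an H1 title, a blockquote summary, then sections of annotated links. The names and URLs below are placeholders.

```markdown
# Example Consulting

> Independent consultancy helping organisations run experiments
> and stay visible to AI engines and agents.

## Services
- [AI Visibility Scorecard](https://www.example.com/scorecard): agent-readiness audit
- [Experimentation](https://www.example.com/experimentation): testing programmes

## Writing
- [Blog](https://www.example.com/blog): articles on AX, GEO, and experimentation
```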

Will AI agents replace search engine traffic?

They will not replace it, but they will sit on top of it. Many human users will not see search results directly, only the answer their AI assistant returns. That is why being cited inside AI answers and being usable by AI agents are now two separate disciplines, both worth investing in.

How do I tell if my site is AX-ready?

Start with three checks. Open your site in a private window with JavaScript disabled and see how much of the page still reads. Run the homepage through an AI engine and ask it what the business does. Look at robots.txt and your CDN settings to confirm AI bots are allowed. If any of those return a thin answer, your AX work starts there.
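
The first of those checks can be automated. Here is a minimal sketch in TypeScript, assuming Node 18 or later for the built-in fetch: it pulls the raw HTML the way a non-JavaScript crawler would and prints whatever text survives. The user-agent string is illustrative, not how any real crawler identifies itself.

```typescript
// what-agents-see.ts
// Fetch a page the way a non-JavaScript crawler would and print
// the text content of the raw HTML response. Requires Node 18+.

const url = process.argv[2] ?? "https://www.example.com/";

const res = await fetch(url, {
  headers: { "User-Agent": "AX-check/1.0 (illustrative sketch)" },
});
const html = await res.text();

// Crude extraction: drop scripts, styles, and tags, then collapse whitespace.
const text = html
  .replace(/<script[\s\S]*?<\/script>/gi, " ")
  .replace(/<style[\s\S]*?<\/style>/gi, " ")
  .replace(/<[^>]+>/g, " ")
  .replace(/\s+/g, " ")
  .trim();

// If this prints next to nothing, agents that skip JavaScript see an empty page.
console.log(text.slice(0, 2000));
```

Run it with npx tsx what-agents-see.ts followed by your own URL and compare the output against what a human sees in a browser. A big gap points straight back to step 4 above.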
