What This Presentation Covers
This presentation makes the case that AI crawlers — GPTBot, ClaudeBot, PerplexityBot, Google-Extended — are no longer just scrapers. They are users with intent: answer, attribute, train. Websites that ignore them don't protect their content; they turn away the agents that now act on behalf of millions of humans.
The deck and video walk through the practical implications: what good crawler UX looks like, how to configure robots.txt correctly, and what the agentic web means for brands that depend on discovery.
The UX of an AI Crawler
Good UX for humans means fast, clear, and intuitive. Good UX for AI crawlers means three things:
- Predictable navigation — every important piece of content needs a permanent, crawlable URL. No session tokens, no infinite scroll. If a crawler can't reach it via a link, it doesn't exist.
- Honest metadata — `<link rel="canonical">`, `dateModified` in schema, `max-snippet` directives. Without these, crawlers guess — and they guess wrong. (See the markup sketch after this list.)
- Rate-limit empathy — publish a clear `Crawl-delay` in robots.txt. Treat crawlers with the same courtesy you'd give a human hitting refresh too fast.
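A minimal sketch of what "honest metadata" looks like in practice: a canonical link, a snippet directive, and a schema.org JSON-LD block carrying dateModified. The URL, headline, date, and snippet length below are placeholders, not recommendations:

```html
<!-- Canonical URL: the one permanent address for this content -->
<link rel="canonical" href="https://example.com/articles/ai-crawlers">

<!-- Snippet directive: how much text may be quoted in answers -->
<meta name="robots" content="max-snippet:160">

<!-- Schema.org JSON-LD: dateModified lets crawlers judge freshness -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "AI Crawlers Are the New Online Users",
  "dateModified": "2025-01-15"
}
</script>
```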
robots.txt Configuration for AI Crawlers
A minimal configuration that allows all the major AI crawlers looks like the sketch below. The user-agent tokens are the ones each vendor publishes; check their documentation for current names:
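```
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Google-Extended
Allow: /

# Optional politeness signal; note that not every crawler honors Crawl-delay
User-agent: *
Crawl-delay: 2
```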
Blocking any of these crawlers means your content cannot appear in that platform's responses. The risk of being over-blocked is far greater than the risk of being crawled.
The Curl Test
The fastest way to check whether your site passes the AI crawler test is to fetch a page the way a non-rendering crawler does: raw HTML, no JavaScript executed. A minimal sketch (example.com and the user-agent string are placeholders; swap in your own URL and the crawler you care about):
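```sh
# Fetch the raw HTML as a crawler would, then crudely strip tags
# to see what text survives without JavaScript rendering.
curl -sL -A "GPTBot" https://example.com/your-page \
  | sed -e 's/<[^>]*>//g' \
  | head -n 40
```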
If the output is empty, full of `[object Object]`, or missing your main text — your site fails. AI models cannot read your content and will not cite it. Fix the JavaScript rendering issue before anything else.
The Agentic Web
We are entering an era in which most reads of your website won't be by humans. They'll be by AI assistants, summarisers, and answer engines acting as proxies for humans. The websites that thrive will build robots.txt with the same care as their styles.css — and write alt text not just for screen readers, but for vision-language models.
The presentation closes with a practical framework: from "block all bots" → to "identify friendly agents via verified user-agents"; from "design for 1080p" → to "design for text-only, no JavaScript"; from "GA4 tracks users" → to "server logs track crawler sessions".
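As a concrete version of that last shift, crawler sessions can be counted straight from a web server's access log. A minimal sketch, assuming an nginx-style log at /var/log/nginx/access.log (the path and log format are assumptions; adjust for your server). Note that user-agent strings can be spoofed, so truly "verified" agents require cross-checking against the IP ranges or reverse-DNS records each vendor publishes:

```sh
# Count requests per AI crawler in an nginx-style access log
# (the log path is an assumption; adjust for your setup)
for bot in GPTBot ClaudeBot PerplexityBot Google-Extended; do
  printf '%-16s %s\n' "$bot" "$(grep -c "$bot" /var/log/nginx/access.log)"
done
```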
UltraScout AI
See exactly how AI crawlers see your site
Full technical AEO audit, Zero Coverage detection, and AI Conversion Funnel analytics — all in one platform.
Read the full article: AI Crawlers Are the New Online Users →