Even with perfect robots.txt and meta tags, slow servers can block AI crawlers more effectively than any Disallow rule.
| Crawler | Typical timeout | Respects rate limits? |
|---|---|---|
| GPTBot | ~3 seconds | Yes (Crawl-delay) |
| ClaudeBot | ~2 seconds | Yes |
| Google-Extended | ~5 seconds | Yes (via Search Console) |
| PerplexityBot | ~2 seconds | Yes |
If your server takes longer than these thresholds, the crawler abandons the request and may mark your site as "slow" — reducing future crawl frequency.
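To check whether your pages fit inside those budgets, you can emulate a crawler's timeout from the outside. A minimal sketch in TypeScript for Node 18+ (the 3-second budget mirrors GPTBot's reported threshold above; the URL is a placeholder):

```ts
// probe-timeout.ts: abandon the request after a crawler-like budget.
const url = process.argv[2] ?? "https://example.com/";
const budgetMs = 3_000; // GPTBot's reported ~3s threshold

async function main(): Promise<void> {
  const started = performance.now();
  try {
    const res = await fetch(url, { signal: AbortSignal.timeout(budgetMs) });
    console.log(`HTTP ${res.status} in ${Math.round(performance.now() - started)} ms`);
  } catch {
    // AbortSignal.timeout() rejects the fetch once the budget expires,
    // which is roughly what a crawler's abandonment looks like.
    console.error(`No response within ${budgetMs} ms`);
  }
}

main();
```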
TTFB (time to first byte) measures how quickly your server starts responding to a request. For AI crawlers, it needs to stay comfortably below the timeout thresholds in the table above.
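To measure TTFB itself rather than a simple pass/fail, a lower-level probe can time the first body byte. A sketch using Node's built-in https module (the target URL is again a placeholder):

```ts
// probe-ttfb.ts: time from request start to the first body byte.
import { request } from "node:https";

const target = new URL(process.argv[2] ?? "https://example.com/");
const start = process.hrtime.bigint();

const req = request(target, (res) => {
  res.once("data", () => {
    const ttfbMs = Number(process.hrtime.bigint() - start) / 1e6;
    console.log(`TTFB: ${ttfbMs.toFixed(0)} ms (status ${res.statusCode})`);
    res.destroy(); // only the first chunk matters for TTFB
  });
});

req.on("error", (err) => console.error(`Request failed: ${err.message}`));
req.end();
```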
You can use the Retry-After HTTP header to tell AI crawlers when to come back:
```http
HTTP/1.1 429 Too Many Requests
Retry-After: 3600
```
This tells the bot to wait one hour (3,600 seconds) before trying again. It's useful during traffic spikes, maintenance windows, or whenever a crawler requests pages faster than your server can comfortably serve them.
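On the server side, here is a sketch of how that response might be sent from an Express app. The `isOverloaded()` check and the user-agent list are assumptions for illustration, not a prescribed implementation:

```ts
import express from "express";

const app = express();

// Hypothetical load signal; in practice derive this from event-loop lag,
// queue depth, or upstream health metrics.
const isOverloaded = (): boolean =>
  process.memoryUsage().heapUsed > 512 * 1024 * 1024;

// User-agent substrings for the AI crawlers in the table above.
const AI_BOTS = ["GPTBot", "ClaudeBot", "Google-Extended", "PerplexityBot"];

app.use((req, res, next) => {
  const ua = req.get("user-agent") ?? "";
  const isAiBot = AI_BOTS.some((bot) => ua.includes(bot));

  if (isAiBot && isOverloaded()) {
    // Ask the bot to come back in an hour instead of timing out on it.
    res.set("Retry-After", "3600");
    res.status(429).send("Too Many Requests");
    return;
  }
  next();
});

app.get("/", (_req, res) => res.send("ok"));
app.listen(3000);
```

Throttling buys time, but the bigger win is usually making the server faster in the first place. Common fixes and their typical impact on TTFB: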
| Fix | Impact on TTFB | Difficulty |
|---|---|---|
| Use CDN (Cloudflare, AWS CloudFront) | 50-80% reduction | Low |
| Enable caching (Redis, Varnish) | 90%+ reduction after first hit (see sketch below) | Medium |
| Optimise database queries | Varies | Medium-High |
| Upgrade hosting (dedicated/VPS) | 30-60% improvement | Low (cost) |
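To make the caching row concrete, here is a minimal in-memory sketch of the same idea. A production setup would use Redis or Varnish as in the table; the 60-second TTL and the simulated slow render are assumptions for the example:

```ts
import express from "express";

const app = express();

// Tiny in-memory cache standing in for Redis/Varnish: URL -> cached body.
const cache = new Map<string, { body: string; expires: number }>();
const TTL_MS = 60_000; // arbitrary 60s TTL for the sketch

app.get("/page", (req, res) => {
  const hit = cache.get(req.originalUrl);
  if (hit && hit.expires > Date.now()) {
    // Cache hit: skip the expensive work, so TTFB drops to near zero.
    res.send(hit.body);
    return;
  }

  // Stand-in for an expensive render (slow DB query, templating, etc.).
  const body = `<html><body>Rendered at ${new Date().toISOString()}</body></html>`;
  cache.set(req.originalUrl, { body, expires: Date.now() + TTL_MS });
  res.send(body);
});

app.listen(3000);
```

The first request pays the full render cost; every request within the TTL is served straight from memory, which is why the table lists a 90%+ TTFB reduction "after first hit".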
To throttle AI crawlers without blocking them outright, use:

- `Crawl-delay: 1` in robots.txt (the value is in seconds)
- `Retry-After` headers on 429 responses