AI Crawl Rate Limits & Server Performance

Even with perfect robots.txt and meta tags, slow servers can block AI crawlers more effectively than any Disallow rule.

How AI crawlers behave differently

Crawler            Timeout      Rate limit respect
GPTBot             ~3 seconds   Yes (Crawl-delay)
ClaudeBot          ~2 seconds   Yes
Google-Extended    ~5 seconds   Yes (via Search Console)
PerplexityBot      ~2 seconds   Yes

If your server takes longer than these thresholds, the crawler abandons the request and may mark your site as "slow" — reducing future crawl frequency.

Critical metric: TTFB (Time To First Byte)

TTFB measures how long your server takes to send the first byte of a response. For AI crawlers it is the metric that matters most: every second of TTFB eats directly into the 2-5 second abandonment thresholds above, so a TTFB over one second puts you at real risk of dropped crawls.
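
You can spot-check your own TTFB with a short Python sketch. The URL is a placeholder, and the requests library only approximates what a crawler sees rather than reproducing any bot's exact client:

import time

import requests

def measure_ttfb(url: str) -> float:
    """Return an approximate time-to-first-byte in seconds."""
    start = time.perf_counter()
    # stream=True makes requests return as soon as the headers arrive,
    # before downloading the body
    resp = requests.get(url, stream=True, timeout=10)
    # pull one byte of the body to complete the first-byte measurement
    next(resp.iter_content(chunk_size=1), b"")
    return time.perf_counter() - start

print(f"TTFB: {measure_ttfb('https://example.com/'):.3f}s")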

The Retry-After header

You can use the Retry-After HTTP header to tell AI crawlers when to come back:

HTTP/1.1 429 Too Many Requests
Retry-After: 3600

This tells the bot to wait one hour (3,600 seconds) before trying again. Useful for:

  1. Throttling aggressive crawlers without blocking them entirely
  2. Protecting your origin during traffic spikes or maintenance windows
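
Here is a minimal sketch of sending that header from a Flask app. The bot list, the one-request-per-second limit, and the in-memory timestamp store are all illustrative assumptions, not a production rate limiter:

import time

from flask import Flask, Response, request

app = Flask(__name__)

AI_BOTS = ("GPTBot", "ClaudeBot", "Google-Extended", "PerplexityBot")
MIN_INTERVAL = 1.0                # assumed limit: one request per second per bot
last_seen: dict[str, float] = {}  # bot name -> time of its last allowed request

@app.before_request
def throttle_ai_crawlers():
    ua = request.headers.get("User-Agent", "")
    bot = next((b for b in AI_BOTS if b in ua), None)
    if bot is None:
        return None  # not an AI crawler: serve the request normally
    now = time.monotonic()
    if now - last_seen.get(bot, 0.0) < MIN_INTERVAL:
        # 429 plus Retry-After tells the crawler exactly when to come back
        return Response("Too Many Requests", status=429,
                        headers={"Retry-After": "1"})
    last_seen[bot] = now
    return None  # under the limit: let the request through

Returning a response from a before_request hook short-circuits Flask's normal routing, so throttled requests never reach your application code or database.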

How to optimise for AI crawlers

Fix                                     Impact on TTFB                   Difficulty
Use CDN (Cloudflare, AWS CloudFront)    50-80% reduction                 Low
Enable caching (Redis, Varnish)         90%+ reduction after first hit   Medium
Optimise database queries               Varies                           Medium-High
Upgrade hosting (dedicated/VPS)         30-60% improvement               Low (cost)
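
The caching row is often the cheapest win. Here is a sketch of the header side in Flask, assuming an illustrative one-hour lifetime (tune max-age to how often your pages change). With this header set, a CDN or a cache like Varnish can answer repeat crawler hits without touching your origin, which is where the 90%+ TTFB reduction comes from:

from flask import Flask

app = Flask(__name__)

@app.route("/")
def home():
    return "Hello, crawlers"

@app.after_request
def add_cache_headers(response):
    # public: shared caches (CDNs, Varnish) may store the response
    # max-age=3600: assumed one-hour lifetime; adjust for your content
    response.headers.setdefault("Cache-Control", "public, max-age=3600")
    return response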

If AI crawlers are overwhelming your server

  1. Use Crawl-delay: 1 in robots.txt (value in seconds; see the sketch below)
  2. Implement Retry-After headers for 429 responses
  3. Upgrade server resources
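
A minimal robots.txt sketch for step 1, covering the crawlers from the table above (Google-Extended is omitted because its rate is managed through Search Console rather than Crawl-delay):

User-agent: GPTBot
Crawl-delay: 1

User-agent: ClaudeBot
Crawl-delay: 1

User-agent: PerplexityBot
Crawl-delay: 1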

Check your AI SEO Score — free, instant, no sign-up required
