methodology / AI Readiness / #16
AI crawler robots.txt directives
live factor #16 · AI Readiness · scoring impl: implemented · weight 1.3%
What we measure
AI search engines respect robots.txt directives. Blocking their crawlers outright hides your content from AI-generated answers; not addressing them at all is acceptable, but it means you have no explicit policy. Sites that explicitly allow AI crawlers are more discoverable in AI search.
How to improve your score
Decide your policy. To be discoverable in AI search, don't block (or explicitly allow) `GPTBot`, `ClaudeBot`, `PerplexityBot`, `Google-Extended`, and `CCBot`. To opt out, add a `User-agent: GPTBot` / `Disallow: /` block for each crawler you want to exclude.
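As a sketch of how such a policy reads in practice, the snippet below builds an example robots.txt that allows the AI crawlers named above while opting out of `CCBot`, then checks it with Python's standard-library robots.txt parser. The allow/deny split is purely illustrative, not a recommendation.

```python
# Illustrative only: allow most AI crawlers, opt CCBot out.
# Verified against the policy text with Python's stdlib parser.
import urllib.robotparser

policy = """\
User-agent: GPTBot
User-agent: ClaudeBot
User-agent: PerplexityBot
User-agent: Google-Extended
Allow: /

User-agent: CCBot
Disallow: /
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(policy.splitlines())

print(parser.can_fetch("GPTBot", "https://example.com/page"))  # True: allowed
print(parser.can_fetch("CCBot", "https://example.com/page"))   # False: blocked
```

Checking your live file the same way (via `parser.set_url(...)` and `parser.read()`) is a quick way to confirm the policy you intended is the one crawlers will actually see.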
Data source
Data source for this factor is not yet documented.
Scoring
Scoring formulas are versioned with the methodology. The current method (v1.1.0) maps raw measurements to pass, warn, or fail. Factor weights determine how much each factor contributes to the composite score — see the methodology index for the full table.
Version history
| Version | Change | Date |
|---|---|---|
| v1.1.0 | Factor introduced. Status: live. Scoring impl: implemented. | 2026-04-25 |