A text file at the root of a website that tells crawlers which parts of the site they may access. Critical for AI Discovery because it controls which AI systems can index your content — though note it is advisory: compliant crawlers honor it, but it is not an access-control mechanism.
The most common technical error in AI Discovery is accidentally blocking AI crawlers in robots.txt.
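This kind of misconfiguration is easy to check programmatically. The sketch below is a minimal, hypothetical example: it parses a robots.txt that disallows GPTBot (OpenAI's crawler token) site-wide while allowing all other agents, then tests which crawlers can fetch a page. The file contents and URL are illustrative, not taken from any real site.

```python
from urllib import robotparser

# Hypothetical robots.txt that accidentally blocks GPTBot site-wide
# while leaving the site open to every other crawler.
ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# GPTBot is denied everywhere; a generic crawler is not.
print(rp.can_fetch("GPTBot", "https://example.com/article"))     # False
print(rp.can_fetch("Googlebot", "https://example.com/article"))  # True
```

Running a check like this against your live robots.txt (via `RobotFileParser.set_url` and `read`) is a quick way to verify that the AI crawlers you intend to admit are not silently blocked.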
This term is part of the canonical AI Discovery vocabulary, published in the Body of Knowledge alongside the Eight Discovery Signals and The Methodology Handbook. The full glossary contains 56 terms — the working vocabulary for AI Discovery practitioners.
Practitioners certified through Certified™ are tested on the AI Discovery vocabulary. This glossary is the institutional reference.
The AI Discovery Audit™ delivers a 0–1000 score, an eight-signal breakdown, and a 90-day roadmap. From $800.