llms.txt

A standardized markdown file at a website's root that tells AI models which content is most important and where to find clean, machine-readable versions of key pages.

What is llms.txt?

llms.txt is a standardized markdown file hosted at a website’s root path (e.g., https://example.com/llms.txt) that serves as a curated index for large language models. It provides concise summaries of the site’s purpose, critical contextual details, and prioritized links to machine-readable resources.

Think of it as a third layer alongside two existing web files:

  • robots.txt — Explains what crawlers may or may not access
  • sitemap.xml — Lists URLs for indexing
  • llms.txt — Tells LLMs which content is most important and where to find clean, structured versions of it

How llms.txt Works

A typical llms.txt file follows a strict markdown schema that includes:

  • An H1 header for the site’s name
  • A blockquote summarizing the site’s purpose
  • H2-delimited sections grouping content into categories (e.g., Docs, Policies, Support, Product)
  • Links to clean markdown versions of important pages
  • An optional section flagging secondary links that can be omitted when context length is constrained
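A minimal file following this schema might look like the sketch below. The site name, URLs, and section labels are illustrative, not part of any specification:

```markdown
# Example Corp

> Example Corp provides cloud hosting. This file lists our most
> important pages in clean markdown form for LLM consumption.

## Docs

- [Quickstart](https://example.com/docs/quickstart.md): Set up a project in five minutes
- [API Reference](https://example.com/docs/api.md): Full endpoint documentation

## Policies

- [Privacy Policy](https://example.com/legal/privacy.md)

## Optional

- [Company Blog](https://example.com/blog.md): Announcements and release notes
```

The "Optional" section holds the secondary links mentioned above: an AI system short on context can safely skip everything under that heading.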

Why llms.txt Matters for AI Search Visibility

llms.txt improves AI search visibility in four key ways:

  1. Easier discovery — AI systems can go straight to your most important content instead of crawling noisy HTML pages
  2. Reduced misrepresentation — Points LLMs to current, canonical sources, reducing hallucinations from outdated or deprecated content
  3. Aligned with AI-native behavior — Matches how users actually interact with AI by providing clear answers to “how, what, why, which” questions
  4. Better retrieval during query fan-out — Helps AI systems pull the right pages into the context window when breaking queries into sub-searches

llms.txt vs SEO

llms.txt does not replace SEO — it complements it by adding an AI-specific layer. Traditional SEO focuses on search engine rankings and organic clicks; llms.txt focuses on how AI engines retrieve and interpret your content. Both work together to provide full-spectrum visibility.

How to Create an Effective llms.txt

  1. Audit high-value content — Identify the pages that matter most for AI-driven questions (documentation, pricing, policies, guides)
  2. Create clean markdown versions — Strip navigation clutter, cookie banners, and layout code from key pages
  3. Organize into sections — Group content under headings like Docs, Pricing, Support, and Optional
  4. Publish at domain root — Place the file at yourdomain.com/llms.txt
  5. Keep it maintained — Update llms.txt whenever your key content changes to ensure AI models always have access to current information
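The steps above can be sketched as a small generator script. This is a minimal illustration, not a reference implementation; the function name, site data, and section labels are hypothetical, and the output follows the H1 / blockquote / H2 structure described earlier:

```python
# Sketch of an llms.txt generator. The site data below is illustrative.

def build_llms_txt(name, summary, sections):
    """Render an llms.txt document: an H1 title, a blockquote summary,
    then one H2 section per category containing markdown links."""
    lines = [f"# {name}", "", f"> {summary}", ""]
    for heading, links in sections.items():
        lines.append(f"## {heading}")
        lines.append("")
        for title, url, note in links:
            # Append an optional one-line description after the link.
            suffix = f": {note}" if note else ""
            lines.append(f"- [{title}]({url}){suffix}")
        lines.append("")
    return "\n".join(lines)

sections = {
    "Docs": [("Quickstart", "https://example.com/docs/quickstart.md",
              "Set up a project in five minutes")],
    "Optional": [("Blog", "https://example.com/blog.md", "")],
}

print(build_llms_txt("Example Corp",
                     "Example Corp provides cloud hosting.", sections))
```

Regenerating the file from the same data source that drives your documentation (step 5) keeps it current without manual edits.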

Common Mistakes to Avoid

  • Listing every page instead of curating the most important ones
  • Linking to noisy HTML pages instead of clean markdown versions
  • Treating llms.txt as a one-time setup instead of maintaining it alongside documentation updates
  • Forgetting to update llms.txt when product details, pricing, or policies change
