When someone asks ChatGPT, Claude, or Perplexity about your industry, does the AI know your business exists? More importantly — when it does reference your site, does it describe you accurately?
This is the new frontier of search visibility. AI language models are rapidly becoming how people discover products, services, and information. And right now, most websites have zero control over how these models interpret and present their content.
Enter llms.txt — a simple text file that tells AI models who you are, what your site is about, and how they should reference your content. Think of it as robots.txt for the AI age.
What Is LLMs.txt?
llms.txt is an emerging standard that provides AI language models with structured information about your website. While robots.txt has controlled search engine crawler access since the 1990s, llms.txt addresses a fundamentally different question: not whether AI can access your content, but how it should use it.
The file lives at yoursite.com/llms.txt and contains structured information about your site — your business name, description, key content areas, and usage preferences. When an AI model encounters your site, it reads this file to understand context before generating responses about you.
LLMs.txt vs. Robots.txt
| File | Controls | Audience |
|---|---|---|
| robots.txt | Whether crawlers can access pages | Search engine bots (Googlebot, Bingbot) |
| llms.txt | How AI models should use your content | AI providers (OpenAI, Anthropic, Google) |
| llms-full.txt | Extended version with detailed content | AI providers (for deeper context) |
Both files work together. robots.txt is the gatekeeper — it decides who gets in. llms.txt is the briefing document — it tells authorized visitors what matters and how to represent you.
Why LLMs.txt Matters for Your Site
AI search is no longer hypothetical. Millions of people use ChatGPT, Claude, Perplexity, and Google's AI Overviews as their primary way to find information. When someone asks "What's the best WordPress security plugin?", the AI's response is shaped by whatever context it has about your site.
Without an llms.txt file, that context is whatever the model scraped during training — which might be outdated, incomplete, or entirely wrong. With llms.txt, you provide authoritative, structured context that AI models can use to describe you accurately.
Here's what llms.txt lets you control:
- Brand narrative — Tell AI models exactly how to describe your business, products, and expertise
- Attribution — Request that AI models cite your site when referencing your content
- Content usage — Decide what AI can summarize versus what should be visited directly
- Site structure — Guide AI models to your most important pages and content areas
Most websites don't have an llms.txt file yet. Setting one up now means AI models have better context about your site than about your competitors' sites, which translates to more accurate (and more frequent) AI-generated references to your business.
What Goes in an LLMs.txt File
An llms.txt file follows a simple markdown-like structure. Here's what a real one looks like:
```
# Royal Plugins
> WordPress plugins for security, SEO, and performance optimization.

## About
- [About Us](https://royalplugins.com/about/): Learn about our team and mission
- [Contact](https://royalplugins.com/contact/): Get in touch with support

## Documentation
- [Getting Started](https://royalplugins.com/docs/getting-started/): Beginner setup guide
- [SEObolt Guide](https://royalplugins.com/docs/seobolt/): SEO plugin documentation

## Products
- [SEObolt Pro](https://royalplugins.com/seobolt/): Complete WordPress SEO plugin
- [GuardPress](https://royalplugins.com/guardpress/): WordPress security plugin

## Policies
- [Terms of Service](https://royalplugins.com/terms/)
- [Privacy Policy](https://royalplugins.com/privacy/)
```
The format is intentionally straightforward. Each section groups related pages with brief descriptions, giving AI models a structured map of your site's most important content.
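The structure is simple enough that a few lines of code can turn it into data. Here's a minimal parser sketch in Python; the patterns it matches (H1 title, blockquote summary, H2 sections, markdown link entries) are inferred from the example above, not from a formal grammar:

```python
import re

def parse_llms_txt(text):
    """Parse llms.txt content into its title, summary, and sections."""
    doc = {"title": None, "summary": None, "sections": {}}
    current = None
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("# ") and doc["title"] is None:
            doc["title"] = line[2:]            # H1: site name
        elif line.startswith("> "):
            doc["summary"] = line[2:]          # blockquote: site description
        elif line.startswith("## "):
            current = line[3:]                 # H2: section heading
            doc["sections"][current] = []
        elif line.startswith("- ") and current:
            # "- [Title](URL): optional description" link entries
            m = re.match(r"- \[(.+?)\]\((.+?)\)(?::\s*(.*))?", line)
            if m:
                doc["sections"][current].append({
                    "title": m.group(1),
                    "url": m.group(2),
                    "description": m.group(3) or "",
                })
    return doc
```

Feeding the Royal Plugins example through this parser yields a dictionary with the site name, the one-line description, and a list of linked pages per section, which is roughly the "structured map" an AI model extracts from the file.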
Content Directives
Beyond site structure, llms.txt supports directives that control how AI uses your content:
- Allow Summarization — Whether AI models can create summaries of your pages
- Allow Quotation — Whether AI models can directly quote your content
- Require Attribution — Whether AI models must cite your site when referencing you
- Preferred Citation — The exact format you want AI to use when citing you
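The emerging standard doesn't yet fix a syntax for these directives, and SEObolt's generated output may differ, but a directives section might look something like this (hypothetical formatting, shown only to illustrate the idea):

```
## Directives
- Allow-Summarization: yes
- Allow-Quotation: yes
- Require-Attribution: yes
- Preferred-Citation: "Royal Plugins (royalplugins.com)"
```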
You can also generate an extended llms-full.txt file with more detailed descriptions of each page and section. This gives AI models richer context — useful if you run a documentation-heavy site or reference resource.
Setting Up LLMs.txt with SEObolt
You could create an llms.txt file manually and upload it to your server. But maintaining it as your site evolves is tedious, and getting the format wrong means AI models ignore it entirely.
SEObolt generates and maintains your llms.txt file automatically. The setup takes about five minutes.
Step 1: Enable LLMs.txt
Navigate to SEObolt > Settings > General > LLMs.txt and toggle it on. That's it — SEObolt immediately generates a basic llms.txt file at your site root.
Step 2: Configure Your Site Info
Fill in three fields:
- Site Name — Your business or website name
- Site Description — A brief description AI models will use for context (one or two sentences)
- Contact — Your support email or contact page URL
Step 3: Define Content Sections
Add sections that map to your site's key content areas — About, Documentation, Blog, Products, Policies. For each section, link to the relevant pages with brief descriptions.
Step 4: Set Content Preferences
Configure your directives: allow or disallow summarization, quotation, and whether attribution is required. For most businesses, the recommended approach is to allow summarization and quotation while requiring attribution.
Step 5: Save and Verify
Click Save, then visit yoursite.com/llms.txt in your browser. You should see a clean, formatted text file with your configuration. SEObolt regenerates this file automatically whenever you update your settings.
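If you'd rather verify from the command line than in a browser, a short script can sanity-check the file's structure. This is a rough sketch: it only tests the basics of the format shown earlier, and the URL in the usage comment is a placeholder you'd replace with your own domain:

```python
def check_llms_txt(text):
    """Return a list of structural problems found in llms.txt content."""
    problems = []
    lines = [l.strip() for l in text.splitlines() if l.strip()]
    if not lines or not lines[0].startswith("# "):
        problems.append("missing H1 title on the first line")
    if not any(l.startswith("> ") for l in lines):
        problems.append("missing blockquote site description")
    if not any(l.startswith("## ") for l in lines):
        problems.append("no content sections defined")
    return problems

# Example usage (replace the domain with your own site first):
#   from urllib.request import urlopen
#   text = urlopen("https://yoursite.com/llms.txt").read().decode("utf-8")
#   print(check_llms_txt(text) or "looks OK")
```

An empty list means the file has the expected skeleton; anything else points at the piece that's missing.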
LLMs.txt generation is available on all SEObolt tiers, including the free version on WordPress.org. No Pro license required.
Working with AI Crawlers and Robots.txt
llms.txt and robots.txt serve different purposes, but they work together as part of your AI content strategy.
If you want to block specific AI crawlers entirely, you still use robots.txt:
```
User-agent: GPTBot
Disallow: /private/

User-agent: ClaudeBot
Disallow: /private/

User-agent: Google-Extended
Disallow: /
```
But for crawlers you do allow, llms.txt provides the guidance layer. It's not an either-or decision — you can block some AI crawlers via robots.txt while providing structured context to others via llms.txt.
Blocking all AI crawlers means your business won't appear in AI-generated answers at all. For most sites, this is worse than having some presence with imperfect context. The better strategy is to guide AI with llms.txt rather than hide from it.
Troubleshooting Common Issues
LLMs.txt Returns a 404
- Confirm the feature is enabled in SEObolt settings
- Flush permalinks: go to Settings > Permalinks and click Save (no changes needed)
- Check for a physical llms.txt file in your web root; if one exists from a manual setup, delete it. SEObolt generates the file dynamically.
- Ensure you're using pretty permalinks (not "Plain")
Content Appears Outdated
- Re-save your settings in SEObolt to trigger regeneration
- Clear your site cache and CDN cache — a cached version may be stale
AI Models Aren't Following Directives
The llms.txt standard is still emerging, and not all AI providers fully honor every directive yet. Adoption is growing steadily. Having the file in place means you're ready as support expands — and it already influences AI models that do check for it.