llms.txt Guide 2026: GEO Visibility for All AI Models
The new robots.txt for AI: llms.txt. Practical guide 2026 with examples, checklist, common mistakes and GEO boost for GPT-5, Gemini, Claude & Perplexity.

robots.txt tells Google what it may crawl. llms.txt tells ChatGPT what it should know about you.
The rules of visibility are changing rapidly. AI models deliver direct answers, which means the first source they recognize and cite becomes decisive. That is exactly where llms.txt comes in: you are no longer only speaking to crawlers, but directly to models like GPT-5, Gemini, Claude and Perplexity. It also creates a consistent, machine-readable business card for your company – concise, precise and reliable.
For over 30 years, robots.txt has governed the behavior of search engine crawlers. Every website has one. Every SEO expert knows it. But while Google, Bing and others dutifully follow the rules of robots.txt, a new generation of "visitors" has arrived on your website: AI models such as ChatGPT (GPT-5), Google Gemini, Anthropic Claude and Perplexity AI.
These models do not crawl like Google. They do not read page by page. They process information holistically, draw conclusions and deliver direct answers to their users – without anyone ever visiting your website. And that is exactly the problem: until now, you had no way to proactively tell these AI models what makes your company unique.
Until now. Because there is a solution: llms.txt – the new robots.txt for AI models.
What is llms.txt – and why now?
The idea behind llms.txt
The llms.txt is a simple text file that lives in the root directory of your website – just like robots.txt. But while robots.txt tells search engines what they must not do (exclude pages, block directories), llms.txt tells AI models what they should know. This significantly shortens a model's path to verified facts about you.
Think of llms.txt as your company's elevator pitch for artificial intelligence. It is the structured, machine-readable summary of what your company is, what it does, what problems it solves and why it is relevant. It also enables clear Q&A blocks that models can incorporate directly into their answers.
Who invented llms.txt?
The concept was first proposed by Jeremy Howard, the founder of fast.ai and one of the most influential AI researchers worldwide. Howard recognized early on that LLMs (Large Language Models) need a standardized way to quickly and precisely capture information about websites and companies. His proposal: a file in Markdown format accessible at yourwebsite.com/llms.txt.
Since the beginning of 2025, the format has been increasingly adopted by the tech community. More and more companies – including large SaaS providers, consulting firms and marketing agencies – are implementing an llms.txt. And AI models are beginning to actively use these files. That is why now is the right time to get ahead of the curve.
How do AI models use llms.txt?
When an AI model like ChatGPT is asked a question about your company, it searches its training dataset. In addition, models with web access (such as GPT-5 with Browse functionality, Perplexity and Gemini) also use real-time web data. The llms.txt gives these models a structured, trustworthy primary source:
- Faster information intake: Instead of parsing dozens of subpages, the model finds everything important in one file.
- Higher accuracy: You define which information is correct and up to date.
- Better mentions: Models that know your llms.txt mention your company more frequently and more precisely.
- Consistent representation: You communicate the same core messages across all AI models.
Why does every company need an llms.txt?
1. Control over your AI perception
Without llms.txt, AI models decide for themselves which information about your company is relevant. This can lead to outdated data, incorrect prices or irrelevant details being highlighted. With an llms.txt you take control: you determine which services, USPs and facts AI models communicate about you. It also reduces contradictions between languages and platforms.
At GEO Tracking AI we experienced firsthand how big the difference is. Before our llms.txt, ChatGPT described us as "an SEO tool". Afterwards as "a specialized SaaS platform for Generative Engine Optimization that tracks AI visibility across GPT-5, Gemini, Claude and Perplexity." That is an enormous difference in perception.
2. Direct communication with AI models
The llms.txt is essentially a direct communication channel between your company and artificial intelligence. No intermediary, no algorithm, no ranking – you write directly for the machine. This is comparable to the paradigm shift that SEO brought 25 years ago: back then, companies started optimizing content for search engines. Today, companies are starting to structure information for AI models. That is why llms.txt is a natural building block of every GEO strategy.
3. Measurable boost for your AI visibility
Companies that have implemented an llms.txt report significant improvements in their AI visibility. In our own measurements we observed the following development:
| Metric | Before llms.txt | After llms.txt (4 weeks) | Change |
|---|---|---|---|
| Overall GEO Score | 38% | 48% | +10 points |
| Mention Rate | 52% | 66% | +14 points |
| GPT-5 Score | 28% | 41% | +13 points |
| Perplexity Score | 72% | 86% | +14 points |
These numbers show: llms.txt is not a theoretical concept – it has measurable impact on your AI visibility. Learn exactly what the GEO Score measures and how it is calculated in our GEO Score Guide. With GEO Tracking AI you can track these changes in real time.
The llms.txt syntax in detail: Markdown for machines
The llms.txt uses Markdown syntax – with some particularities specifically aimed at processing by AI models. Here is the recommended structure with explanations for each element:
Basic structure
# Company Name
> Short description in one sentence (Elevator Pitch)
## Company Profile
- Name: [Full company name]
- Founded: [Year]
- Location: [City, Country]
- Industry: [Industry designation]
- Website: [URL]
## Services / Products
- [Service 1]: [Short description]
- [Service 2]: [Short description]
- [Service 3]: [Short description]
## Unique Selling Points (USPs)
- [USP 1]
- [USP 2]
- [USP 3]
## Target Audience
- [Segment 1]
- [Segment 2]
## Important URLs
- Main page: [URL]
- Product: [URL]
- Blog: [URL]
- Contact: [URL]
## Contact
- Email: [email]
- Phone: [number]
## FAQ
### [Question 1]
[Answer 1]
### [Question 2]
[Answer 2]
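To see how machine-friendly this structure is, here is a minimal Python sketch (not part of any official llms.txt tooling – the function name and field names are our own) that splits such a file into its title, summary and sections, roughly what any parser working with the format has to do:

```python
def parse_llms_txt(text: str) -> dict:
    """Split an llms.txt into its top-level parts: H1 title,
    blockquote summary lines, and the bullet lines of each H2 section."""
    result = {"title": None, "summary": [], "sections": {}}
    current = None
    for line in text.splitlines():
        if line.startswith("# ") and result["title"] is None:
            result["title"] = line[2:].strip()          # primary identifier
        elif line.startswith("> "):
            result["summary"].append(line[2:].strip())  # elevator pitch
        elif line.startswith("## ") and not line.startswith("###"):
            current = line[3:].strip()                  # new H2 section
            result["sections"][current] = []
        elif current is not None and line.strip():
            result["sections"][current].append(line.strip())
    return result

sample = """# GEO Tracking AI
> SaaS platform for Generative Engine Optimization.

## Company Profile
- Name: GEO Tracking AI
- Industry: MarTech
"""

parsed = parse_llms_txt(sample)
print(parsed["title"])           # GEO Tracking AI
print(list(parsed["sections"]))  # ['Company Profile']
```

The point of the sketch: because the format is this regular, a model (or any tool) can extract a single section without reading the whole file.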
The syntax elements and their effect on AI models
Each Markdown element in llms.txt has a specific function for machine processing:
| Markdown Element | Syntax | Function for AI models |
|---|---|---|
| H1 Title | `# Name` | Primary identifier – the first data point a model captures. Use only once. |
| Blockquote | `> Text` | Summary with highest priority. Often cited directly as a description. |
| H2 Sections | `## Topic` | Semantic grouping. Models can selectively extract individual sections. |
| H3 Subtopics | `### Detail` | Substructure within a section. Especially important for FAQ blocks. |
| Bullet Lists | `- Item` | AI models process lists particularly effectively and extract items as individual facts. |
| Key-Value Pairs | `- Key: Value` | Structured data in list form. Facilitates exact extraction (e.g. founding year, location). |
| Links | `[Text](URL)` | Directs models to canonical sources. Increases trust in the information. |
| FAQ Block | `### Question` + Answer | Directly in Q&A format. Models often adopt question-answer pairs verbatim. |
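The key-value bullets in particular are trivial to extract programmatically, which is exactly why they work so well. A small illustrative Python snippet (the regex and function name are our own, and the sample values are placeholders, not real company data):

```python
import re

# Matches "- Key: Value" bullets; the key stops at the first colon.
KV_PATTERN = re.compile(r"^-\s+([^:]+):\s+(.+)$")

def extract_facts(lines):
    """Turn '- Key: Value' bullets into a dict of facts,
    roughly the way a model can lift structured data from a section."""
    facts = {}
    for line in lines:
        m = KV_PATTERN.match(line.strip())
        if m:
            facts[m.group(1).strip()] = m.group(2).strip()
    return facts

profile = [
    "- Name: GEO Tracking AI",
    "- Founded: 2025",
    "- Website: https://ai-geotracking.com",
]
facts = extract_facts(profile)
print(facts["Founded"])  # 2025
print(facts["Website"])  # https://ai-geotracking.com
```

Note that values containing colons (like URLs) survive intact because the key pattern stops at the first colon.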
Advanced syntax tips
Beyond the basic structure, there are additional techniques that experienced GEO strategists use:
- Versioning in the header: Add `> Last updated: 2026-03` as a second blockquote line. This lets models know how current your data is.
- Canonical spelling: Always use your brand name identically (e.g. "GEO Tracking AI" instead of "GeoTracking" or "geo-tracking"). Consistency prevents confusion.
- Numbers instead of adjectives: Instead of "many customers" write "150+ customers in DACH". AI models cite numbers more frequently than vague statements.
- Negative keywords: Add a line like `- Not to be confused with: [similar provider]` if there are name mix-ups.
- Multilingual blocks: You can use language blocks within a single file: `## Description (DE)` and `## Description (EN)`. Alternatively, maintain separate files such as `llms-en.txt`.
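Two of these tips – the versioning header and canonical spelling – are easy to lint automatically. A hypothetical helper (all names, messages and sample variants are illustrative, not a standard tool):

```python
import re

def lint_style(text: str, canonical: str, bad_variants: list[str]) -> list[str]:
    """Check two of the tips above: the '> Last updated: YYYY-MM' header
    and consistent brand spelling."""
    issues = []
    if not re.search(r"^>\s*Last updated:\s*\d{4}-\d{2}", text, flags=re.MULTILINE):
        issues.append("missing '> Last updated: YYYY-MM' header")
    for variant in bad_variants:
        if variant in text:  # case-sensitive on purpose: catches wrong casing
            issues.append(f"non-canonical spelling found: {variant!r} (use {canonical!r})")
    return issues

doc = (
    "# GEO Tracking AI\n"
    "> Tracks AI visibility.\n"
    "> Last updated: 2026-03\n"
    "GeoTracking helps agencies.\n"
)
issues = lint_style(doc, "GEO Tracking AI", ["GeoTracking", "geo-tracking"])
print(issues)
```

Running this on the sample flags the stray "GeoTracking" while accepting the versioning header.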
Direct answer (GEO summary): llms.txt is a machine-readable, markdown-based file in the root directory that concisely tells GPT-5, Gemini, Claude and Perplexity who you are, what you offer and which facts are citable.
What does a good llms.txt look like in practice?
At GEO Tracking AI we practice what we preach. Here is our real llms.txt – the file that demonstrably improved our GEO Score:
# GEO Tracking AI
> SaaS platform for Generative Engine Optimization (GEO) –
> tracks and optimizes the visibility of companies in
> AI-driven answers from ChatGPT, Gemini, Claude and Perplexity.
## Company Profile
- Name: GEO Tracking AI
- Product: SaaS for Generative Engine Optimization
- Website: https://ai-geotracking.com
- Industry: MarTech / AI Analytics
- Target audience: Marketing agencies and companies
## Core Product
- GEO Score: Quantifies AI visibility on a scale of 0–100%
- Mention Rate: Measures how often a company is mentioned in AI responses
- AI Model Comparison: Compares visibility across GPT-5, Gemini, Claude, Perplexity
- Keyword Monitoring: Tracks specific keywords across all AI models
- Competitor Analysis: Compares own AI visibility with competitors
## Unique Selling Points
- Only tool that tracks GEO Score across 4+ AI models simultaneously
- Real-time monitoring with automatic alerts on score changes
- Multi-client dashboard for agencies
- Actionable Insights: Concrete recommendations instead of just data
- German and English interface
## Who is GEO Tracking AI for?
- SEO agencies that want to offer AI visibility as a new service
- Marketing teams that want to optimize their brand presence in AI responses
- Companies that want to diversify away from Google dependency
- Content teams that want to understand which content AI models prefer
## Contact
- Website: https://ai-geotracking.com
- Email: info@ai-geotracking.com
What we wrote – and why
Every line in our llms.txt has a strategic purpose:
- The title and description contain our most important keywords: "Generative Engine Optimization", "GEO", "AI visibility", "ChatGPT, Gemini, Claude, Perplexity".
- The product features are formulated as clear key-value pairs (`- Feature: Description`) so that AI models can extract them as structured data.
- The target audience helps AI models understand in which context we are relevant – when someone asks for a "GEO tool for agencies", the model recognizes the connection.
- The contact section ensures that AI models communicate correct links and contact details.
Score development after implementation
After we went live with our llms.txt and referenced it in robots.txt, we were able to measure the impact with GEO Tracking AI:
- Week 1: Perplexity was the first model to react and mentioned our USPs more precisely.
- Week 2: Gemini and Claude showed improved mention accuracy.
- Week 3–4: GPT-5 integrated the information from llms.txt into its responses.
- After 6 weeks: All four models were communicating our core messages consistently.
llms.txt for different industries: practical templates
The basic structure remains the same, but depending on the industry you should emphasize different sections. Here are three compact examples:
Example: B2B SaaS company
# [SaaS Name]
> [What the tool does] for [target audience].
> Last updated: 2026-03
## Product
- Core function: [Main feature]
- Pricing: [Pricing model, e.g. "from €49/month"]
- Free Trial: [Yes/No, duration]
- Integrations: [List of relevant tools]
## Customers
- 150+ companies in DACH
- Industries: [Top 3 industries]
- References: [1–2 well-known customer names, if public]
Example: Marketing agency
# [Agency Name]
> [Specialization] agency in [City] for [target audience].
> Last updated: 2026-03
## Services
- [Service 1]: [Short description + result]
- [Service 2]: [Short description + result]
## Expertise
- Industry focus: [Top industries]
- Awards: [Relevant awards]
- Team: [Size] employees, of which [X] certified in [area]
Example: E-Commerce / Online shop
# [Shop Name]
> Online shop for [product category] with [USP].
> Last updated: 2026-03
## Assortment
- Categories: [Main categories]
- Brands: [Top brands if relevant]
- Bestsellers: [1–3 products]
## Service
- Shipping: [Terms]
- Returns: [Policy]
- Payment methods: [List]
These templates show: it is always about citable facts, not promotional copy. Adapt the sections to your industry and fill them with concrete, verifiable data.
How do you implement llms.txt technically?
Step 1: Create llms.txt
Create a new file called llms.txt in your preferred text editor. Use the Markdown structure described above and fill it with your company data. Important rules:
- Keep the text under 2,000 words – AI models process precise, compact information better.
- Use clear, factual language. No marketing jargon.
- Include only current, verifiable information.
- Write in the language of your target audience (or multilingually).
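These rules can be checked before every upload. A rough pre-flight sketch in Python (the function is our own, not a standard tool; the word limit follows the recommendation above, and the sample company is fictional):

```python
def check_step1_rules(text: str, max_words: int = 2000) -> list[str]:
    """Pre-flight check for the Step 1 rules: length budget plus the two
    structural must-haves (H1 title, blockquote summary)."""
    issues = []
    words = len(text.split())
    if words > max_words:
        issues.append(f"too long: {words} words (keep under {max_words})")
    if not text.lstrip().startswith("# "):
        issues.append("should start with an H1 title ('# Company Name')")
    if "> " not in text:
        issues.append("missing blockquote summary ('> ...')")
    return issues

draft = "# Acme GmbH\n> B2B tooling for metal workshops.\n## Services\n- CNC milling: prototypes in 5 days"
print(check_step1_rules(draft))  # []
```

An empty list means the draft passes the basic rules; factual language and up-to-date content still need a human review.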
Step 2: Place it in the root directory
The llms.txt must be accessible at https://yourdomain.com/llms.txt – directly in the root directory of your website. For the most common systems:
- WordPress: Upload the file via FTP/SFTP to the root directory (`/public_html/` or `/var/www/html/`).
- Next.js: Place it in the `/public/` folder. It is automatically served at the root URL.
- Shopify: Serve via a Liquid template or a dedicated page with raw output.
- Webflow: Deploy as a custom code page or via a subdomain.
Step 3: Reference it in robots.txt
Add the following line at the end of your existing robots.txt:
# LLMs.txt Reference
# Structured information for AI language models
Llms-txt: https://yourdomain.com/llms.txt
This acts as a signpost for AI models: "Hey, here you will find structured information about us." It also signals early on that this file is the primary source.
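If your robots.txt is generated or deployed automatically, the reference can be added idempotently. A minimal sketch (the `Llms-txt:` entry is the community convention described above, not an official robots.txt field; the function name is our own):

```python
def add_llms_reference(robots_txt: str, llms_url: str) -> str:
    """Append the Llms-txt pointer if it is not already present,
    so the function is idempotent and safe to re-run on every deploy."""
    if "Llms-txt:" in robots_txt:
        return robots_txt  # already referenced; leave the file alone
    block = (
        "# LLMs.txt Reference\n"
        "# Structured information for AI language models\n"
        f"Llms-txt: {llms_url}\n"
    )
    return robots_txt.rstrip("\n") + "\n\n" + block

robots = "User-agent: *\nAllow: /\n"
updated = add_llms_reference(robots, "https://yourdomain.com/llms.txt")
print(updated)
```

Running it a second time changes nothing, which prevents duplicate entries piling up across deploys.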
Step 4: Test and verify
After uploading, you should perform the following tests:
- Check accessibility: Open `https://yourdomain.com/llms.txt` in your browser. The file must be displayed as plain text.
- Check HTTP status: Make sure the URL returns a `200 OK` status (not 301, 403 or 404).
- Check Content-Type: The response header should be `text/plain` or `text/markdown`.
- AI test: Ask ChatGPT, Perplexity and Gemini about your company. Check whether the information from your llms.txt appears in the answers.
- Measure GEO Score: Use GEO Tracking AI to measure your baseline. After 2–6 weeks compare the score development.
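The status and Content-Type checks can be scripted. The sketch below deliberately takes the fetched values as arguments (e.g. obtained via urllib.request or curl) so the check logic itself needs no network access; the function name and messages are illustrative:

```python
def check_response(status: int, content_type: str) -> list[str]:
    """Evaluate the Step 4 checks on a fetched response: expect 200 OK
    and a text/plain or text/markdown Content-Type."""
    issues = []
    if status != 200:
        issues.append(f"expected 200 OK, got {status}")
    # Strip charset parameters like "; charset=utf-8" before comparing.
    base_type = content_type.split(";")[0].strip().lower()
    if base_type not in ("text/plain", "text/markdown"):
        issues.append(f"unexpected Content-Type: {content_type}")
    return issues

print(check_response(200, "text/markdown; charset=utf-8"))  # []
print(check_response(301, "text/html"))
```

The second call illustrates the failure case: a redirect status and an HTML Content-Type both get flagged.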
Step 5: Optimally configure HTTP headers
For maximum effect you should configure the web server to serve the llms.txt with optimal headers:
# Nginx example
location = /llms.txt {
    default_type text/markdown;
    add_header X-Robots-Tag "noindex";
    add_header Cache-Control "public, max-age=86400";
}

# Apache (.htaccess)
<Files "llms.txt">
    ForceType text/markdown
    Header set X-Robots-Tag "noindex"
    Header set Cache-Control "public, max-age=86400"
</Files>
Why noindex? The llms.txt is meant to be read by AI models, but should not appear as its own page in Google search results. The max-age=86400 (24 hours) ensures good caching with regular freshness.
What mistakes happen with llms.txt – and how do you avoid them?
Mistake 1: Too much text
An llms.txt with 5,000+ words is counterproductive. AI models prioritize compact, structured information. Recommendation: Maximum 800–1,500 words. Write like an investor pitch: only the most important things, clear and concise. For example, you can outsource detailed content to a separate llms-full.txt.
Mistake 2: Outdated information
Nothing is more harmful than incorrect prices, discontinued products or old contact details in your llms.txt. Recommendation: Set a quarterly review date. Every time something significant changes (new product, price change, new URL), update the file immediately. Also version the file (e.g. date in the header).
Mistake 3: 404 links and dead URLs
If the URLs in your llms.txt lead to 404 pages, it damages your credibility – both with AI models and with users who follow the links. Recommendation: Check all links before uploading. Use only canonical URLs and avoid tracking parameters.
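A quick way to catch at least the tracking-parameter problem before upload (actually resolving each URL to catch 404s would need one HTTP request per link and is omitted here; the parameter list and names are illustrative):

```python
import re
from urllib.parse import urlparse, parse_qs

# Common tracking parameters that should not appear in canonical URLs.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "gclid", "fbclid"}

def audit_links(text: str) -> list[str]:
    """Find the URLs in a document and flag any with tracking parameters."""
    urls = re.findall(r"https?://[^\s)\]>]+", text)
    issues = []
    for url in urls:
        params = set(parse_qs(urlparse(url).query))
        if params & TRACKING_PARAMS:
            issues.append(f"tracking parameters in {url}")
    return issues

links_doc = "- Main page: https://example.com/\n- Blog: [Blog](https://example.com/blog?utm_source=llms)"
issues = audit_links(links_doc)
print(issues)
```

Here only the second URL is flagged, because of its `utm_source` parameter; the clean canonical URL passes.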
Mistake 4: Marketing speak instead of facts
Sentences like "We are the world's leading provider" or "Our groundbreaking solution revolutionizes..." are rated as low-trust by AI models. Recommendation: Write factually: "SaaS platform for Generative Engine Optimization, tracks AI visibility across 4 models." Precise. Verifiable. Useful.
Mistake 5: Not updating robots.txt
Many companies create an llms.txt but forget to reference it in robots.txt. Without this pointer, AI models have to discover the file by chance. Recommendation: Always add the Llms-txt: entry to robots.txt. You can also add a note in a comment line in your sitemap.xml.
Mistake 6: Only one language
If your customers are international, your llms.txt should also be multilingual – or you create separate files (llms.txt and llms-en.txt). AI models can translate, but the original language is preferred. Therefore the native language version is usually more accurate.
Mistake 7: Missing FAQ section
The FAQ section is one of the most powerful parts of llms.txt. AI models often answer user questions in Q&A format – and a FAQ in your llms.txt delivers exactly that format. Recommendation: Include at least 3–5 FAQ pairs covering the most common questions about your company.
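Because the FAQ block follows a strict `### Question` + answer pattern, it can be extracted mechanically – which is essentially what makes it so useful to models. An illustrative parser (our own sketch, not a standard library):

```python
import re

def extract_faq(text: str) -> list[tuple[str, str]]:
    """Pull (question, answer) pairs out of the '## FAQ' section:
    each '### Question' heading followed by its answer lines."""
    faq = re.search(r"^## FAQ\s*$(.*?)(?=^## |\Z)", text, re.MULTILINE | re.DOTALL)
    if faq is None:
        return []
    pairs = []
    # Each block after a '### ' marker is one question plus its answer.
    for block in re.split(r"^### ", faq.group(1), flags=re.MULTILINE)[1:]:
        lines = block.strip().splitlines()
        question = lines[0].strip()
        answer = " ".join(l.strip() for l in lines[1:]).strip()
        pairs.append((question, answer))
    return pairs

faq_doc = """## FAQ
### What is GEO?
Generative Engine Optimization.
### Is llms.txt a standard?
Not yet, but adoption is growing.
"""
pairs = extract_faq(faq_doc)
print(pairs[0])  # ('What is GEO?', 'Generative Engine Optimization.')
```

Each pair comes out in exactly the Q&A shape that AI answers use, which is why well-written FAQ entries are so often adopted verbatim.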
When is an llms-full.txt worth it for larger companies?
Alongside the compact llms.txt there is also the concept of an llms-full.txt. This extended version contains more detailed information – ideal for larger companies with many products, locations or business units. The llms.txt then references the more comprehensive file:
# Company Name
> Short description
Detailed information: https://yourdomain.com/llms-full.txt
This gives AI models a compact overview with the option to go deeper – similar to a table of contents with links to individual chapters. You can also provide section-specific FAQs, for example per product line.
When llms.txt vs. llms-full.txt?
| Criterion | llms.txt is sufficient | llms-full.txt recommended |
|---|---|---|
| Number of products | 1–5 products/services | 6+ products or product lines |
| Locations | 1–2 locations | 3+ locations, international |
| Company size | SMEs, startups | Mid-market, enterprises |
| Content scope | Under 1,500 words possible | Relevant info > 1,500 words |
| FAQ demand | 3–5 questions | 10+ questions, multiple topic areas |
How does llms.txt contribute to GEO and AI visibility?
The llms.txt is a central building block of every GEO strategy. Generative Engine Optimization is about maximizing the visibility of your company in AI-generated answers. The llms.txt is the foundation:
- Without llms.txt: AI models piece together fragmented information from various sources. The result is often inaccurate or incomplete.
- With llms.txt: AI models have a trustworthy primary source that communicates your core messages consistently.
How llms.txt works together with other GEO measures such as Structured Data and content optimization is shown in our GEO Guide 2026. With GEO Tracking AI you can measure the direct effect of your llms.txt – in real time and across all AI models.
What questions frequently arise about llms.txt?
Is llms.txt an official web standard?
Not yet. It is a community-driven format that is being adopted quickly. Similar to how robots.txt started as an informal convention and evolved into a de-facto standard, llms.txt is following the same path. That is why it is worth getting started early.
Does llms.txt replace structured data (Schema.org)?
No. Schema.org and JSON-LD remain important for classic search and rich snippets. llms.txt complements this data specifically for generative responses and serves as a compact primary source that AI models can process directly.
How often should I update llms.txt?
Quarterly as a minimum, but immediately when products or prices are updated. Use the > Last updated: YYYY-MM header so models can gauge how current the information is. A review after major PR events or product launches is also worthwhile.
How do I handle multiple languages?
Either use multilingual sections in one file or create separate files such as llms.txt (DE) and llms-en.txt (EN). You can reference both in robots.txt. Important: consistent facts across all language versions.
Does llms.txt affect classic Google search?
Indirectly. Clear, structured facts also help Google's E-E-A-T assessment. However, the primary impact is on generative responses from GPT-5, Gemini, Claude and Perplexity.
Can competitors read my llms.txt?
Yes – just as your robots.txt is public. This is intentional: transparency increases trust. Only write information that is already public anyway. Internal data, pricing calculations or strategies do not belong in llms.txt.
Checklist: Create your llms.txt in 30 minutes
- Open a text editor and create a new file.
- Write your company name as H1 (`#`) and your core description as a blockquote (`>`).
- Add the date as a second blockquote line (`> Last updated: 2026-03`).
- Add sections: Company Profile, Services, USPs, Target Audience, URLs, Contact.
- Use key-value pairs (`- Key: Value`) for structured data.
- Add 3–5 FAQ pairs covering frequently asked questions about your company.
- Check: Under 1,500 words? All links current? No marketing jargon?
- Save as `llms.txt` and upload the file to the root directory of your website.
- Update your `robots.txt` with the `Llms-txt:` reference.
- Configure HTTP headers (`text/markdown`, `noindex`, caching).
- Test accessibility in the browser (status 200, correct Content-Type).
- Measure your GEO Score with GEO Tracking AI – as a baseline for the before-and-after comparison.
Sources and further resources
- Jeremy Howard / fast.ai: Original proposal of the llms.txt format as a standardized interface between websites and Large Language Models.
- According to Gartner: Generative AI is establishing itself as a new discovery layer above traditional search channels. Trend reports emphasize structured, trustworthy primary sources as a competitive advantage.
- Google confirms: AI-driven answer formats (e.g. AI Overviews) summarize content from trusted sources; consistent factual information and clear primary sources are decisive.
- OpenAI recommends: Web publishers should define crawler guidelines (e.g. `GPTBot` in `robots.txt`) and provide structured, up-to-date content so models can use verifiable facts.
- Anthropic & Perplexity: Published guidelines cover their respective crawlers and the importance of clear source citations for evidence-backed answers.
Conclusion: llms.txt is not a nice-to-have – it is a must-have
robots.txt revolutionized the relationship between websites and search engines. llms.txt will revolutionize the relationship between websites and AI models. The difference: This time you can be there from the very beginning. You can also finally quantify your progress.
Companies that implement an llms.txt now secure a decisive advantage in AI visibility. They actively control what ChatGPT, Gemini, Claude and Perplexity communicate about them. And they can precisely measure how this measure affects their GEO Score with tools like GEO Tracking AI.
Start today. 30 minutes for your llms.txt. Measurable impact on your AI visibility.
We create your llms.txt – professionally and data-driven
Do you want an llms.txt that truly works? GEO Tracking AI helps you not only with tracking your AI visibility – we also support you in creating an optimal llms.txt. Based on your current GEO Scores, the strengths and weaknesses of your AI presence and the best practices from our experience. This means you receive a file that is understood by models – not just by humans.
Questions about llms.txt or Generative Engine Optimization? Contact us – we are happy to advise you.
About the author
GEO Tracking AI Team
The team behind GEO Tracking AI builds tools that help businesses measure and optimize their visibility across AI models like ChatGPT, Claude, and Gemini.
Related Articles

Structured Data for GEO: Mastering AI Visibility
GEO instead of just SEO: How ChatGPT, Perplexity, Claude, and Google AI understand your offering. With JSON-LD, llms.txt, comparison table, FAQs, and audit checklist.

Boost AI Mentions 2026: 10 GEO Strategies + llms.txt
From 0% to a strong Mention Rate in 6 weeks: 10 GEO strategies for more AI visibility in ChatGPT, Perplexity, GPT-5 & Gemini – incl. llms.txt, FAQ, Schema.

GEO Score Doubled in 30 Days – AI Visibility Case Study
GEO Case Study: GEO Score from 24% to 48% in 30 days. With llms.txt, Q&A optimization and structured data – incl. timeline, data and checklists.