
LLMGenerator MCP Server: Generate llms.txt Files Directly in Claude, Cursor & Windsurf

Published:  at  10:00 AM


Key Takeaways

  • The LLMGenerator MCP Server exposes 9 tools for generating, managing, and validating llms.txt files — usable directly inside Claude, Cursor, Windsurf, or any MCP-compatible client.
  • Setup takes under two minutes: add a URL and your API key to your MCP config, and you’re done.
  • Your AI agent can autonomously crawl a site, generate an llms.txt file, and retrieve the result in a single conversation.

If you’ve ever had to tab out of your AI editor, open a browser, paste a URL into LLMGenerator, wait for the file, copy the result, and paste it back — this is for you.

The LLMGenerator MCP Server eliminates that round-trip entirely. It gives AI agents direct access to LLMGenerator’s full generation pipeline: crawl a site, generate an llms.txt file, check status, retrieve content, and validate quality — all without leaving your coding environment.

What Is MCP, and Why Does It Matter?

Model Context Protocol (MCP) is an open standard that lets AI agents call external tools during a conversation. Instead of requiring you to copy and paste between tools, the AI can take action directly — fetching data, running commands, writing files — based on your instructions.

Claude Desktop, Claude Code, Cursor, and Windsurf all support MCP servers natively. Once a server is configured, the AI gains access to its tools just as it does its own built-in capabilities.

The LLMGenerator MCP Server registers 9 tools with your AI client. From that point on, you can ask Claude to “generate an llms.txt for example.com” and it will — autonomously, step by step.
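Under the hood, each of those tool invocations is a JSON-RPC 2.0 message, as defined by the MCP specification. A call to generate_llms_txt looks roughly like this on the wire (field values illustrative):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "generate_llms_txt",
    "arguments": {
      "url": "https://example.com",
      "maxUrls": 50,
      "method": "simple"
    }
  }
}
```

You never write these messages yourself — your MCP client constructs them from the conversation.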

What You Can Do with the LLMGenerator MCP Server

The server exposes a complete toolset covering the full llms.txt lifecycle:

Generation

| Tool | What it does |
| --- | --- |
| generate_llms_txt | Crawl a website and start generating an llms.txt file |
| get_generation_status | Poll the async job until it completes and returns a site ID |

Two generation modes are available: Simple (1 credit per URL — fast, uses existing page titles) and Enhanced (2 credits per URL — AI rewrites titles and descriptions for better LLM comprehension).
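The credit cost is simple arithmetic: number of URLs times the per-URL rate for the chosen mode. A quick sketch (the rates come from the modes above; the helper itself is illustrative, not part of the API):

```python
# Credit cost per URL for each generation mode (rates from the modes above).
COST_PER_URL = {"simple": 1, "enhanced": 2}

def generation_cost(max_urls: int, method: str = "simple") -> int:
    """Return the total credit cost of a generation job."""
    return max_urls * COST_PER_URL[method]

print(generation_cost(50, "simple"))    # 50 credits
print(generation_cost(50, "enhanced"))  # 100 credits
```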

Site Management

| Tool | What it does |
| --- | --- |
| list_sites | List all previously generated sites with status and file URLs |
| get_site | Get full metadata for a specific site |
| get_llms_txt_content | Retrieve the actual llms.txt text (standard or full-text version) |

URL Discovery

| Tool | What it does |
| --- | --- |
| discover_urls | Crawl a website to discover its URL structure without generating |
| get_discovered_urls | Retrieve the list of discovered pages |

Validation & Credits

| Tool | What it does |
| --- | --- |
| validate_llms_txt | Validate any llms.txt content — returns a 0–100 quality score, errors, and suggestions (no auth required) |
| get_credit_balance | Check your credit balance and recent transactions |
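In agent scripts it is handy to gate on the validator's score. Assuming a result shaped like the tool's description above — a 0–100 score plus errors and suggestions; the exact field names here are assumptions, not the real schema — a threshold check might look like:

```python
# Hypothetical shape of a validate_llms_txt result; the field names are
# assumptions based on the tool description, not the documented schema.
result = {
    "score": 82,
    "errors": [],
    "suggestions": ["Add a one-line summary under the H1"],
}

def passes_quality_gate(result: dict, threshold: int = 70) -> bool:
    """Pass only if the file scores at or above threshold with no errors."""
    return result["score"] >= threshold and not result["errors"]

print(passes_quality_gate(result))  # True
```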

How to Connect It in Two Minutes

You need one thing: an API key from app.llmgenerator.com/settings/api-keys.

Claude Desktop

Open claude_desktop_config.json (found at ~/Library/Application Support/Claude/ on macOS, ~/.config/Claude/ on Linux, or %APPDATA%\Claude\ on Windows) and add:

```json
{
  "mcpServers": {
    "llmgenerator": {
      "url": "https://mcp.llmgenerator.com/mcp",
      "headers": {
        "Authorization": "Bearer llmgen_your_api_key_here"
      }
    }
  }
}
```

Restart Claude Desktop. Done.

Claude Code (CLI)

```bash
claude mcp add --transport http llmgenerator \
  https://mcp.llmgenerator.com/mcp \
  --header "Authorization: Bearer llmgen_your_api_key_here"
```

Cursor / Windsurf

Add the same URL (https://mcp.llmgenerator.com/mcp) and Authorization header in your MCP settings panel. The exact UI varies by version, but the values are identical.
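As an example, Cursor typically reads ~/.cursor/mcp.json (or a project-level .cursor/mcp.json), and Windsurf uses a similar mcp_config.json — the exact file location is version-dependent, so treat the path as an assumption and check your editor's docs. The shape mirrors the Claude Desktop config:

```json
{
  "mcpServers": {
    "llmgenerator": {
      "url": "https://mcp.llmgenerator.com/mcp",
      "headers": {
        "Authorization": "Bearer llmgen_your_api_key_here"
      }
    }
  }
}
```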

A Typical Agent Workflow

Once connected, your AI handles the full flow autonomously. Here’s what happens when you say “Generate an llms.txt for acmecorp.com”:

Step 1 — Start generation

The agent calls generate_llms_txt with your URL and preferred settings. It gets back a jobId immediately — generation runs in the background.

```
generate_llms_txt(url="https://acmecorp.com", maxUrls=50, method="simple")
→ { jobId: "gen_abc123", status: "pending" }
```

Step 2 — Poll for completion

The agent polls get_generation_status every few seconds until the job completes and returns a siteId.

```
get_generation_status(jobId="gen_abc123")
→ { status: "completed", siteId: "site_xyz789", progress: 100 }
```

Step 3 — Retrieve the file

The agent fetches the generated content directly.

```
get_llms_txt_content(siteId="site_xyz789")
→ # Acme Corp
  > Enterprise software for modern teams.
  ## Products
  - [Platform Overview](/platform): ...
  ...
```

The entire flow runs inside a single conversation. No tabs, no copy-pasting, no context switching.
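The steps above reduce to a simple poll loop on the agent's side. Here is a client-side sketch — `call_tool` stands in for whatever your MCP client uses to invoke a tool, so it is an assumption, not a real API:

```python
import time

def poll_until_complete(call_tool, job_id: str, interval: float = 3.0,
                        timeout: float = 300.0) -> str:
    """Poll get_generation_status until the job completes; return the siteId.

    `call_tool(name, **args)` is a placeholder for your MCP client's
    tool-invocation method, not a real library function.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = call_tool("get_generation_status", jobId=job_id)
        if status["status"] == "completed":
            return status["siteId"]
        if status["status"] == "failed":
            raise RuntimeError(f"generation failed: {status}")
        time.sleep(interval)
    raise TimeoutError(f"job {job_id} did not finish within {timeout}s")
```

With a real client you would call generate_llms_txt first, pass its jobId here, then fetch the file with get_llms_txt_content once the siteId comes back.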

Real-World Use Cases

**Onboarding a new client site**
Ask Claude to generate an llms.txt for a client’s domain, validate the output quality, and summarize the score and any issues — all in one prompt. Get a structured report back without touching a browser.

**Auditing multiple sites**
Use list_sites to retrieve all previously generated sites, then loop through them asking the agent to re-validate each file and flag any that score below a threshold.

**Pre-deploy checks in CI**
Agents running in CI pipelines can call validate_llms_txt against a file before it ships — no API key required for validation, so it costs nothing to integrate.

**Discovery before generation**
Use discover_urls first to understand a site’s structure before committing credits. Inspect the URL list, filter out irrelevant sections, then generate with a targeted maxUrls count.
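The discovery-first pattern is easy to script: fetch the URL list, drop sections you don't want to spend credits on, and size maxUrls to what remains. The exclusion prefixes here are just example values:

```python
from urllib.parse import urlparse

# Sections we don't want to spend generation credits on (example values).
EXCLUDED_PREFIXES = ("/blog/tag/", "/careers", "/legal")

def filter_urls(urls: list[str]) -> list[str]:
    """Keep only URLs whose path doesn't start with an excluded prefix."""
    kept = []
    for url in urls:
        path = urlparse(url).path
        if not any(path.startswith(p) for p in EXCLUDED_PREFIXES):
            kept.append(url)
    return kept

discovered = [
    "https://acmecorp.com/platform",
    "https://acmecorp.com/blog/tag/hiring",
    "https://acmecorp.com/legal/terms",
]
print(filter_urls(discovered))  # ['https://acmecorp.com/platform']
```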

Under the Hood

The MCP Server runs as a stateless Cloudflare Worker at mcp.llmgenerator.com. Your API key is forwarded directly to the main LLMGenerator API — the MCP layer adds no extra auth complexity. Rate limiting is handled natively by Cloudflare: 60 requests per minute per API key, 10 per minute for unauthenticated requests.


OAuth 2.1 support (browser-based login, no API key copying) is coming in a future release.

Get Started

  1. Grab an API key from app.llmgenerator.com/settings/api-keys — you’ll need a free account if you don’t have one yet.
  2. Add the server to your MCP client using the config above.
  3. Open a new conversation and ask your AI to generate an llms.txt for any public website.

No credits are needed to validate — validate_llms_txt is fully public. If you want to generate files, the free tier includes 50 credits to start, and generation costs 1–2 credits per page depending on the mode you choose.

The server endpoint is live at https://mcp.llmgenerator.com/mcp. Connect it once, and your AI workflow gains a complete llms.txt generation pipeline — permanently.


Have questions or feedback? Reach out to our team — we read everything.


