## Summary

Adds `/llms.txt` at the apex following the [llms.txt standard](https://llmstxt.org) — a curated, link-based markdown file that tells LLM-based search agents (ChatGPT search, Perplexity, Claude search, Google AI Overviews, etc.) what's most important on the site. It complements `robots.txt` (crawler permissions) and `sitemap-index.xml` (URL inventory) by giving AI agents a short, prose-friendly index they can ingest into a context window.

## What's in the file

28 links across 6 sections:

- **Product** (6) — homepage, Local download, Cloud, Cloud pricing, API, Enterprise
- **Workflows and Gallery** (2) — gallery + community workflows site
- **Customers and Case Studies** (5) — customers index + 4 named studios (Series Entertainment, Moment Factory, Ubisoft Chord, Open Story Movement)
- **Developers and Documentation** (4) — docs.comfy.org, ComfyUI repo, Comfy-Org GitHub org, registry.comfy.org
- **Company** (6) — about, careers, contact, blog, privacy, terms
- **Optional** (5) — `zh-CN` locale variant, long-form enterprise case studies, blog posts (de-prioritized per spec — agents can skip if context-limited)

The intro paragraph names the four product surfaces (Local, Cloud, API, Enterprise), the named customers, and the use-case industries (VFX & animation, advertising, gaming, eCommerce/fashion) — so an agent that ingests only the prose still gets the elevator pitch.

## Verification

- All 28 URLs verified live (`HTTP 200`) before commit.
- File is plain markdown — no build step. Astro/Vercel will serve it from `apps/website/public/llms.txt` exactly as it serves `robots.txt` (which lives in the same directory and ships at `https://comfy.org/robots.txt`).
- Will verify on the Vercel preview deploy after this PR opens that `curl -sI https://<preview>/llms.txt` returns `200` with a sensible `content-type`. (`robots.txt` currently serves as `text/plain; charset=utf-8` — `.txt` will likely do the same; that's fine for AI agents.)
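For reviewers unfamiliar with the format: per the llmstxt.org spec, the file is an H1 title, a blockquote summary, then H2 sections of `- [title](url): note` link lists, with an `## Optional` section agents may skip. A minimal sketch of the shape (link titles, notes, and any paths beyond the apex domains named in this PR are illustrative, not the exact committed copy):

```markdown
# ComfyUI

> Node-based interface for generative AI, available as Local, Cloud, API,
> and Enterprise products. (Illustrative summary, not the shipped prose.)

## Product

- [ComfyUI](https://comfy.org): homepage

## Developers and Documentation

- [Docs](https://docs.comfy.org): developer documentation
- [Registry](https://registry.comfy.org): custom node registry

## Optional

- [Blog](https://comfy.org): de-prioritized; skip if context-limited
```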
## Decisions

- **No `llms-full.txt` yet.** That variant inlines the full prose of key pages and requires curating substantive content. Deferred to a follow-up — the marketing-site pages are mostly Vue-rendered hero/feature blocks rather than long-form prose, so a meaningful `llms-full.txt` would need either dedicated copy or a build step that flattens i18n strings + section text. Tracking separately.
- **No comment line in `robots.txt`.** I considered adding a `# AI agents: see /llms.txt` comment above the `Sitemap:` directive, but decided against it: (a) the convention is to probe the well-known path `/llms.txt` directly, not to discover it via robots.txt; (b) `robots.txt` was just polished in #11823 with a deliberately compact design, and adding a non-standard comment would muddy that; (c) none of the implementations I checked actually parse robots.txt for llms.txt hints. Easy to add later if needed.

## Context

Third of three follow-ups from the SEO/GEO sweep on 2026-05-02:

1. ~~Comfy-Router: add `X-Content-Type-Options: nosniff` to apex security headers~~ (separate PR on `Comfy-Org/comfy-router`)
2. ~~Cloudflare: enable "Always Use HTTPS"~~ (dashboard toggle, no PR)
3. **This PR** — add `llms.txt` for GEO discovery

## Testing

- [x] All linked URLs return 200
- [x] File parses as valid markdown
- [ ] Preview deploy serves `/llms.txt` (will verify once preview is up)
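The "all linked URLs return 200" check was done by hand before commit; it can also be scripted so the follow-up PRs can rerun it. A minimal sketch — the `extract_urls`/`check_urls` helper names and the link regex are my assumptions, not existing project tooling:

```shell
#!/usr/bin/env sh
# Sketch: pull every https:// target out of llms.txt markdown links
# and confirm each responds with HTTP 200.

extract_urls() {
  # match the "](https://...)" tail of each markdown link, then strip
  # the surrounding "](" and ")" to leave the bare URL
  grep -oE '\]\(https://[^)]+\)' "$1" | sed -E 's/^\]\(//; s/\)$//'
}

check_urls() {
  fail=0
  for url in $(extract_urls "$1"); do
    # -s silent, -o discard body, -w print only the status code
    code=$(curl -s -o /dev/null -w '%{http_code}' "$url")
    [ "$code" = "200" ] || { echo "FAIL $code $url"; fail=1; }
  done
  return $fail
}
```

Usage would be `check_urls apps/website/public/llms.txt`; the same helper pointed at the preview URL covers the unchecked Testing item once the deploy is up.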