close.com — Agent-Adoption Score
25 checks evaluated · score 78 / 100
Discovery surfaces extend to agentic workflows — markdown-negotiation passes.
Subscores
Where close.com is most cited — and how those brands compare on agent-readiness
The two categories where AI agents recommend close.com most often. Each card lists the most-cited brands in that category and where each sits on the agent-adoption ladder.
- #1 · 44 · L1
- #2 · 31 · L1
- #3 · 41 · L1
- #4 · 48 · L1
- #5 · 48 · L2
- #6 · This scan · 78 · L3
- #1 · 23 · L1
- #2 · 41 · L1
- #3 · 52 · L1
- #4 · 45 · L1
- #5 · 44 · L1
- #19 · This scan · 78 · L3
Per-check breakdown
Discoverability · 2 pass · 0 fail · score 100
robots.txt served with 1 User-agent directive
robots.txt is the first file crawlers and agents check for access rules; silence defaults to blanket-allow. Per RFC 9309.
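The stdlib can confirm what a blanket-allow robots.txt grants. A minimal sketch using a hypothetical robots.txt body that mirrors the posture this scan found (one wildcard User-agent group, explicit Allow):

```python
from urllib import robotparser

# Illustrative robots.txt body, not the live file from close.com:
# a single wildcard group that allows every path.
ROBOTS_TXT = """\
User-agent: *
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# With Allow: / under User-agent: *, any crawler may fetch any path.
print(rp.can_fetch("GPTBot", "/pricing"))  # True
```

The same parser, pointed at a real URL via `rp.set_url(...)` and `rp.read()`, answers per-bot access questions the way RFC 9309-conformant crawlers do.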
Sitemap served at https://close.com/sitemap.xml (<urlset> root)
An XML sitemap is the route map agents use to find your pages. Without one they link-walk and miss deep or orphaned content.
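Extracting that route map is a few lines of stdlib XML. A sketch over a hypothetical two-entry `<urlset>` (real sitemaps live at /sitemap.xml and can nest sitemap indexes, which this ignores):

```python
import xml.etree.ElementTree as ET

# Illustrative sitemap fragment; the sitemaps.org namespace is required
# for findall() to match.
SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://close.com/</loc></url>
  <url><loc>https://close.com/pricing</loc></url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(SITEMAP)
urls = [loc.text for loc in root.findall("sm:url/sm:loc", NS)]
print(urls)
```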
Homepage returned no Link header — v1 does not penalize
Link: response headers expose related resources — API catalogs, service docs, alternates — before an agent parses HTML. Per RFC 8288.
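A rough sense of what an agent pulls out of that header: the sketch below splits a Link header value into target/rel pairs. It is a simplified splitter for the common case, not a full RFC 8288 parser (it ignores extra parameters and quoted commas), and the header value shown is illustrative:

```python
import re

def parse_link_header(value: str) -> dict[str, str]:
    """Map each rel type to its <target>; simplified, common-case only."""
    links = {}
    for part in value.split(","):
        m = re.search(r'<([^>]+)>\s*;\s*rel="?([^";]+)"?', part)
        if m:
            links[m.group(2)] = m.group(1)
    return links

header = '</api/catalog>; rel="service-desc", </docs>; rel="help"'
print(parse_link_header(header))
```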
Access Control · 3 pass · 0 fail · 1 informational · score 100
robots.txt declares wildcard User-agent with explicit Allow: / and no cross-bot blanket blocks
A blanket-allow posture (wildcard User-agent, Allow: /, no cross-bot blocks) declares that every crawler is welcome. Informational — no pass/fail.
Content Readability · 6 pass · 1 fail · 2 informational · score 79
llms.txt discovered at https://close.com/llms.txt
An llms.txt file gives agents a curated entry point into your docs — sitemap-equivalent, but sized for context windows. Per llmstxt.org.
llms.txt matches llmstxt.org structure
A well-formed llms.txt (H1 title, summary blockquote, linked sections) parses cleanly; a malformed one is skipped silently — worse than no file. Per llmstxt.org.
llms.txt has 7 H2 sections and 0 markdown links, no Optional section
Reports the shape of your llms.txt — Optional section, H2 count, link count — so you can tell at a glance whether agents get a skeleton or a full map.
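The shape checks above reduce to line-level pattern counting. A rough sketch of such a validator, following the llmstxt.org layout (H1 title, `> ` summary blockquote, H2 sections, markdown links); the sample file is invented for illustration:

```python
def llms_txt_shape(text: str) -> dict:
    """Count the structural markers an llms.txt checker looks for."""
    lines = text.splitlines()
    return {
        "has_h1": any(l.startswith("# ") for l in lines),
        "has_summary": any(l.startswith("> ") for l in lines),
        "h2_count": sum(l.startswith("## ") for l in lines),
        "link_count": sum(l.count("](") for l in lines),  # markdown links
        "has_optional": any(l.strip() == "## Optional" for l in lines),
    }

sample = "# Close\n> CRM for startups.\n## Docs\n- [API](https://close.com/api)\n"
print(llms_txt_shape(sample))
```

A file like close.com's (7 H2 sections, 0 links) would report a skeleton: sections exist, but agents have nothing to follow.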
Server-side rendering confirmed
Classifies the site as server-rendered, hydrated, or client-rendered (SPA) — what agents see without running JavaScript. A pure SPA reads as blank.
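One crude way to make that classification without a browser: count the visible text in the raw HTML. This is an assumed heuristic for illustration, not the scanner's actual method; the threshold of 200 characters is arbitrary:

```python
from html.parser import HTMLParser

class TextCounter(HTMLParser):
    """Count visible text characters, skipping <script> and <style>."""
    def __init__(self):
        super().__init__()
        self.chars = 0
        self._skip = False
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip = True
    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self._skip = False
    def handle_data(self, data):
        if not self._skip:
            self.chars += len(data.strip())

def looks_server_rendered(html: str, threshold: int = 200) -> bool:
    p = TextCounter()
    p.feed(html)
    return p.chars >= threshold

# A pure SPA shell carries almost no visible text before JS runs.
spa_shell = "<html><body><div id='root'></div><script>/* app */</script></body></html>"
print(looks_server_rendered(spa_shell))  # False
```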
10/10 sampled pages under 50,000 converted characters
Measures how much markdown each page feeds into an agent's context window. Under 50K fits cleanly; over 100K truncates mid-page — pages have context budgets too.
Correct HTTP 404 returned for non-existent path
Soft-404s (HTTP 200 on a missing page) make agents cache garbage as canonical content. An honest 4xx tells agents the URL is dead — drop it.
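The probe behind this check can be sketched simply: request a path that cannot exist (a random token makes accidental collisions implausible) and classify the status code. The classification labels are illustrative, not the scanner's own:

```python
import uuid

def classify_missing_page(status: int) -> str:
    """Label how a site answered a request for a non-existent path."""
    if 400 <= status < 500:
        return "honest-4xx"   # agents can safely drop the URL
    if status == 200:
        return "soft-404"     # missing page served as real content
    return f"other-{status}"

# Probe path built from a random token so it cannot exist by accident.
probe = f"/{uuid.uuid4().hex}"
print(classify_missing_page(404))
```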
All 5/5 sampled URLs use same-eTLD+1 HTTP redirects or no redirect
Same-domain HTTP 3xx redirects work for agents. JavaScript redirects break agents without JS; cross-domain jumps read as tracking.
AGENTS.md not found at /AGENTS.md — HTTP 404 response
AGENTS.md is a coding-agent convention. ETH Zurich research (2026) found it often hurts those agents; we track presence to test the effect on websites. Informational.
Homepage response carries 1 of the Cache-Control / ETag / Last-Modified trio: Last-Modified
Cache-Control, ETag, and Last-Modified headers let agents re-fetch only what changed — missing headers force full re-downloads. Informational.
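The re-fetch-only-what-changed flow works by replaying those validators on the next request, so the server can answer 304 Not Modified instead of resending the body. A minimal sketch (header values are illustrative):

```python
def revalidation_headers(etag=None, last_modified=None):
    """Build conditional-request headers from cached validators."""
    headers = {}
    if etag:
        headers["If-None-Match"] = etag
    if last_modified:
        headers["If-Modified-Since"] = last_modified
    return headers

# A site like this one, exposing only Last-Modified, still allows
# time-based revalidation:
print(revalidation_headers(last_modified="Tue, 03 Jun 2025 10:00:00 GMT"))
```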
Agent Endpoints · 3 pass · 1 fail · 2 informational · score 57
Complete OAuth Protected Resource metadata served at /.well-known/oauth-protected-resource
Protected Resource metadata identifies which authorization server protects your API. Paired with oauth-discovery, agents complete auth without reading docs. Per RFC 9728.
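A completeness check for that metadata can be sketched as a key scan over the JSON document. Per RFC 9728 the `resource` identifier is required, and agents also need `authorization_servers` to know where to obtain tokens; the sample document below is invented for illustration:

```python
import json

def check_pr_metadata(raw: str) -> list:
    """Return the metadata keys an agent needs but the document lacks."""
    meta = json.loads(raw)
    return [k for k in ("resource", "authorization_servers") if k not in meta]

sample = json.dumps({
    "resource": "https://api.close.com",
    "authorization_servers": ["https://auth.close.com"],
})
print(check_pr_metadata(sample))  # [] -> complete
```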