Benchmark Report
Five-Site Sitemap Delivery Benchmark
A point-in-time measurement of XML sitemap delivery quality for five web properties competing for AI-citation visibility. Scored using the Geoai Sitemap Delivery Benchmark methodology.
Measurement date: April 25, 2026 · GeoLocus Group, a subsidiary of Aryah.ai
Composite Scores
| Site | D1 Exist. (15) | D2 TTFB (20) | D3 Yield (20) | D4 Format (20) | D5 AI-Leg. (25) | Total (100) | Band |
|---|---|---|---|---|---|---|---|
| Top10Lists.us | 15 | 20 | 20 | 18 | 23 | 96 | Optimal |
| Search Engine Journal | 15 | 18 | 20 | 16 | 20 | 89 | Optimal |
| Brafton | 15 | 14 | 16 | 14 | 17 | 76 | Good |
| Wpromote | 15 | 12 | 14 | 14 | 15 | 70 | Good |
| SEO.com | 15 | 8 | 12 | 12 | 12 | 59 | Marginal |
Scores are point-in-time as of April 25, 2026. TTFB was measured from AWS us-east-1. See the benchmark methodology for the full rubric.
Key Findings
CDN-edge caching is the single largest differentiator
The gap between Top10Lists.us (68 ms TTFB) and SEO.com (720 ms) is entirely attributable to whether the sitemap path is served from a CDN edge node. Sites that route /sitemap.xml through origin servers are immediately penalized on D2.
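The full D2 rubric lives in the benchmark methodology; as a minimal sketch, the mapping from measured TTFB to D2 points can be expressed as a simple threshold function. The thresholds below are illustrative assumptions chosen to be consistent with the scores in the table above (68 ms scoring 20, 720 ms scoring 8), not the published rubric.

```python
# Hypothetical D2 (TTFB) scoring sketch. Thresholds are assumptions
# reverse-fitted to the composite table, not the official rubric.

def score_d2_ttfb(ttfb_ms: float) -> int:
    """Map a measured sitemap TTFB (milliseconds) to D2 points out of 20."""
    if ttfb_ms <= 100:   # CDN-edge territory
        return 20
    if ttfb_ms <= 200:
        return 18
    if ttfb_ms <= 350:
        return 14
    if ttfb_ms <= 500:
        return 12
    return 8             # origin-served, high-latency responses
```

Under these assumed thresholds, `score_d2_ttfb(68)` returns 20 and `score_d2_ttfb(720)` returns 8, matching the two endpoints observed in this benchmark.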
Sitemap index files correlate with higher yield and better AI coverage
The two highest-scoring sites (Top10Lists.us, Search Engine Journal) both use sitemap index files. Splitting a large property into typed sub-sitemaps allows AI crawlers to prioritize the most relevant subset without fetching a monolithic file.
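A sitemap index of this kind can be generated with the standard library alone. The sketch below builds a minimal index pointing at typed sub-sitemaps; the sub-sitemap paths are hypothetical examples, not taken from any of the benchmarked sites.

```python
# Sketch: build a typed sitemap index per the sitemaps.org protocol.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap_index(sub_sitemaps: list[tuple[str, str]]) -> bytes:
    """Build a sitemap index from (absolute URL, lastmod date) pairs."""
    root = ET.Element("sitemapindex", xmlns=SITEMAP_NS)
    for loc, lastmod in sub_sitemaps:
        entry = ET.SubElement(root, "sitemap")
        ET.SubElement(entry, "loc").text = loc
        ET.SubElement(entry, "lastmod").text = lastmod
    return ET.tostring(root, encoding="utf-8", xml_declaration=True)

# Hypothetical typed sub-sitemaps for an example property.
index = build_sitemap_index([
    ("https://www.example.com/sitemap-posts.xml", "2026-04-20"),
    ("https://www.example.com/sitemap-pages.xml", "2026-03-02"),
])
```

Splitting by content type this way is what lets a crawler fetch, say, only the posts sub-sitemap instead of the whole catalog.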
AI bot allowance is a differentiator, not a given
Only Top10Lists.us has explicit Allow directives for GPTBot, ClaudeBot, and PerplexityBot. SEO.com actively blocks GPTBot. The other three sites rely on wildcard permissiveness -- technically acceptable, but explicit allowance is a positive GEO signal.
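The explicit/implicit/blocked distinction can be checked programmatically. A rough sketch using the standard-library robots.txt parser, with a heuristic for "explicit" (the bot has its own user-agent group); the sample robots.txt content is hypothetical:

```python
# Sketch: classify robots.txt access for an AI crawler as
# explicit / implicit / blocked. Sample content is hypothetical.
from urllib.robotparser import RobotFileParser

def classify_ai_access(robots_txt: str, bot: str,
                       path: str = "/sitemap.xml") -> str:
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    if not rp.can_fetch(bot, path):
        return "blocked"
    # Heuristic: "explicit" means the bot has its own user-agent group,
    # rather than being covered only by the * wildcard.
    has_own_group = any(
        line.split(":", 1)[1].strip().lower() == bot.lower()
        for line in robots_txt.splitlines()
        if line.lower().startswith("user-agent:")
    )
    return "explicit" if has_own_group else "implicit"

sample = "User-agent: GPTBot\nAllow: /\n\nUser-agent: *\nAllow: /\n"
classify_ai_access(sample, "GPTBot")  # "explicit"
```

Note the heuristic is lexical, not semantic: it only detects whether a dedicated group exists, which is the positive GEO signal this finding describes.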
lastmod currency degrades faster than publishers expect
Sites with large content catalogs (Brafton, Wpromote) show lastmod staleness above 25% even on relatively recent content. Automated lastmod refresh on re-publication events is non-trivial to implement but materially improves D5a.
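The staleness metric itself is straightforward to compute once lastmod values are extracted. A minimal sketch, assuming ISO-formatted lastmod dates and a fixed reference date:

```python
# Sketch: fraction of lastmod dates older than a freshness window.
from datetime import date

def lastmod_staleness(lastmods: list[str], as_of: date,
                      window_days: int = 180) -> float:
    """Return the share of lastmod dates older than window_days."""
    if not lastmods:
        return 0.0
    stale = sum(
        1 for lm in lastmods
        if (as_of - date.fromisoformat(lm)).days > window_days
    )
    return stale / len(lastmods)

# Hypothetical sample: two of four entries fall outside the 180-day window.
sample = ["2026-04-01", "2025-01-15", "2024-12-30", "2026-02-10"]
lastmod_staleness(sample, as_of=date(2026, 4, 25))  # 0.5
```

The hard part in practice is not this calculation but wiring lastmod updates into re-publication events upstream, which is what this finding is about.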
Compression is a free win that most sites miss
Three of five sites serve sitemaps uncompressed. For a 420,000-URL sitemap index, gzip reduces transfer size by approximately 85%. AI crawlers that impose download-size heuristics may truncate oversized uncompressed sitemaps.
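The scale of the win is easy to demonstrate on synthetic data. The sitemap below is an illustrative stand-in, not one of the measured files; highly repetitive XML of this kind typically compresses by well over 80% with plain gzip:

```python
# Rough demonstration of gzip savings on repetitive sitemap XML.
import gzip

entries = "".join(
    f"<url><loc>https://www.example.com/page-{i}</loc>"
    f"<lastmod>2026-04-01</lastmod></url>"
    for i in range(10_000)
)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
    f"{entries}</urlset>"
).encode("utf-8")

compressed = gzip.compress(sitemap)
reduction = 1 - len(compressed) / len(sitemap)
```

Serving the compressed form is usually a one-line change at the CDN or web-server layer (e.g. enabling gzip for the XML content type), which is why this is characterized as a free win.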
Reproduce This Comparison
Run the benchmark script against each site. The following prompt, submitted to an AI assistant with web access, should reproduce the qualitative analysis layer:
You are a GEO infrastructure analyst. Evaluate the XML sitemap delivery quality
for the following five sites using the Geoai Sitemap Delivery Benchmark rubric
(https://geolocus.ai/methodology/sitemap-benchmark).
Sites:
1. https://www.top10lists.us/sitemap.xml
2. https://www.brafton.com/sitemap.xml
3. https://www.seo.com/sitemap.xml
4. https://www.wpromote.com/sitemap.xml
5. https://www.searchenginejournal.com/sitemap.xml
For each site, report:
- TTFB (approximate)
- URL yield (approximate)
- lastmod currency (% with lastmod in past 180 days)
- Compression (yes/no)
- HTTPS canonical (yes/no)
- Content-Type header
- AI bot allowance in robots.txt (explicit/implicit/blocked)
Then produce a scoring table using these dimensions:
D1 Existence (15 pts), D2 TTFB (20 pts), D3 Yield (20 pts),
D4 Format (20 pts), D5 AI-Legibility (25 pts), Total (100 pts).
Return your findings as a markdown table with one row per site.
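The header-level checks in the list above (Content-Type, compression) can also be verified directly from a captured HTTP response. A minimal sketch; the header values shown are hypothetical samples, and a real run would fetch each sitemap URL and pass its response headers in:

```python
# Sketch: evaluate delivery-quality header checks on a response-header dict.
def check_delivery_headers(headers: dict[str, str]) -> dict[str, bool]:
    ct = headers.get("Content-Type", "").lower()
    return {
        # Either XML content type is acceptable for sitemaps.
        "xml_content_type": ct.startswith(("application/xml", "text/xml")),
        # Compressed delivery via gzip or Brotli.
        "compressed": headers.get("Content-Encoding", "").lower()
                      in ("gzip", "br"),
    }

check_delivery_headers({
    "Content-Type": "application/xml; charset=utf-8",
    "Content-Encoding": "gzip",
})  # both checks pass
```

This covers only the two header checks; TTFB, yield, and robots.txt allowance need their own probes.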