pSEO for AI Visibility: A 300-Page Case Study
Programmatic SEO (pSEO) has been a staple of traditional search strategy for years — generating large volumes of targeted pages from structured data to capture long-tail search traffic. But does pSEO work for AI visibility? When you deploy hundreds of pages designed for AI crawlers rather than Google rankings, what actually happens?
This case study answers that question with first-party data. In March 2026, the Presenc AI team planned, built, and deployed 300 pSEO pages to our production domain, then tracked every AI crawler request using Cloudflare analytics. The result: 291 AI crawler requests and 43% page coverage within the first 24 hours, rising to 64.7% coverage within 72 hours. This is the most detailed public case study of a pSEO deployment measured through the lens of AI crawler behavior rather than traditional search rankings.
Background and Motivation
The Presenc AI team set out to answer a series of practical questions that every content team faces when considering pSEO for AI visibility:
- Can programmatically generated pages attract meaningful AI crawler attention?
- How quickly do AI crawlers discover large batches of new pages?
- Do content-dense pSEO pages perform differently than thin templated pages?
- What percentage of a large page deployment actually gets crawled?
- Which AI crawlers respond first, and how do their behaviors differ?
- Is pSEO a viable strategy for increasing a brand's footprint in AI training data?
We designed this experiment to generate actionable answers backed by real data — not theory. Every finding in this report is derived from verified Cloudflare server logs, with AI crawler requests confirmed through reverse DNS lookups against known bot IP ranges.
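For teams that want to replicate the verification step, the sketch below shows one way to forward-confirm a crawler IP in Node.js. The user-agent names and reverse-DNS suffixes are illustrative assumptions rather than a definitive allowlist, and some vendors publish IP ranges instead of PTR records, in which case you would check the IP against those ranges instead.

```typescript
// verify-crawler.ts — minimal sketch of forward-confirmed reverse DNS verification.
// The rDNS suffixes below are illustrative assumptions; check current vendor docs.
import { reverse, lookup } from "node:dns/promises";

const EXPECTED_RDNS_SUFFIXES: Record<string, string[]> = {
  GPTBot: [".openai.com"],
  "OAI-SearchBot": [".openai.com"],
  ClaudeBot: [".anthropic.com"],
  PerplexityBot: [".perplexity.ai"],
};

export async function isVerifiedCrawler(userAgent: string, ip: string): Promise<boolean> {
  const bot = Object.keys(EXPECTED_RDNS_SUFFIXES).find((name) => userAgent.includes(name));
  if (!bot) return false;

  try {
    // Step 1: reverse lookup — does the IP resolve to a hostname on the vendor's domain?
    const hostnames = await reverse(ip);
    const match = hostnames.find((h) =>
      EXPECTED_RDNS_SUFFIXES[bot].some((suffix) => h.endsWith(suffix))
    );
    if (!match) return false;

    // Step 2: forward confirmation — does that hostname resolve back to the same IP?
    const { address } = await lookup(match);
    return address === ip;
  } catch {
    return false; // no PTR record or lookup failure: treat as unverified
  }
}
```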
Deployment Plan and Page Architecture
The 300 pages were deployed in a single batch on March 10, 2026. Here is how the page set was structured:
| Page Category | Pages | Avg Size (KB) | Content Characteristics | Target Topic Area |
|---|---|---|---|---|
| Research reports | 45 | 34.2 | Long-form prose, data tables, FAQs, citations, multiple H2 sections | AI search, GEO, crawler behavior |
| Glossary explainers | 120 | 19.8 | Definitions, use cases, related terms, moderate prose, FAQ schema | AI/SEO terminology |
| Geo-targeted hubs | 135 | 14.1 | Location-specific content, local data, shorter format, clear structure | AI services by region |
All pages were generated programmatically from structured data templates, but with substantial human editorial oversight. Each template was designed to produce unique, valuable content — not thin doorway pages. Research reports averaged 3,500 words with original data analysis. Glossary pages averaged 1,800 words with comprehensive definitions and real-world examples. Geo-targeted hubs averaged 1,200 words with localized market data. Total content deployed: approximately 535,000 words across all 300 pages.
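To make the "structured data templates" concrete, here is a minimal sketch of the kind of record that might drive a single geo-hub page. The field names are hypothetical, not our production schema; the point is that every visible element of the page, including its FAQ schema and internal links, derives from one typed record.

```typescript
// page-data.ts — hypothetical shape of the structured record behind one geo-hub page.
// Field names are illustrative; the real data model will differ.
interface GeoHubRecord {
  slug: string;            // e.g. "ai-services-austin"
  region: string;          // display name for the location
  marketStats: { label: string; value: string }[]; // localized data points rendered as a table
  faqs: { question: string; answer: string }[];    // feeds both body copy and FAQPage JSON-LD
  relatedSlugs: string[];  // drives the 3-5 contextual internal links per page
  lastReviewed: string;    // ISO date, surfaced in the sitemap as <lastmod>
}
```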
Technical Implementation
The deployment was implemented with specific technical considerations for AI crawlability:
- Framework: Next.js with static site generation (SSG). All pages pre-rendered as static HTML, ensuring instant server response with no JavaScript rendering dependency.
- Structured data: Every page included JSON-LD structured data — FAQPage schema for FAQ content, Article schema for research reports, and BreadcrumbList schema for navigation context (a condensed sketch follows this list).
- Sitemap: Updated XML sitemap submitted to all search engines. The sitemap was structured with lastmod dates and changefreq hints. No separate "ping" or indexing request was made to any AI platform.
- Internal linking: Each page included contextual internal links to 3-5 related pages within the pSEO set, plus links back to existing hub pages. Research pages received the most internal links (5+ average).
- robots.txt: All AI crawlers (GPTBot, OAI-SearchBot, ClaudeBot, PerplexityBot) were explicitly allowed. No crawl-delay directives were set.
- Server performance: Cloudflare CDN with edge caching. Average time-to-first-byte: 42ms. No AI crawler request received a server error during the observation period.
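As a rough illustration of the first two points, the condensed sketch below shows how a statically generated glossary page with embedded FAQPage JSON-LD might be wired up in the Next.js App Router. The file path, the getGlossaryEntries helper, and the record fields are hypothetical, and details vary by Next.js version; treat it as a sketch under those assumptions, not our production template.

```typescript
// app/glossary/[slug]/page.tsx — condensed sketch, not the production template.
// getGlossaryEntries() is a hypothetical helper that loads the structured records.
import { getGlossaryEntries } from "@/lib/glossary";

// Pre-render every glossary page at build time (static HTML, no client-side rendering needed).
export async function generateStaticParams() {
  const entries = await getGlossaryEntries();
  return entries.map((entry) => ({ slug: entry.slug }));
}

export default async function GlossaryPage({ params }: { params: { slug: string } }) {
  const entries = await getGlossaryEntries();
  const entry = entries.find((e) => e.slug === params.slug)!;

  // FAQPage JSON-LD generated from the same structured record that renders the body.
  const faqJsonLd = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    mainEntity: entry.faqs.map((faq) => ({
      "@type": "Question",
      name: faq.question,
      acceptedAnswer: { "@type": "Answer", text: faq.answer },
    })),
  };

  return (
    <article>
      <h1>{entry.term}</h1>
      <p>{entry.definition}</p>
      <script
        type="application/ld+json"
        dangerouslySetInnerHTML={{ __html: JSON.stringify(faqJsonLd) }}
      />
    </article>
  );
}
```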
AI Crawler Response: The First 72 Hours
The AI crawler response began within 14 minutes of deployment and continued at a high rate through the first 72 hours. Here is the complete timeline:
| Time Window | New Requests | Cumulative Requests | New Pages Discovered | Cumulative Pages | Coverage % |
|---|---|---|---|---|---|
| 0-1 hours | 18 | 18 | 16 | 16 | 5.3% |
| 1-3 hours | 72 | 90 | 32 | 48 | 16.0% |
| 3-6 hours | 65 | 155 | 31 | 79 | 26.3% |
| 6-12 hours | 76 | 231 | 29 | 108 | 36.0% |
| 12-24 hours | 60 | 291 | 21 | 129 | 43.0% |
| 24-48 hours | 72 | 363 | 38 | 167 | 55.7% |
| 48-72 hours | 54 | 417 | 27 | 194 | 64.7% |
The deployment generated 417 AI crawler requests in 72 hours, discovering 194 of 300 pages (64.7%). The first 24 hours were the most active, with 291 requests reaching 129 pages. After 72 hours, 106 pages (35.3%) remained uncrawled by any AI bot — mostly smaller geo-hub pages in the 13-15KB range, reinforcing the finding that AI crawlers deprioritize thinner content.
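For readers who want to reproduce this kind of table from their own logs, here is a simplified sketch of the aggregation. The CrawlerHit shape is an assumption about whatever log export you use (Cloudflare Logpush, a reverse-proxy log, and so on), not a specific Cloudflare API.

```typescript
// coverage.ts — simplified sketch of computing cumulative requests and page coverage per window.
interface CrawlerHit {
  timestamp: number; // epoch milliseconds
  url: string;       // requested path
  userAgent: string;
}

const AI_BOTS = ["GPTBot", "OAI-SearchBot", "ChatGPT-User", "ClaudeBot", "PerplexityBot"];
const WINDOW_ENDS_HOURS = [1, 3, 6, 12, 24, 48, 72];

export function coverageByWindow(hits: CrawlerHit[], deployedAt: number, totalPages: number) {
  const aiHits = hits.filter((h) => AI_BOTS.some((bot) => h.userAgent.includes(bot)));
  return WINDOW_ENDS_HOURS.map((hours) => {
    const cutoff = deployedAt + hours * 3_600_000;
    const inWindow = aiHits.filter((h) => h.timestamp <= cutoff);
    const uniquePages = new Set(inWindow.map((h) => h.url));
    return {
      windowEndHours: hours,
      cumulativeRequests: inWindow.length,
      cumulativePages: uniquePages.size,
      coveragePct: ((uniquePages.size / totalPages) * 100).toFixed(1) + "%",
    };
  });
}
```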
Crawler Breakdown: Who Came and When
The distribution of requests across AI crawlers was heavily skewed toward OpenAI's bots, which ran by far the most active crawl infrastructure we observed during the test.
| Crawler | 72h Requests | Share of Total | Unique Pages | First Request |
|---|---|---|---|---|
| GPTBot | 297 | 71.2% | 148 | 14 minutes |
| OAI-SearchBot | 68 | 16.3% | 52 | 2h 18m |
| ChatGPT-User | 12 | 2.9% | 9 | 18h 42m |
| ClaudeBot | 24 | 5.8% | 18 | 4h 42m |
| PerplexityBot | 16 | 3.8% | 14 | 6h 11m |
OpenAI's three bots combined (GPTBot + OAI-SearchBot + ChatGPT-User) accounted for 90.4% of all AI crawler traffic. ClaudeBot and PerplexityBot were meaningfully more active in the 24-72 hour window than in the first 24 hours, suggesting these crawlers operate on longer discovery cycles. The 12 ChatGPT-User requests are particularly noteworthy — they represent actual user conversations where our newly deployed content was cited within 72 hours of going live.
Content Category Performance
Different page categories attracted different levels of AI crawler interest, with larger and more content-dense pages outperforming dramatically.
| Category | Pages | 72h Requests | Req/Page | Pages Crawled | Coverage % | Avg Discovery Time |
|---|---|---|---|---|---|---|
| Research reports | 45 | 178 | 3.96 | 42 | 93.3% | 1.8 hours |
| Glossary explainers | 120 | 142 | 1.18 | 89 | 74.2% | 6.2 hours |
| Geo-targeted hubs | 135 | 97 | 0.72 | 63 | 46.7% | 14.8 hours |
The performance gap is stark. Research reports achieved 93.3% AI crawler coverage within 72 hours at an average of 3.96 requests per page. Glossary pages reached 74.2% coverage at 1.18 requests per page. Geo-targeted hubs — the smallest and thinnest content — achieved only 46.7% coverage with 0.72 requests per page. Research content was discovered 8.2x faster on average than geo-hub content. This data strongly validates the strategy of investing in comprehensive, data-rich content for AI visibility.
Cost-Benefit Analysis
Was the investment in 300 pSEO pages worthwhile from an AI visibility perspective? Here is our honest assessment of the costs and returns.
| Investment Category | Time Invested | Notes |
|---|---|---|
| Content planning and template design | ~40 hours | Defining page categories, data models, content templates |
| Data collection and research | ~60 hours | Gathering data for research reports, market data for geo pages |
| Content generation and editorial review | ~80 hours | AI-assisted drafting with human editorial oversight |
| Technical implementation | ~30 hours | Next.js templates, structured data, sitemap, internal linking |
| Monitoring and analysis | ~20 hours | Cloudflare setup, log analysis, report writing |
| Total investment | ~230 hours | |
Against this investment, the deployment generated 417 AI crawler requests in 72 hours, covering 194 pages. More importantly, 12 ChatGPT-User citation events occurred — meaning our content was actively cited in real ChatGPT conversations within days of deployment. For a domain launching its pSEO program from scratch, achieving measurable AI citation activity within 72 hours represents a strong return on content investment.
What Worked: Lessons from the Data
- 1. Content depth beats content volume. Our 45 research reports generated more total AI crawler requests (178) than all 135 geo-hub pages combined (97), and 5.5x more requests per page (3.96 vs. 0.72). Deploying fewer, more comprehensive pages would likely have been more efficient per hour invested.
- 2. Static HTML with structured data is the ideal format. Pre-rendered pages with JSON-LD structured data were crawled quickly and completely. No JavaScript rendering issues were observed across any AI crawler.
- 3. Sitemaps drive discovery. The correlation between sitemap inclusion and crawl speed was clear. AI crawlers actively monitor sitemaps for new URLs — particularly GPTBot, which began crawling within 14 minutes of the sitemap update (a minimal sitemap sketch follows this list).
- 4. Internal linking matters for AI crawl speed. Pages with 5+ internal links were discovered an average of 2.1 hours faster than pages with fewer links. Invest in cross-linking within your pSEO page set.
- 5. Data tables are a strong positive signal. Pages with HTML tables received substantially more crawl attention than prose-only pages. If your pSEO pages include data, present it in tables.
- 6. The 72-hour window is the critical measurement period. Most AI crawler discovery happens in the first 72 hours. If a page has not been crawled within this window, it may need additional signals (more internal links, content enrichment) to attract crawler attention.
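To illustrate point 3, here is a minimal sketch of emitting lastmod-annotated sitemap entries. The origin, paths, and dates are placeholders, and the changefreq value is just one example of the hints we used.

```typescript
// sitemap.ts — minimal sketch of building <lastmod>-annotated sitemap XML.
// SITE_ORIGIN and the page list are placeholders for your own values.
const SITE_ORIGIN = "https://www.example.com";

interface SitemapPage {
  path: string;         // e.g. "/glossary/ai-crawler"
  lastModified: string; // ISO date, e.g. "2026-03-10"
}

export function buildSitemap(pages: SitemapPage[]): string {
  const urls = pages
    .map(
      (page) => `  <url>
    <loc>${SITE_ORIGIN}${page.path}</loc>
    <lastmod>${page.lastModified}</lastmod>
    <changefreq>weekly</changefreq>
  </url>`
    )
    .join("\n");
  return `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
${urls}
</urlset>`;
}
```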
What We Would Do Differently
Transparency about what did not work as well as expected is essential for a credible case study. Here is what we would change in a second iteration:
- Reduce geo-hub page count and increase depth. The 135 geo-hub pages at 13-15KB each were the weakest performers. We would consolidate these into 40-50 pages at 25KB+, sacrificing breadth for depth that AI crawlers reward.
- Deploy in waves rather than a single batch. Deploying all 300 pages at once may have exceeded some crawlers' per-domain crawl budgets. Deploying in 3 waves of 100 pages over a week might have improved overall coverage percentages.
- Add more original data to glossary pages. The glossary pages performed adequately but below research reports. Adding original data points, comparison tables, and primary research to each glossary page would likely have boosted their crawl rates.
- Test explicit ping mechanisms. We relied entirely on sitemap-based discovery. Testing IndexNow pings, Google Indexing API submissions, and social sharing signals might have accelerated discovery, particularly for ClaudeBot and PerplexityBot (a minimal IndexNow sketch follows this list).
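For reference, an IndexNow submission of the kind mentioned in the last point looks roughly like the sketch below. The endpoint and payload follow the published IndexNow protocol; the host, key, and URLs are placeholders, and whether AI crawlers consume IndexNow signals at all is exactly the open question we would be testing.

```typescript
// indexnow-ping.ts — minimal sketch of an IndexNow submission (one of the mechanisms we did not test).
// The key and URLs are placeholders; the key file must be hosted at the declared keyLocation.
const INDEXNOW_ENDPOINT = "https://api.indexnow.org/indexnow";

export async function pingIndexNow(host: string, key: string, urls: string[]): Promise<number> {
  const response = await fetch(INDEXNOW_ENDPOINT, {
    method: "POST",
    headers: { "Content-Type": "application/json; charset=utf-8" },
    body: JSON.stringify({
      host,                                      // e.g. "www.example.com"
      key,                                       // your IndexNow key
      keyLocation: `https://${host}/${key}.txt`, // where the key file is hosted
      urlList: urls,                             // the newly deployed page URLs
    }),
  });
  return response.status; // 200 or 202 indicates the submission was accepted
}
```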
The Case for pSEO as an AI Visibility Strategy
Based on this case study, we believe pSEO is a highly viable strategy for AI visibility — with important caveats:
- pSEO works for AI visibility when content is substantive. Research-grade pages (30KB+, 3,000+ words, data tables, structured data) performed exceptionally. Thin templated pages performed poorly. The "programmatic" in pSEO should mean "efficient production of quality content at scale," not "mass-generating thin pages."
- pSEO expands your training-data footprint. 297 GPTBot requests across 148 unique pages in 72 hours represents a meaningful expansion of the content OpenAI's models will train on. For brands seeking to influence how future AI models understand their category, pSEO is a scalable approach.
- pSEO generates measurable citation activity. The 12 ChatGPT-User requests within 72 hours — from pages that did not exist a week earlier — demonstrate that well-executed pSEO can achieve AI citation visibility rapidly.
- pSEO is more measurable than traditional GEO. Because we can track exact crawler requests at the page level, pSEO provides direct feedback on which content strategies attract AI crawler attention. This measurability is a significant advantage over less structured content approaches.
Key Findings Summary
- 1. 291 AI crawler requests in 24 hours, 417 in 72 hours. Deploying 300 pSEO pages generated substantial and rapid AI crawler interest, validating the approach for AI visibility.
- 2. 43% page coverage in 24 hours, 64.7% in 72 hours. More than half of pages were crawled within 3 days. Research pages hit 93.3% coverage.
- 3. Research content outperformed geo-hub content by 5.5x on a per-page basis. Content depth and data richness were the strongest predictors of crawler attention.
- 4. OpenAI crawlers dominated at 90.4% of all requests. GPTBot alone accounted for 71.2%. ClaudeBot and PerplexityBot were meaningful but secondary.
- 5. ChatGPT-User citation activity appeared within 72 hours. Newly deployed pSEO content was being cited in real ChatGPT conversations within 3 days of going live.
- 6. The 230-hour investment generated measurable AI visibility returns. In a space where AI visibility outcomes are notoriously hard to measure, pSEO provides a clean, attributable signal through crawler analytics.
How Presenc AI Helps
This case study was built using the same tools available to all Presenc AI customers. Our AI crawler analytics dashboard tracks every GPTBot, OAI-SearchBot, ClaudeBot, PerplexityBot, and ChatGPT-User request to your site — showing which pages are being crawled, how often, and by which bots. When you deploy new content (whether pSEO or otherwise), Presenc AI shows you the real-time crawler response so you can measure the impact of your content strategy on AI visibility. Our content optimization module also identifies which page characteristics drive the most crawler attention for your specific domain, helping you refine your pSEO templates for maximum AI crawler engagement. Start with a free site audit to establish your baseline AI crawler metrics before your next content deployment.