Step 1: Benchmark Your Site
Before changing anything, establish your baseline. Run your top 20 prompts on ChatGPT, Perplexity, Gemini, and Claude. Note where your WordPress content gets cited and where it does not. Also check the raw HTML output of your top five pages: view source, then look for heading hierarchy, JSON-LD blocks, and template chrome-to-content ratio. That raw HTML is what AI crawlers read.
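The raw-HTML check above can be partly automated. A minimal sketch using Python's standard-library HTML parser to list heading tags and count JSON-LD blocks in a page's source (the sample HTML inlined here is illustrative; in practice feed it your saved view-source output):

```python
from html.parser import HTMLParser

class CrawlerViewAudit(HTMLParser):
    """Collect heading tags and count JSON-LD blocks,
    roughly as an AI crawler sees the raw HTML."""
    def __init__(self):
        super().__init__()
        self.headings = []       # e.g. ["h1", "h2", "h2"]
        self.jsonld_blocks = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self.headings.append(tag)
        if tag == "script" and ("type", "application/ld+json") in attrs:
            self.jsonld_blocks += 1

# Sample stands in for real view-source output
sample = """
<html><body>
<h1>Post Title</h1>
<script type="application/ld+json">{"@type": "Article"}</script>
<h2>Section</h2><p>Body text.</p>
</body></html>
"""
audit = CrawlerViewAudit()
audit.feed(sample)
print(audit.headings, audit.jsonld_blocks)  # ['h1', 'h2'] 1
```

Run it against each of your top five pages and compare the heading list against the hierarchy rules in Step 4.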
Step 2: Fix robots.txt Defaults
Many WordPress installs end up serving a robots.txt that unintentionally blocks AI crawlers. Check yoursite.com/robots.txt for any Disallow rule targeting GPTBot, ClaudeBot, PerplexityBot, or Google-Extended. If the blocks are not deliberate policy decisions, replace them with an explicit AI-crawler-friendly configuration. Note that WordPress can assemble its robots.txt output from several places: a physical file at the web root, SEO plugin settings, and code hooked into the robots_txt filter (often in a theme's functions.php); crawler blocks can also hide in .htaccess rules. Audit all of them.
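An illustrative AI-crawler-friendly robots.txt (adjust the crawler list and paths to your own policy; the wp-admin rules shown are the common WordPress defaults):

```text
# Explicitly allow AI crawlers (policy decision -- adjust as needed)
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Google-Extended
Allow: /

# Default rules for everything else
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```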
Step 3: Switch to a Performance-Oriented Theme
Many popular WordPress themes add significant template overhead (unused CSS, oversized wrapper divs, JavaScript-hydrated components). That chrome hurts passage extraction because AI crawlers have to work harder to find the signal. If you are on a visual-builder theme (Elementor, Divi, WPBakery), strongly consider moving to a lean theme (GeneratePress, Kadence, Blocksy) or a full-site-editing block theme.
Step 4: Enforce Heading Hierarchy Through an Editorial Style Guide
WordPress gives editors freedom that often produces inconsistent heading structures. Write a short style guide: one H1 per page (usually your post title), H2 for major sections, H3 for subsections, no skipped levels, and descriptive headings (not "Introduction" or "Summary"). Retrofit existing top-traffic pages first.
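The style-guide rules above can double as an automated check. A minimal sketch that takes a page's heading levels in document order and flags the two most common violations (the function name and rule set are this example's, not a standard tool):

```python
def heading_issues(levels):
    """Check a page's heading levels in order (e.g. [1, 2, 3, 2])
    against the style guide: exactly one H1, no skipped levels."""
    issues = []
    if levels.count(1) != 1:
        issues.append("expected exactly one H1")
    prev = 0
    for level in levels:
        if level > prev + 1:
            issues.append(f"skipped level: H{prev} -> H{level}")
        prev = level
    return issues

print(heading_issues([1, 2, 3, 2]))  # [] -- clean structure
print(heading_issues([1, 3]))        # ['skipped level: H1 -> H3']
```

Running this over your top-traffic pages first, as the step suggests, gives you a concrete retrofit worklist.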
Step 5: Install and Configure Schema Markup
Install Yoast SEO, Rank Math, or a dedicated schema plugin. Configure Organization schema site-wide. For each page type, configure the appropriate schema: Article for posts, Product for products, FAQPage for FAQ pages, HowTo for tutorials, and Person for author pages. Validate with the Schema.org validator and Google's Rich Results Test after each change.
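For reference, here is the shape of a minimal Article JSON-LD block like the one a schema plugin should emit into your post pages (all values here are placeholders; your plugin fills them from post metadata):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example Post Title",
  "datePublished": "2025-01-15",
  "dateModified": "2025-06-01",
  "author": { "@type": "Person", "name": "Example Author" },
  "publisher": { "@type": "Organization", "name": "Example Site" }
}
</script>
```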
Step 6: Enable Server-Side Rendering Where Needed
Classic WordPress is server-side rendered by default. If you use a headless WordPress setup with a JavaScript front-end or heavy client-side plugins, confirm that critical content is present in the initial HTML response, not injected by JavaScript. Run your pages through a view-source check and a curl-style fetch to confirm the content AI crawlers will actually see.
Step 7: Publish llms.txt
Create a /llms.txt file at your domain root. Plugins for this are emerging, or you can publish a plain text file via FTP or a custom route. Include your site name, a one-paragraph brand summary, and a curated list of your highest-value pages with one-line descriptions. Keep it under 2 kilobytes and curate deliberately.
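A skeleton following the emerging llms.txt convention (site name as a heading, a blockquote summary, then curated links; all names and URLs below are placeholders):

```text
# Example Site
> One-paragraph summary of what the site covers and who it is for.

## Key pages
- https://example.com/pillar-guide/: Flagship guide to the core topic
- https://example.com/pricing/: Plans and pricing overview
- https://example.com/faq/: Answers to the most common customer questions
```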
Step 8: Audit Internal Linking
Pages with strong internal linking get cited more often than isolated pages. Audit your top 20 posts: do they link to each other meaningfully? Do they link to canonical pillar pages? Tools like Link Whisper can accelerate the audit. Focus on contextual in-paragraph links, not just navigation menus or footer links.
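The audit can start from a simple link graph. A sketch that maps each top page to the internal pages it links to, then surfaces pages no other top page links to (the paths below are illustrative):

```python
def orphaned_pages(link_graph):
    """Given {page: [internal pages it links to]}, return pages that
    no other page links to -- candidates for new contextual links."""
    linked_to = {target for links in link_graph.values() for target in links}
    return sorted(p for p in link_graph if p not in linked_to)

graph = {
    "/pillar-guide/":  ["/how-to-a/", "/how-to-b/"],
    "/how-to-a/":      ["/pillar-guide/"],
    "/how-to-b/":      ["/pillar-guide/"],
    "/isolated-post/": [],
}
print(orphaned_pages(graph))  # ['/isolated-post/']
```

Building the graph itself is the manual part of the audit; per the step, count only contextual in-paragraph links, not menu or footer links.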
Step 9: Monitor AI Crawler Traffic
Install a plugin or server-side log parser that surfaces AI crawler activity: GPTBot, ClaudeBot, PerplexityBot, Google-Extended, Applebot-Extended. Rising crawler traffic indicates your optimization is working. Falling traffic indicates a regression. Presence AI tracks crawler-traffic correlations with actual AI citation rates.
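A minimal log-parsing sketch that tallies hits per AI crawler by matching user-agent substrings in access-log lines (the sample lines are fabricated; point it at your real server logs):

```python
from collections import Counter

AI_CRAWLERS = ("GPTBot", "ClaudeBot", "PerplexityBot",
               "Google-Extended", "Applebot-Extended")

def ai_crawler_hits(log_lines):
    """Tally hits per AI crawler by user-agent substring match."""
    hits = Counter()
    for line in log_lines:
        for bot in AI_CRAWLERS:
            if bot in line:
                hits[bot] += 1
    return hits

sample_log = [
    '1.2.3.4 - - [01/Jun/2025] "GET /pillar-guide/ HTTP/1.1" 200 "GPTBot/1.0"',
    '5.6.7.8 - - [01/Jun/2025] "GET /faq/ HTTP/1.1" 200 "ClaudeBot/1.0"',
    '9.9.9.9 - - [01/Jun/2025] "GET /about/ HTTP/1.1" 200 "Mozilla/5.0"',
]
print(ai_crawler_hits(sample_log))  # Counter({'GPTBot': 1, 'ClaudeBot': 1})
```

Tallying these weekly gives you the rising-or-falling trend the step describes.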
Step 10: Set a Refresh Cadence
AI visibility for WordPress sites compounds with consistent publication and refresh. Update your top 20 posts quarterly: refresh dates, add new sections, update examples. WordPress sites that publish fresh or refreshed content weekly see higher AI citation rates than stale sites at the same word count.