WordPress Robots.txt Boilerplate

If you use WordPress, you should use a more extensive robots.txt file than the default one WordPress generates, to improve your website’s crawlability for SEO.

This boilerplate helps optimize crawl rate by making better use of the crawl budget Google assigns to your website: it stops Googlebot from spending requests on unimportant pages, while still allowing it to access the files it needs to render your pages.

All you need to do is remove any lines that don’t apply to your site. For example, if you want Google to be able to crawl your search pages, delete the two Disallow lines beneath the search-pages comment. Finally, replace the Sitemap lines with the URLs of your own sitemap files.

User-agent: Googlebot

# Allow files critical for rendering
Allow: *.js
Allow: *.css

# Allow AJAX - Do Not Remove
Allow: /wp-admin/admin-ajax.php

# Prevent crawl-budget waste on search pages
Disallow: /?s=
Disallow: /search/
# Prevent private admin areas from being crawled
Disallow: /wp-admin
# Prevent duplicate /feed/ pages from being crawled
Disallow: /*/feed/
# Prevent the login page from being crawled
Disallow: /wp-login.php
# Prevent the registration page from being crawled
Disallow: /wp-register.php
# Prevent trackback URLs from being crawled (a negative-SEO vector)
Disallow: /trackback/
User-agent: *

# Allow AJAX - Do Not Remove
Allow: /wp-admin/admin-ajax.php

Disallow: /wp-admin
Disallow: /wp-login.php
Disallow: /trackback/
Disallow: /wp-register.php

# Add all sitemaps
Sitemap: https://example.com/sitemap_index.xml
Sitemap: https://example.com/post-sitemap.xml
Sitemap: https://example.com/page-sitemap.xml
Sitemap: https://example.com/category-sitemap.xml
Sitemap: https://example.com/post_tag-sitemap.xml
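After editing the file, you can sanity-check that the rules behave as intended before deploying. The sketch below uses Python’s standard-library `urllib.robotparser`; note its limitations: it follows the original Robots Exclusion Protocol, so it has no wildcard support (rules like `Allow: *.js` are ignored) and it treats a blank line as the end of a rule group, which is why the Googlebot group is reproduced here in condensed form without blank lines or wildcard rules. Google’s own parser is more lenient, so the actual boilerplate is fine as written.

```python
from urllib.robotparser import RobotFileParser

# Condensed copy of the boilerplate's Googlebot group. Blank lines and the
# wildcard Allow rules are omitted because urllib.robotparser does not
# support them (it implements the original REP, not Google's extensions).
RULES = """\
User-agent: Googlebot
Allow: /wp-admin/admin-ajax.php
Disallow: /?s=
Disallow: /search/
Disallow: /wp-admin
Disallow: /wp-login.php
Disallow: /wp-register.php
Disallow: /trackback/
"""

parser = RobotFileParser()
parser.parse(RULES.splitlines())

base = "https://example.com"  # placeholder domain, as in the boilerplate
checks = [
    ("/wp-admin/admin-ajax.php", True),   # AJAX endpoint stays crawlable
    ("/?s=shoes", False),                 # search results are blocked
    ("/search/shoes/", False),            # pretty search URLs are blocked
    ("/wp-admin/options.php", False),     # admin area is blocked
    ("/wp-login.php", False),             # login page is blocked
    ("/blog/my-post/", True),             # normal content is unaffected
]
for path, expected in checks:
    allowed = parser.can_fetch("Googlebot", base + path)
    assert allowed == expected, f"{path}: expected {expected}, got {allowed}"

print("all rules behave as expected")
```

For the wildcard rules (`Allow: *.js`, `Disallow: /*/feed/`), use Google Search Console’s robots.txt report instead, since only Google-style parsers evaluate them.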