How to Write an SEO Friendly Robots.txt File for WordPress


What is robots.txt? Robots.txt is a text file that website owners can create to tell search engine bots how to crawl and index pages on their site.

It is typically stored in the root directory, also known as the main folder, of your website.
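Because it sits in the root, you can view any site's robots.txt by opening it directly in the browser, for example https://example.com/robots.txt. Here is a minimal Python sketch that does the same thing from the command line (standard library only; example.com is just a placeholder for your own domain):

from urllib.request import urlopen

# robots.txt always sits at the root of the site.
# Replace example.com with your own domain before running this.
with urlopen("https://example.com/robots.txt") as response:
    print(response.read().decode("utf-8"))
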
The basic format for a robots.txt file looks like this:

User-agent: [user-agent name]
Disallow: [URL string not to be crawled]
 
User-agent: [user-agent name]
Allow: [URL string to be crawled]

Sitemap: [URL of your XML Sitemap]


Here is what a robots.txt example file can look like:

User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/

Sitemap: https://example.com/sitemap_index.xml


For WordPress sites, we recommend the following rules in the robots.txt file:

User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
Disallow: /readme.html
Disallow: /refer/

Sitemap: http://www.example.com/post-sitemap.xml
Sitemap: http://www.example.com/page-sitemap.xml


This tells search bots to index all WordPress images and files in the uploads folder. It disallows search bots from indexing the WordPress plugin files, the WordPress admin area, the WordPress readme file, and affiliate links.

By adding your sitemaps to the robots.txt file, you make it easy for Google's bots to find all the pages on your site.
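If you want to double-check how a crawler reads these rules, Python's built-in robots.txt parser gives a quick way to test them. This is only an illustrative sketch: the Googlebot user agent and the sample paths below are assumptions for testing, not part of the recommended file itself.

from urllib.robotparser import RobotFileParser

# The recommended WordPress rules from above, pasted into a string.
rules = """\
User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
Disallow: /readme.html
Disallow: /refer/

Sitemap: http://www.example.com/post-sitemap.xml
Sitemap: http://www.example.com/page-sitemap.xml
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Check a few sample paths (hypothetical URLs on the placeholder domain).
for path in ("/wp-content/uploads/logo.png",
             "/wp-content/plugins/akismet/akismet.php",
             "/wp-admin/",
             "/my-blog-post/"):
    allowed = parser.can_fetch("Googlebot", "http://www.example.com" + path)
    print(path, "->", "allowed" if allowed else "blocked")

# site_maps() (Python 3.8+) returns the Sitemap URLs listed in the file.
print(parser.site_maps())

Running this should show the uploads folder and normal posts as allowed, while the plugins folder, admin area, readme file, and /refer/ links come back blocked, and the two sitemap URLs are listed at the end.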

I hope this article helped you learn how to write an SEO friendly robots.txt file for your WordPress site.
