WordPress Robots.txt Optimization for SEO

As a beginner on WordPress, you might not understand what SEO is, let alone how to optimize robots.txt for search engine optimization. Robots.txt is the connection between your site and search engines. It tells search engine bots, also known as spiders or crawlers, how to crawl your site.

What is Robots.txt?

Robots.txt is a standard used by websites to communicate with web crawlers and other web robots. The standard specifies how to inform the web robot about which areas of the website should not be processed or scanned.

What is a Robots.txt File?

A robots.txt file tells search engine crawlers which pages or files the crawler can or can’t request from your site. In other words, it is a text file that website owners can create to tell search engine bots how to crawl and index pages on their site.
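
The file always lives at the root of your domain (yoursite.com/robots.txt), and any browser, crawler, or short script can read it from there. Here is a minimal sketch in Python that fetches and prints a robots.txt file; the domain is a placeholder for your own site.

from urllib.request import urlopen

# Placeholder domain: every site exposes its robots.txt at the root path /robots.txt.
with urlopen("https://www.ursitename.com/robots.txt") as response:
    print(response.read().decode("utf-8"))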

Why Do You Need a Robots.txt File for Your WordPress Site?

  • It optimizes search engines’ crawl resources by telling them not to waste time on pages you don’t want indexed. This helps ensure that search engines focus on crawling the pages you care about the most.
  • It optimizes your server usage by blocking bots that waste resources.
  • It gives your site a direct channel to search engines; without it, you cannot pass crawl instructions to them, and pointing bots to your sitemap from it can also help with faster indexing.

If you don’t have a robots.txt file, then search engines will still crawl and index your website. However, you will not be able to tell search engines which pages or folders they should not crawl.

What Should Your Robots.txt File Look Like?

Many popular blogs use a very simple robots.txt file. For example, Rank Math SEO uses:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://www.ursitename.com/sitemap_index.xml

While Yoast SEO uses:

User-agent: *
Disallow:

Sitemap: http://www.ursitename.com/post-sitemap.xml
Sitemap: http://www.ursitename.com/page-sitemap.xml

But here is what we recommend:

User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
Disallow: /refer/

Sitemap: http://www.ursitename.com/post-sitemap.xml
Sitemap: http://www.ursitename.com/page-sitemap.xml

What this says is that it disallows search bots from crawling WordPress plugin files, the WordPress admin area, and referral links, while still allowing them to crawl your uploads folder.

By adding sitemaps to robots.txt file, you make it easy for search engine bots to find all the pages on your site.
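
If you want to sanity-check rules like these before uploading them, here is a minimal sketch using Python’s standard urllib.robotparser module. The domain and example paths are placeholders, not part of the recommended file; the script simply reports how a crawler that honors robots.txt would treat each path under the rules above.

from urllib.robotparser import RobotFileParser

# The recommended rules from above, kept as a plain string.
rules = """\
User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
Disallow: /refer/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Placeholder domain and paths: replace with your own site and URLs.
for path in ("/wp-content/uploads/logo.png",
             "/wp-content/plugins/some-plugin/style.css",
             "/wp-admin/",
             "/refer/hosting-deal",
             "/a-normal-blog-post/"):
    allowed = parser.can_fetch("Googlebot", "https://www.ursitename.com" + path)
    print(path, "->", "allowed" if allowed else "blocked")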

How to Create a Robots.txt File in WordPress?

There are different ways to create a robots.txt file in WordPress: either with SEO plugins or by editing the file manually.

Robots.txt using SEO Plugins

Robots.txt using Rank Math SEO

If you are using the Rank Math SEO plugin, it comes with a built-in robots.txt file generator.

You can use it to create and edit a robots.txt file directly from your WordPress admin area.

Simply go to Rank Math » General Settings » Edit robots.txt

By default, Rank Math SEO’s robots.txt generator adds a robots.txt file for you, but you can edit it to add your own rules. Once you’re done, don’t forget to click on the ‘Save robots.txt file’ button to store your changes.

Robots.txt using Yoast SEO

Yoast SEO also lets you create and edit a robots.txt file directly from your WordPress admin area.

Simply go to the SEO » Tools page in your WordPress admin and click on the File Editor link.

The Yoast SEO page will show your existing robots.txt file. If you don’t have one, you will get the option to generate one. You can then go ahead and add your own robots.txt rules; we recommend using the ideal robots.txt format we shared above. Once done, save your changes.

Robots.txt using All in One SEO

The robots.txt module in All in One SEO lets you create and manage a robots.txt file for your site that will override the default robots.txt file that WordPress creates.

By creating a robots.txt file with All in One SEO you have greater control over the instructions you give web crawlers about your site.

Go to Tools in the All in One SEO menu. You should see the Robots.txt Editor with an ‘Enable Custom Robots.txt’ toggle. Click the toggle to enable the custom robots.txt editor.
You should see the Robots.txt Preview section at the bottom of the screen, which shows the default rules added by WordPress.

Robots.txt Manually Using FTP

For this, you will need an FTP client and your website hosting (FTP) credentials. Once connected, you will be able to see the robots.txt file in your website’s root folder. If you can’t find it, create a plain text file named robots.txt on your computer and add your rules to it. After saving your changes, upload it back to your website’s root folder.
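
If you are comfortable with a little scripting, the upload step can also be done with Python’s built-in ftplib. This is only a minimal sketch under a few assumptions: the host, username, password, and root folder below are placeholders for your own hosting details, and if your host only offers SFTP you will need a different tool (such as a desktop FTP client).

from ftplib import FTP

# Placeholder credentials: use the FTP details from your hosting account.
HOST = "ftp.ursitename.com"
USER = "your-ftp-username"
PASSWORD = "your-ftp-password"

with FTP(HOST) as ftp:
    ftp.login(USER, PASSWORD)
    # The web root is often called public_html; adjust to match your host.
    ftp.cwd("/public_html")
    # Upload the robots.txt file you edited locally.
    with open("robots.txt", "rb") as local_file:
        ftp.storbinary("STOR robots.txt", local_file)
    # List the file to confirm the upload worked.
    ftp.retrlines("LIST robots.txt")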

How to Test Your Custom Robots.txt File

Once you have created your robots.txt file, it’s always a good idea to test it using a robots.txt tester tool.

There are many robots.txt tester tools out there, but we recommend using the one inside Google Search Console.

  • Open the tester tool for your site, and scroll through the robots.txt code to locate the highlighted syntax warnings and logic errors. The number of syntax warnings and logic errors is shown immediately below the editor.
  • Type in the URL of a page on your site in the text box at the bottom of the page.
  • Select the user-agent you want to simulate in the dropdown list to the right of the text box.
  • Click the TEST button to test access.
  • Check to see if the TEST button now reads ACCEPTED or BLOCKED to find out if the URL you entered is blocked from Google web crawlers.
  • Edit the file on the page and retest as necessary. Note that changes made in the page are not saved to your site! See the next step.
  • Copy your changes to your robots.txt file on your site. This tool does not make changes to the actual file on your site, it only tests against the copy hosted in the tool.
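
If you would rather run a quick check from your own computer as well, here is a minimal sketch using Python’s standard urllib.robotparser module; the domain, page, and user-agents are placeholders. It downloads the live robots.txt from your site and reports whether a given page is allowed for each crawler, which is a handy complement to the Search Console tester.

from urllib.robotparser import RobotFileParser

# Placeholder URLs: replace with your own domain and the page you want to check.
parser = RobotFileParser("https://www.ursitename.com/robots.txt")
parser.read()  # fetches the live robots.txt over HTTP

page = "https://www.ursitename.com/wp-admin/"
for agent in ("Googlebot", "Bingbot", "*"):
    verdict = "allowed" if parser.can_fetch(agent, page) else "blocked"
    print(agent, "->", verdict)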

The reason you optimize robots.txt is to let search engines spend their crawl budget on the pages of your website that matter most. We recommend that you follow the robots.txt format above when creating a robots.txt file for your website.

If you have any issues with this, you can contact us on Facebook or use our comment section.
