How to create a robots.txt file for your website (2020)

If you want your website to perform well, you need to learn how to create a robots.txt file.

The robots.txt file is just a text file containing special directives that help improve website performance by blocking crawlers from unnecessary pages.

The robots.txt file helps websites with a large amount of content more than simple ones.

Why do you need to create a robots.txt file?

The robots.txt file improves the SEO of your website as follows:

  • It prevents search engines from indexing the unimportant pages on your website.

Unimportant pages include the admin page, plugin files, and theme folders.

  • This may not matter for new websites, but it is necessary for big websites, as they have a large number of pages.

So you should let search engines crawl only the important pages, so that your website is indexed faster.
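To give an idea of what such a file looks like, here is a minimal sketch in robots.txt syntax. The paths below (admin page, plugin files, theme folders) are placeholder examples, not the exact paths on your site:

```
# Apply these rules to all crawlers
User-agent: *

# Block the unimportant sections mentioned above (example paths)
Disallow: /admin/
Disallow: /plugins/
Disallow: /themes/

# Everything not disallowed stays crawlable by default
```

Each `Disallow` line tells crawlers to skip that path, so their crawl budget is spent on the pages that matter.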

How to create a robots.txt file for WordPress:

If your WordPress site has the Yoast SEO plugin:

  • Log in to your WordPress dashboard.
  • From the “SEO” menu, choose “Tools”.
  • Click on “Create robots.txt file”.
  • The default rule in the generated robots.txt file disallows crawling of all your pages, so it needs to be edited.
  • Open the robots.txt file, set your own rules, and save it.
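As an example of rules you might set when editing, a common WordPress rule set looks like the sketch below. The paths are standard WordPress folders, but adjust them to your own site:

```
User-agent: *
# Keep crawlers out of the WordPress admin area...
Disallow: /wp-admin/
# ...but keep admin-ajax.php reachable, since some themes and plugins rely on it
Allow: /wp-admin/admin-ajax.php
```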

If your WordPress site has the All in One SEO plugin:

  • Go to “All in One SEO” from the main menu.
  • From the dashboard, go to “Feature Manager”.
  • You will find robots.txt as one of the options on the page; click “Activate”.
  • Another menu will open; from this menu you can add your own rules.
  • Adding new rules is easy, as there is a dedicated place to type your code and a button to save it.
  • The All in One SEO plugin can automatically prevent search engines from crawling unimportant pages.

How to create a robots.txt file for WordPress manually:

  • Create a new text file.
  • Type your rules.
  • Save the file with the “.txt” extension.
  • To upload any file to your website, you need to connect to the server via FTP (File Transfer Protocol).
  • After connecting, go to the “public_html” folder.
  • Now you can upload your robots.txt file to the server from your PC.
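The manual steps above can be sketched in Python using the standard-library `ftplib` module. The host name, credentials, and rules below are placeholders for illustration, so replace them with your own before running:

```python
from ftplib import FTP

# Steps 1-3: create the robots.txt file locally with your rules
rules = (
    "User-agent: *\n"
    "Disallow: /wp-admin/\n"
    "Allow: /wp-admin/admin-ajax.php\n"
)
with open("robots.txt", "w") as f:
    f.write(rules)

# Steps 4-6: connect over FTP and upload the file into public_html
# (hypothetical host and credentials -- fill in your own)
def upload(host, user, password):
    with FTP(host) as ftp:
        ftp.login(user, password)
        ftp.cwd("public_html")          # the web root folder
        with open("robots.txt", "rb") as f:
            ftp.storbinary("STOR robots.txt", f)

# upload("ftp.example.com", "username", "password")
```

After the upload, the file should be reachable at the root of your domain, e.g. `https://yourdomain.com/robots.txt`.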

How to create a robots.txt file in cPanel

There are a few steps to follow to create a robots.txt file in cPanel:

  • First, log in to your account.
  • From the file manager, choose a domain.
  • In the website folder, select “New File”.
  • Name it “robots.txt”.
  • Now select “Create New File”.
  • The file is ready to be edited.

You can create only one robots.txt file per domain.

How to create a robots.txt file for Blogger:

  • Log in to your Blogger account.
  • Open “Settings”.
  • Click on “Search Preferences”.
  • From search preferences, select “Crawlers and indexing”.
  • Edit the robots.txt file by clicking on “Custom robots.txt”.
  • Now type your own rules, or ask a specialist to write rules that block crawlers from the unnecessary pages.
  • After you finish editing, save the robots.txt file.
  • Your blog will now perform better, with a faster indexing rate.
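As one illustration of a custom rule set often used on Blogger, the sketch below blocks the `/search` pages (label and search results that duplicate post content). The sitemap URL is a placeholder for your own blog's address:

```
User-agent: *
# Blogger search/label pages duplicate post content, so keep crawlers out
Disallow: /search
Allow: /

# Point crawlers at the sitemap (replace with your blog's URL)
Sitemap: https://yourblog.blogspot.com/sitemap.xml
```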

