
What are robots.txt files? How to optimize your website with robots.txt files?

A robots.txt file tells search engines whether they may crawl a particular part of your website or not. The file is placed in the root directory of the website and is used to keep crawlers away from sections that are of no use to your visitors.
For your website to be SEO friendly, you must edit this file very carefully. A wrong change can adversely affect your website's search engine optimization. So if you are not sure how to edit it, read this article first or leave the default settings as they are.

Robots Exclusion Standard:

This is a convention for telling web spiders and crawlers, such as Googlebot, not to access parts of a website that would otherwise be publicly reachable.

How to optimize a website using robots.txt files:

As explained above, the robots.txt file controls which parts of your website search engines crawl. If you want a search-engine-friendly robots.txt file, you can create and test it in Google Webmaster Tools. If your website has subdomains, you will have to create a separate robots.txt file for each subdomain.
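A robots.txt file only applies to the host it is served from, so each subdomain needs its own copy at its root. For example (hypothetical domains):

http://www.example.com/robots.txt
http://blog.example.com/robots.txt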

Proper optimization of the robots.txt file can boost your website's reputation, whereas a mistake in it can kill your website's search visibility.


How to create a robots.txt file?

A robots.txt file is a simple text file. Let's go through the steps to create one.

  • Open Notepad or any other text editor.
  • Here is an example of a robots.txt file:

User-agent: Googlebot
Disallow: /cgi-bin

The above lines tell Googlebot that it may crawl everything except the /cgi-bin folder in the root directory. In the same way, you can restrict crawling of anything you don't want Google to index.
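If you want to see how such rules behave before uploading them, Python's standard library ships a robots.txt parser. This is a minimal sketch, assuming hypothetical URLs on example.com:

from urllib.robotparser import RobotFileParser

# The same two rules as in the example above.
rules = [
    "User-agent: Googlebot",
    "Disallow: /cgi-bin",
]

rp = RobotFileParser()
rp.parse(rules)

# /cgi-bin is blocked for Googlebot; everything else is allowed.
print(rp.can_fetch("Googlebot", "http://www.example.com/cgi-bin/search.cgi"))  # False
print(rp.can_fetch("Googlebot", "http://www.example.com/about.html"))          # True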

What mistakes can you make?

You can run into many kinds of problems while creating robots.txt files. The more complex the file becomes, the easier it is to introduce mistakes if you don't take precautions. Typos and contradicting directives are the main problems.

  • Typos include misspelled user-agents, missing colons, misspelled directory paths, and so on.
  • Comments are allowed, but every comment must start with a # character, or crawlers may misread the line (see the snippet below).
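For example, a correctly commented entry looks like this:

# keep crawlers out of the scripts folder
User-agent: *
Disallow: /cgi-bin/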

Some common mistakes:

Unwanted spaces in directives and user-agent lines:

Problem 1:

User-agent: *
Dis allow: /support

After correcting it:

User-agent: *
Disallow: /support

Problem 2:

User-agent: *
Disallow: /support /cgi-bin /images/

After correcting it:

User-agent: *
Disallow: /support
Disallow: /cgi-bin
Disallow: /images/

Use uppercase and lowercase letters correctly: URL paths in robots.txt are case-sensitive, so /Images/ and /images/ are treated as different directories.
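For example (hypothetical folder names), only the capitalized path is blocked here:

User-agent: *
# blocks /Images/photo.jpg but NOT /images/photo.jpg
Disallow: /Images/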

Robots.txt file for WordPress: an optimized sample

You can edit the robots.txt file through your FTP account, or use plugins such as Edit Robots.txt or Robots Meta. Adding your sitemap URL to the robots.txt file helps search engine bots find your sitemap, which means faster indexing of your pages.

Sample robots.txt file:

Note: Replace the sitemap URL below with your own sitemap URL.

 

Sitemap: http://www.allbloggingways.com/sitemap_index.xml

User-agent: *
# disallow all files in these directories
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/
Disallow: /archives/
Disallow: /*?*
Disallow: *?replytocom
Disallow: /wp-*
Disallow: /author
Disallow: /comments/feed/

User-agent: Mediapartners-Google*
Allow: /

User-agent: Googlebot-Image
Allow: /wp-content/uploads/

User-agent: Adsbot-Google
Allow: /

User-agent: Googlebot-Mobile
Allow: /
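A note on the Allow: / groups at the end: each User-agent block stands on its own, and a Google crawler follows only the most specific group that matches it. So while the * group blocks directories such as /wp-content/ for ordinary crawlers, the separate groups open the site back up for Google's AdSense crawler (Mediapartners-Google), image crawler, ads bot, and mobile crawler.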

 

How to check the impact of the robots.txt file?

After making changes to the robots.txt file, verify them. Go to Google Webmaster Tools and use the 'Fetch as Google' tool to see whether your content is blocked by the robots.txt file. Add a link to one of your posts and check whether it gets fetched; if not, there is a problem with your robots.txt file.
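You can also run a similar check yourself with the robots.txt parser in Python's standard library. This is a minimal sketch, assuming a hypothetical domain and post URL (note that this parser follows the original Robots Exclusion Standard and ignores Google-style * wildcards):

from urllib.robotparser import RobotFileParser

# Replace these hypothetical URLs with your own domain and post link.
rp = RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")
rp.read()  # downloads and parses the live robots.txt

post_url = "http://www.example.com/2014/05/sample-post/"
if rp.can_fetch("Googlebot", post_url):
    print("Googlebot can fetch this post.")
else:
    print("Blocked: check your robots.txt rules.")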

You can also check for errors through the Crawl Errors option in Google Webmaster Tools.

So, these are the steps that will help you optimize your website with robots.txt files.

 

About Vishal Sharma

Hi, I'm +Vishal, a search engine optimizer, internet marketer, web designer, affiliate marketer, and founder of this blog. I love to learn and share new things. For any query or help, you can contact me at any time.
