Unique Robots.txt File for Bloggers

Anyone can blog – it's fun! But if you want to get into professional blogging, you will need to learn some technical details related to search engine optimization (SEO). New bloggers often begin their blogging journey by publishing content on Blogger/Blogspot. Enabling a custom robots.txt file on these platforms is very important, and people often ask what the best custom robots.txt settings for their blog would be. Let's find out!

Robots.txt is a plain text file that tells web crawlers (also known as spiders or robots) which parts of a website or blog should be crawled and which parts should not.

The Importance of Robots.txt File

The success of a professional blog largely depends on how Google ranks it. A website's structure holds many posts, pages, files, and directories, and we usually do not want Google to index all of them. For example, you may have a file meant for internal use that is of no value to a search engine, and you would not want it to appear in search results. It is therefore prudent to hide such files from search engines.

A robots.txt file contains directives that all major search engines honor. Using these directives, you can instruct web crawlers to ignore specific sections of your website or blog.

Unique Robots.txt for Bloggers

Because Blogger/Blogspot is a free blogging service, the robots.txt file of your blog was not originally under your control. Blogger has since made it possible to edit and create a custom robots.txt for every blog. The robots.txt for a Blogger/Blogspot blog usually looks like this:

User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Allow: /
Sitemap: http://example.blogspot.com/feeds/posts/default?orderby=UPDATED
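If you want to check how these default rules behave before changing them, Python's standard `urllib.robotparser` module can evaluate a robots.txt against sample URLs. This is just a sketch for experimenting; the example.blogspot.com URLs are placeholders.

```python
import urllib.robotparser

# The Blogger default rules from above, as a string.
rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
Sitemap: http://example.blogspot.com/feeds/posts/default?orderby=UPDATED
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# Ordinary crawlers match the "*" entry: /search is blocked, posts are not.
print(parser.can_fetch("Googlebot", "http://example.blogspot.com/search/label/news"))    # False
print(parser.can_fetch("Googlebot", "http://example.blogspot.com/2021/07/my-post.html")) # True

# Mediapartners-Google (the AdSense crawler) has its own entry with an empty
# Disallow, which means it may crawl everything.
print(parser.can_fetch("Mediapartners-Google", "http://example.blogspot.com/search/label/news"))  # True
```

This shows why the default file blocks Blogger's /search pages (label and search-result listings) for regular crawlers while leaving ordinary posts crawlable.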

Add Custom Robots.txt File on Bloggers

  • Go to your Blogger dashboard
  • Open Settings > Search Preferences > Crawlers and indexing > Custom robots.txt > Edit > Yes
  • Make your changes to the robots.txt file here
  • After making changes, click the Save Changes button

View the Existing Custom Robots.txt File

To view the existing custom robots.txt file for your blog, go to the following URL:

http://www.yourblog.blogspot.com/robots.txt

Needless to say, please replace yourblog with the name of your blog.

Attributes of Custom Robots.txt File

The Custom Robots.txt file supports several simple directives. Here is a basic explanation of each, so that you can make informed changes to your file.

Wildcards

The following wildcard characters are often used in robots.txt files:

  • * means all, everything
  • / means the root directory

User-agent

This directive specifies which web crawlers the settings that follow it apply to.

Disallow

It directs web crawlers not to crawl the indicated directory or file. For example:

Disallow: / would tell web crawlers not to crawl anything on your blog (because you are disallowing the root directory).

Disallow: /dir/* would direct web crawlers not to crawl any file under the /dir/ directory.

Disallow: /dir/myfile.htm would direct web crawlers not to crawl myfile.htm in the dir folder; crawlers will still crawl all other files under the dir directory.
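You can verify prefix-style Disallow rules with the same stdlib parser. One caveat: `urllib.robotparser` does not implement the mid-path * wildcard (that is a search-engine extension), so this sketch only uses plain path prefixes; the paths are made up for illustration.

```python
import urllib.robotparser

rules = """\
User-agent: *
Disallow: /dir/myfile.htm
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# Only the named file is blocked; siblings in /dir/ remain crawlable.
print(parser.can_fetch("*", "/dir/myfile.htm"))  # False
print(parser.can_fetch("*", "/dir/other.htm"))   # True
```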

Allow

This directive explicitly permits a web crawler to crawl a particular directory or file. For example:

Disallow: /dir/myfile.htm

Allow: /dir/myfile.htm

Taken together, the two lines above mean that the crawler should crawl /dir/myfile.htm: the first line bars the crawler, but the second line allows it again.
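A quick sanity check of an Allow exception, again with `urllib.robotparser`. Note a difference in semantics: Google resolves conflicting rules by picking the most specific match regardless of order, while Python's stdlib parser simply applies the first matching rule, which is why the Allow line is listed first in this sketch (the paths are illustrative).

```python
import urllib.robotparser

rules = """\
User-agent: *
Allow: /dir/myfile.htm
Disallow: /dir/
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# The Allow line carves one file out of an otherwise blocked directory.
print(parser.can_fetch("*", "/dir/myfile.htm"))   # True
print(parser.can_fetch("*", "/dir/another.htm"))  # False
```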

Sitemap

A sitemap is a very important file for your website or blog: it describes the structure of your site and helps web crawlers find their way through it. The Sitemap: directive tells crawlers where your sitemap file is located. On Blogger/Blogspot, you can leave this line as it is.
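Crawling tools can read this directive too. For example, Python's `urllib.robotparser` exposes it via `site_maps()` (available since Python 3.8); the sitemap URL below is the Blogger placeholder from the sample file above.

```python
import urllib.robotparser

rules = """\
User-agent: *
Disallow: /search
Sitemap: http://example.blogspot.com/feeds/posts/default?orderby=UPDATED
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# site_maps() returns the list of Sitemap URLs declared in the file.
print(parser.site_maps())
```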
