
How to add custom robots.txt file in Blogger

A step-by-step guide on how to create a custom robots.txt file for your blog and how to add it to your Blogger blog.

Have you created a custom robots.txt file for your blog? Have you added that robots.txt file to your Blogger blog? If not, read on – in this post I’m going to discuss what a robots.txt file is and how to create and add a custom robots.txt file to your Blogger blog.

What is robots.txt?

Robots.txt is a file that tells search engine crawlers which pages to crawl and which to skip. Search engine bots read the robots.txt file’s instructions before crawling any webpage.

On Blogger, every blog has a unique robots.txt file by default. To check your website’s robots.txt file, just visit this link – www.yourdomain.com/robots.txt [replace yourdomain with your domain name].

By default, a Blogger blog’s robots.txt file will look like this –

[xml]User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://www.yourdomain.com/feeds/posts/default?orderby=UPDATED[/xml]

Let’s look at each line of the robots.txt file above.

User-agent: Mediapartners-Google : This line is for Google AdSense. If you are a Google AdSense publisher and show AdSense ads on your blog, keep this line in your robots.txt file. It allows the AdSense crawler to crawl your blog so it can show the most relevant ads. If you are not an AdSense publisher, you can remove this line.

Disallow: Disallow specifies which pages or categories you want to block from crawling. An empty Disallow: line blocks nothing – you can use Allow: / instead, as both mean the same thing and both are AdSense friendly.

User-agent: * The asterisk (*) means the rules that follow apply to all search engine bots.

Disallow: /search This blocks search engine bots from crawling /search pages (label and search result pages).

Allow: / This allows search engine bots to crawl all other pages.

Sitemap: This line points to your Blogger blog’s default sitemap. You can learn more in our sitemap section.
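If you want to verify how these directives behave before relying on them, here is a quick sketch using Python’s standard urllib.robotparser module to test the default Blogger rules above (the URLs are placeholders – substitute your own domain):

```python
from urllib.robotparser import RobotFileParser

# The default Blogger robots.txt rules shown above
rules = """User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# /search pages are blocked for normal search engine bots
print(rp.can_fetch("Googlebot", "http://www.yourdomain.com/search/label/SEO"))      # False

# ...but ordinary posts are allowed
print(rp.can_fetch("Googlebot", "http://www.yourdomain.com/2020/01/post-title.html"))  # True

# The AdSense crawler has its own entry with an empty Disallow, so it can crawl everything
print(rp.can_fetch("Mediapartners-Google", "http://www.yourdomain.com/search/label/SEO"))  # True
```

This is handy for double-checking a custom robots.txt before publishing it, since a typo in a Disallow line can accidentally block your whole blog.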

Which pages should I block?

Normally, a Blogger blog’s label pages are blocked from search engine bots by default. Indexing category pages in search results can cause duplicate content issues. You can also block other pages from search engine bots, such as your privacy policy or terms and conditions page.
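For example, to block a privacy policy page and a terms and conditions page, the User-agent: * block would look like this (the /p/ paths below are placeholders – use the actual URLs of your own pages):

[xml]User-agent: *
Disallow: /search
Disallow: /p/privacy-policy.html
Disallow: /p/terms-and-conditions.html
Allow: /[/xml]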

How to block a particular post or page from crawling?

Let’s take an example post link and page link,

www.yourdomain.com/yyyy/mm/post-title.html

www.yourdomain.com/p/page-title.html

To block a post or page from search engine crawling, just copy the part of the URL after your domain name and place that piece of the URL after a Disallow: directive in your robots.txt file.

For posts – Disallow: /yyyy/mm/post-title.html

For pages – Disallow: /p/page-title.html
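Putting those two steps together, a User-agent: * block that blocks one example post and one example page (placeholder URLs – replace them with your own) while still allowing the rest of the blog would look like –

[xml]User-agent: *
Disallow: /search
Disallow: /yyyy/mm/post-title.html
Disallow: /p/page-title.html
Allow: /[/xml]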

How to add a custom robots.txt file in a Blogger blog?

Log into your Blogger account, then go to Dashboard >> Settings >> Search preferences >> Crawlers and indexing >> Custom robots.txt. Now edit your robots.txt file and add the code below.


[xml]User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://www.yourdomain.com/atom.xml?redirect=false&start-index=1&max-results=500[/xml]

In this robots.txt file I made one change: I added a custom sitemap instead of the default one. For more about sitemaps, go here.

Note – don’t forget to replace yourdomain with your own domain name.

Hope this post helps you understand what robots.txt is, how it works, and how to customize it. Do share this post on social networking sites. If you have any query, feel free to leave a comment below. Take care of yourself and your family. :)
