How to Create and Add a Custom robots.txt File in Blogger

If you are new to blogging, you may not know about robots.txt yet. Many new bloggers skip advanced SEO settings for their blogs, so their blogs rank poorly on search engines, and without proper SEO it takes much longer for a blog to become popular.

Read More: How to Add a Popup Download Timer Button in Blogger

A custom robots.txt file is an important part of blog SEO, and the purpose of this post is to explain robots.txt to new bloggers like you, so that you can make your blog search engine friendly.

What Is robots.txt?

Robots.txt is a plain text file that contains a set of rules for crawlers. When you publish a new post, the search engine sends a crawler (also called a spider) to index that post; Google's crawler, for example, is Googlebot. Before crawling, the crawler checks your blog's robots.txt file, and this file tells it which parts of your blog it is allowed to crawl and index.
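To see how a crawler interprets these rules, here is a small sketch using Python's standard urllib.robotparser module. The rules and URLs below are made-up examples in the style of Blogger's default file, not your blog's actual addresses:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt, similar to Blogger's default one
rules = """\
User-agent: *
Disallow: /search
Allow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# The crawler asks: may I fetch this URL?
print(rp.can_fetch("Googlebot", "https://example.com/2024/01/my-post.html"))  # True
print(rp.can_fetch("Googlebot", "https://example.com/search/label/seo"))      # False
```

A well-behaved crawler performs exactly this check before fetching any page: the post URL is allowed, while anything under /search is skipped.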

How Does Custom robots.txt Work?

Custom robots.txt is a set of rules that tells search engines which posts and pages they may crawl and which they may not. In simple words, if you do not want a particular page or post of your blog to be indexed by search engines, you can block it through the custom robots.txt file.

For example, if you have blocked one of your pages, the search engine will not crawl that page and will not show it on the search results page.
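For instance, to block a single static page, you would add a Disallow rule with that page's path. The path below is a hypothetical example (in Blogger, static pages live under /p/):

```
User-agent: *
Disallow: /p/private-page.html
Allow: /
```

Any crawler that honors robots.txt will then skip /p/private-page.html while still crawling the rest of the blog.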

Blogger Custom robots.txt File to Boost SEO

If you have not yet set up a custom robots.txt file in Blogger, you are missing an important part of blog SEO. A Blogger blog contains some pages, such as search, archive, and label pages, that you usually do not want indexed, but crawlers index them by default. That is why it is necessary to set up a custom robots.txt file for your blog.
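In Blogger, search results and label pages both live under the /search path (a label page looks like /search/label/SEO), so a single Disallow rule covers both. A minimal sketch:

```
User-agent: *
Disallow: /search
Allow: /
```

Archive pages do not share that path, so they are typically handled separately, for example through Blogger's custom robots header tags rather than robots.txt.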

How to add Custom robots.txt File in Blogger

When you create a new blog, Blogger automatically generates a default robots.txt file, which is stored on Blogger's server. If you want to see your blog's robots.txt file, you can check it from your blog URL. All you have to do is append /robots.txt to your blog URL and press Enter in the address bar.

For example https://codebarta.com/robots.txt

The code will appear on your screen like this.

User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Allow: /
Sitemap: https://codebarta.com/sitemap.xml

Note: The first two lines of the file above give the AdSense crawler (Mediapartners-Google) access to all of your pages so that it can serve relevant ads. If your blog has Google AdSense approval and you are using AdSense ads, keep these two lines; if you are not using AdSense, you can safely remove them.

User-agent: Mediapartners-Google
Disallow:

Follow these steps to add this code in your Blogger settings.

Step #1. Go to your Blogger dashboard.

Step #2. Go to Settings → Click on Search Preferences → Go to Crawler and Indexing Section.

Step #3. Click on the Edit link in Custom robots.txt.

Step #4. Select Yes to enable custom robots.txt.

Step #5. A text box will appear; paste your custom robots.txt code into this box.

Step #6. Click the Save changes button.

I hope you have now learned how to add a custom robots.txt file in Blogger. If you face any problem setting up custom robots.txt in your Blogger blog, or if you have any other questions about blogging, you can ask in the comments.
