Enable Custom Robots.txt in Blogger

About the custom robots.txt file:

Robots.txt is a plain text file that tells search engine crawlers which parts of your website or blog they may crawl and index, using Allow and Disallow rules.

Enabling a custom robots.txt in Blogger is very simple. This article walks through the steps needed to enable a custom robots.txt file in Blogger.

Follow these steps to add or enable custom robots.txt in Blogger/Blogspot:

Step 1: Go to blogger.com and log in to your Blogger account.

Step 2: From the left menu, click on "Settings".

Step 3: In Settings, scroll down to the "Crawlers and indexing" section.

Step 4: Turn the "Enable custom robots.txt" toggle on.

Step 5: Click on "Custom robots.txt" >> Copy the robots.txt below and paste it into the text area that appears >> Click the "Save" button.

Note: Before saving the changes, replace <blog address> in the sitemap URLs below with your own blog address, then click Save.

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://<blog address>/atom.xml?redirect=false&start-index=1&max-results=500
Sitemap: https://<blog address>/sitemap.xml
Sitemap: https://<blog address>/sitemap-pages.xml

Verify the changes by visiting "https://<blog address>/robots.txt" in your browser.
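You can also sanity-check that the rules behave as intended before publishing them, using Python's built-in urllib.robotparser. This is a minimal sketch; the blog URL used here is just a placeholder:

```python
from urllib.robotparser import RobotFileParser

# The same User-agent rules as in the custom robots.txt above
rules = """\
User-agent: *
Disallow: /search
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Search/label pages are blocked, regular posts remain crawlable
print(parser.can_fetch("*", "https://example.blogspot.com/search?q=test"))
print(parser.can_fetch("*", "https://example.blogspot.com/2024/01/post.html"))
```

The first check prints False (search result pages are disallowed, which avoids duplicate-content crawling), while the second prints True (ordinary post URLs are allowed).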

Step 6: Turn the "Enable custom robots header tags" toggle on.

Next, click on "Home page tags" >> enable the "all" and "noodp" toggles. Configure as in the screenshot below, then click Save.

Next, click on "Archive and search page tags" >> enable the "noindex" and "noodp" toggles. Configure as in the screenshot below, then click Save.

Next, click on "Post and page tags" >> enable the "all" and "noodp" toggles. Configure as in the screenshot below, then click Save.
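These header tag settings control the robots meta tag that Blogger emits in each page's HTML. As an illustration (the exact markup Blogger generates may differ slightly), an archive page with "noindex" enabled would serve something like:

```html
<meta name="robots" content="noindex"/>
```

This tells crawlers not to include archive and search pages in their index, which helps avoid duplicate content, while posts and the home page (set to "all") remain fully indexable.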

Best Robots Tester Tool:

robots.txt Validator and Testing Tool - test and validate your robots.txt file with this tool before publishing it.