How to Fix "Blocked by Robots.txt" Error in Blogger


2. Check Your Blogger Settings

Blogger has a built-in custom robots.txt setting. If it was configured incorrectly, it could be blocking your entire site from being crawled.

  1. Log in to your Blogger Dashboard.
  2. Go to Settings and scroll down to the Crawlers and indexing section.
  3. Check whether Enable custom robots.txt is turned on, and review its contents if it is.

[ INSERT IMAGE 2: Blogger Settings - Crawlers and Indexing Section ]

3. Use the Standard "Safe" Robots.txt

If you want to use a custom robots.txt to ensure your sitemap is submitted correctly, use this standard format:


User-agent: *
Disallow: /search
Allow: /
Sitemap: https://yourblogname.blogspot.com/sitemap.xml
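Before saving, you can sanity-check these rules locally with Python's standard urllib.robotparser module. A minimal sketch (the blog URL is a placeholder for your own, and the example label/post paths are made up):

```python
from urllib.robotparser import RobotFileParser

# The standard "safe" rules from above; the URL is a placeholder.
rules = """\
User-agent: *
Disallow: /search
Allow: /
Sitemap: https://yourblogname.blogspot.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Label and search-result pages live under /search, so they should be blocked...
blocked = parser.can_fetch("*", "https://yourblogname.blogspot.com/search/label/seo")
# ...while ordinary post URLs should remain crawlable.
allowed = parser.can_fetch("*", "https://yourblogname.blogspot.com/2024/01/my-post.html")
print("search pages blocked:", not blocked)
print("posts crawlable:", allowed)
```

If the post URL comes back blocked, your live robots.txt is stricter than this template and is the likely cause of the Search Console error.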

4. Ask Google to Validate the Fix

Once you have updated your settings in Blogger:

  1. Go back to Google Search Console.
  2. Click on the Blocked by robots.txt error.
  3. Click the Validate Fix button.

Google will then re-crawl your site. Note that it can take anywhere from a few days to two weeks for the error to disappear from your dashboard.
