
Blocked by robots.txt Error in Google search console

by @suraj (114), 1 year ago

In my website's Google Search Console, there is an error named "Blocked by robots.txt". When I check the affected URLs, pages like the one below show up. Could anyone please tell me how to resolve this error?

Example -

mywebsite.com/ScriptResource.axd?d=X4yq23xsXYY9Rjrbf2lHgemYH8LFIOTU7u-IAa7qVYuYs3Vl2ePnSNW_PRUE7ILjuotk9anq3X5jXnRawkSm6n5ZcBWOT1LlMWWCYSeoyL4HUg5BHCc6kTC1GcrXsuPi9ZghAqzn4bXVjgdb9fwGhhS6dwyfYvABZr1LF8fZgNZ6JpFvVni_LPQ_kyr58Yah0&t=49337fe8

by @VOCSO Technologies (105), 1 year ago

If the pages aren't explicitly blocked in the robots.txt file, it's possible they are being excluded because of other issues on your website. For instance, a noindex tag, or a rel="canonical" element pointing to a different URL, can keep pages out of Google's index. You can check for these problems by viewing the source code of the affected pages and verifying that no tags are stopping Google from crawling and indexing them.
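For example, tags like these in a page's source would keep it out of the index even when robots.txt allows crawling (the URL below is a hypothetical placeholder):

```html
<!-- Tells Google not to index this page at all -->
<meta name="robots" content="noindex">

<!-- Tells Google a different URL is the preferred version of this page -->
<link rel="canonical" href="https://mywebsite.com/preferred-page/">
```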

In the specific example you provided, the URL points to ScriptResource.axd, an ASP.NET script handler rather than a normal page, and it is likely being matched by a rule in your robots.txt file. You should review the rules in robots.txt to confirm whether these .axd resources are being blocked, and decide whether that blocking is intentional.
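As an illustration (this is a hypothetical robots.txt, not necessarily what your site contains), a wildcard rule like the Disallow below would block the reported URL; since Google honors the most specific matching rule, an Allow line can re-open just those handlers:

```
User-agent: *
# A rule like this would block /ScriptResource.axd?... URLs:
Disallow: /*.axd

# More specific Allow rules win, letting Googlebot fetch ASP.NET script resources:
Allow: /ScriptResource.axd
Allow: /WebResource.axd
```

After editing robots.txt, you can verify the result with the robots.txt report in Search Console.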

binayjha
by @binayjha (4749), 1 year ago

Can you share a screenshot of the issue so we can locate the problem more precisely? The information you have provided so far needs more clarification before the issue can be resolved.
