Crawled - currently not indexed - Fake URLs

by @eskandari (112), 1 month ago

We have several URLs in the Pages section of Google Search Console with the status 'Crawled - currently not indexed.' These URLs do not exist, and I have no idea how Googlebot is discovering them. How can I prevent Googlebot from crawling URLs that do not exist?

All of these URLs are fake and have never existed on the website.


by @nayeem359 (-1054), 1 month ago
  1. Use URL Inspection Tool in Google Search Console to identify discovery sources.

  2. Check internal links with tools like Screaming Frog to find references to these URLs.

  3. Audit external backlinks using Ahrefs or SEMrush to locate any external links pointing to these URLs.

  4. Remove outdated URLs from your sitemap and resubmit it in Google Search Console.

  5. Serve proper 404 or 410 responses for nonexistent URLs.

  6. Monitor server logs to trace how Googlebot is crawling these URLs.

  7. Use robots.txt to block crawling of unnecessary patterns if applicable.

  8. Remove problematic URLs using Google Search Console's URL Removal Tool.
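For step 7, a robots.txt rule can stop Googlebot from crawling a junk pattern, though it does not remove already-indexed URLs (a 404/410 response does that over time). A sketch, where the blocked path is a made-up example, not a pattern from the original question:

```
# robots.txt at the site root.
# /fake-section/ is a placeholder for whatever junk pattern you identify.
User-agent: *
Disallow: /fake-section/
```

Prefer this only for URL patterns you never want crawled at all; for one-off phantom URLs, serving 404 or 410 (step 5) is generally the better signal, because a robots.txt block prevents Google from ever seeing the 404.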
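Step 6 above (monitoring server logs) is usually the fastest way to find out where the phantom URLs come from. A minimal sketch, assuming a common combined-format access log; the file name and sample log lines below are illustrative, not from the original post:

```shell
# Hypothetical access log for illustration; in practice this comes from
# your web server (e.g. /var/log/nginx/access.log).
cat > access.log <<'EOF'
66.249.66.1 - - [10/May/2025:06:25:24 +0000] "GET /fake-page HTTP/1.1" 404 "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
203.0.113.5 - - [10/May/2025:06:25:30 +0000] "GET /real-page HTTP/1.1" 200 "Mozilla/5.0"
EOF

# Which paths is Googlebot requesting, and how often?
# $7 is the request path in the combined log format.
grep -i 'Googlebot' access.log | awk '{print $7}' | sort | uniq -c | sort -rn
```

Note that anyone can spoof the Googlebot user agent, so for a definitive answer you would also verify the requesting IP belongs to Google (reverse DNS lookup), but the user-agent filter is a reasonable first pass.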
