
Submitting a large sitemap dropped my rankings

by @rasmus_freeway (116), 1 year ago

I work as an SEO for an online book store with a very, very large selection of books, many of which were never in the Google index, simply because the crawl rate was too low and the site is too large.

To fix this, I submitted a very large sitemap containing all books. Many books come in several formats (e-book, audio, paperback, etc.), so there are duplicates. However, these all have canonical tags pointing to the primary entry of the book.
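For reference, the canonical setup described above would look something like this on a duplicate format page (the URLs are purely illustrative):

```html
<!-- On a format variant page, e.g. https://example.com/books/some-title/paperback -->
<!-- pointing at the primary book entry (hypothetical URL): -->
<link rel="canonical" href="https://example.com/books/some-title" />
```

Worth noting: Google treats rel=canonical as a strong hint rather than a directive, so duplicate URLs listed in a sitemap can still be crawled, and Google may occasionally pick a different canonical than the one declared.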

After submitting the sitemap, my crawl budget exploded: Google now crawls 200-300K pages per day. My sitemap contains 800K URLs.

Now I am seeing my books drop like rocks in the rankings. All other subpages are still running strong.

What should I do here, now that the cat is out of the bag? Is it "just" a fluctuation that will bounce back, or do I need to do something drastic?

Hoping for some guidance here :)

Best regards, Rasmus

by @Aatmia (5), 1 year ago

Many factors can cause your rankings to drop in Google. Most frequently, we see drops because of changes that were made to the website, but they can also be caused by an algorithm update, technical issues, improvements competitors made, SERP layout changes, or a Google penalty.

by @jaap (1010), 1 year ago

Another reason can be a sitemap bug.

by @mehmsv (7), 1 year ago

Best practice is to keep individual sitemaps short, e.g. around 1,000 records each. Try using more than one sitemap, like this: sitemap-post1.xml, sitemap-post2.xml, ...
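The split sitemaps can then be tied together with a single sitemap index file, which is what you submit in Search Console. A minimal sketch (filenames and domain here are just placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap_index.xml: lists the individual sitemap files -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/sitemap-post1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap-post2.xml</loc>
  </sitemap>
</sitemapindex>
```

In any case, the sitemaps.org protocol caps a single sitemap file at 50,000 URLs and 50 MB uncompressed, so an 800K-URL sitemap has to be split across multiple files regardless.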

best.
