
GSC indexing question

by @toastdesign (112), 1 month ago

Just published a large WP site for a client. Despite my advice against it, the client asked us to automatically generate a whole load of URLs with very similar content bar a few words and paragraphs (effectively low-value duplicate content).

The site went live with 75,000 URLs, of which 74,000 are effectively duplicates, but they ARE canonically linked back to top-level pages of the site.
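(For reference, a quick way to spot-check that the canonical tags are actually present in the rendered HTML and point where you expect - a minimal Python sketch using the requests library; the URLs below are placeholders, not the real site:)

    import requests
    from html.parser import HTMLParser

    class CanonicalParser(HTMLParser):
        """Grabs the href of the first <link rel="canonical"> in a page."""
        def __init__(self):
            super().__init__()
            self.canonical = None
        def handle_starttag(self, tag, attrs):
            if tag == "link":
                a = dict(attrs)
                if a.get("rel") == "canonical" and self.canonical is None:
                    self.canonical = a.get("href")

    # Placeholder URLs - swap in real generated pages and their intended canonical targets
    checks = {
        "https://example.com/generated-page-123/": "https://example.com/top-level-page/",
    }

    for url, expected in checks.items():
        resp = requests.get(url, timeout=10)
        parser = CanonicalParser()
        parser.feed(resp.text)
        status = "OK" if parser.canonical == expected else "MISMATCH"
        print(status, url, "->", parser.canonical)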

The client added the site to Search Console before we sorted all the canonicals, and although we've managed to get these low-value URLs out of the index, it's refusing to index any more of the site.

4,000 URLs are now submitted (down from 75,000), all either decent standalone content or correctly canonicalised, but only 330 pages are indexed.

All 4,000 URLs are also now 1-2 clicks deep, so crawlers can find them on a normal crawl rather than as orphan pages only reachable via the XML sitemap.
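(Similarly, a rough sketch for sanity-checking the submitted sitemap - counting the <loc> entries and confirming a sample of them return 200 and aren't noindexed. Again Python with requests; the sitemap URL is a placeholder, and a WP sitemap index would need one more level of fetching:)

    import requests
    import xml.etree.ElementTree as ET

    SITEMAP = "https://example.com/sitemap.xml"  # placeholder - use the submitted sitemap
    NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

    # Parse the sitemap and pull out every <loc>
    root = ET.fromstring(requests.get(SITEMAP, timeout=10).content)
    urls = [loc.text for loc in root.findall(".//sm:loc", NS)]
    print(len(urls), "URLs in sitemap")

    # Spot-check a sample: status code plus a crude noindex check
    # (header or anywhere in the HTML - rough, but enough to flag problems)
    for url in urls[:50]:
        resp = requests.get(url, timeout=10)
        noindex = ("noindex" in resp.headers.get("X-Robots-Tag", "").lower()
                   or "noindex" in resp.text.lower())
        print(resp.status_code, "noindex!" if noindex else "ok", url)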

It's been two weeks - any suggestions?

by @binayjha (4749), 1 month ago

It will take much more time.
