GSC indexing question
Just published a large WP site for a client. Against my advice, the client asked us to auto-generate a whole load of URLs with near-identical content bar a few words and paragraphs (effectively low-value duplicate content).
The site went live with 75,000 URLs, of which 74,000 are effectively duplicates, but they ARE canonically linked back to top-level pages of the site.
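If anyone wants to sanity-check the same thing at scale rather than clicking through pages by hand, here's a rough sketch of a spot-check script. The URL list and expected canonical targets below are placeholders, not the client's real URLs:

```python
# Spot-check that each generated URL emits the expected rel=canonical.
# Placeholder data: swap in your real URL -> expected-canonical mapping.
import requests
from html.parser import HTMLParser

class CanonicalParser(HTMLParser):
    """Pulls the href of the first <link rel="canonical"> out of a page."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            attrs = dict(attrs)
            if attrs.get("rel") == "canonical" and self.canonical is None:
                self.canonical = attrs.get("href")

def get_canonical(url):
    resp = requests.get(url, timeout=10)
    parser = CanonicalParser()
    parser.feed(resp.text)
    return parser.canonical

# Hypothetical examples -- replace with real URLs from the site.
expected = {
    "https://example.com/widgets-in-london/": "https://example.com/widgets/",
    "https://example.com/widgets-in-leeds/": "https://example.com/widgets/",
}

for url, want in expected.items():
    got = get_canonical(url)
    status = "OK" if got == want else f"MISMATCH (got {got})"
    print(f"{url} -> {status}")
```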
The client added the site to Search Console before we'd sorted all the canonicals, and although we've since managed to get these low-value URLs out of the index, it's refusing to index any more of the site.
4,000 URLs are now submitted (down from 75,000), all either decent content or correctly canonicalised, but only 330 pages are indexed.
All 4,000 URLs are also now 1-2 clicks deep, so the bots can find them on the crawl; they aren't orphan pages reachable only via the XML sitemap.
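While you wait, you can at least see what Google thinks of each URL without running 4,000 manual inspections in the UI. A rough sketch against the Search Console URL Inspection API; it assumes you already have an OAuth2 access token with the webmasters.readonly scope (token handling omitted), and the URLs are placeholders:

```python
# Batch-inspect URLs via the Search Console URL Inspection API.
# Assumes ACCESS_TOKEN is a valid OAuth2 token with the
# https://www.googleapis.com/auth/webmasters.readonly scope.
import requests

ENDPOINT = "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect"
ACCESS_TOKEN = "ya29.placeholder"   # obtain via your own OAuth flow
SITE_URL = "https://example.com/"   # the property exactly as registered in GSC

def inspect(url):
    resp = requests.post(
        ENDPOINT,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        json={"inspectionUrl": url, "siteUrl": SITE_URL},
        timeout=30,
    )
    resp.raise_for_status()
    result = resp.json()["inspectionResult"]["indexStatusResult"]
    return result.get("coverageState"), result.get("googleCanonical")

# Hypothetical URLs -- feed in the 4,000 from your sitemap instead.
for url in ["https://example.com/widgets/", "https://example.com/widgets-in-london/"]:
    coverage, google_canonical = inspect(url)
    print(f"{url}: {coverage} (Google canonical: {google_canonical})")
```

Note the API is rate-limited (last I checked, roughly 2,000 inspections per property per day), so a full sweep of 4,000 URLs takes a couple of runs.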
It's been two weeks - any suggestions?
It will take much more time. Google tends to throttle crawling and indexing once it has seen a site as mostly low-value duplicates, and two weeks is nothing at this scale; think months rather than weeks for it to rebuild trust and work through the remaining URLs.