Can 17.5K not-found (404) URLs be exported from Google Search Console?
I'm trying to export 17.5K 404 URLs from Google Search Console, but the Search Console interface limits exports to 1,000 URLs. I tried the Google Search Console API, but it does not offer 404 data.
Now, how can I export this large amount of data? Is there any way?
Thanks in advance.
Have a look at the link below; it may be helpful to you:
https://support.google.com/webmasters/thread/256668770?hl=en&msgid=256670480
Thanks binayjah for taking the time to help me, but the method you shared above is not going to work for me. The Google API does not offer not-found (404) data.
Are there any other workarounds?
The solution I mentioned does not fetch 404 data directly. It is a URL Inspection method that can lead you to the desired output. It is only useful if you are a developer or have access to one. Since you mentioned the API, I assumed you understood it, along with the coding involved.
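To make the URL Inspection suggestion concrete, here is a minimal Python sketch of calling the Search Console URL Inspection API for one URL and checking whether its fetch state is 404. This assumes you already have a verified property and an OAuth 2.0 access token with the Search Console scope; `site_url` and `access_token` are placeholders you must supply yourself:

```python
import json
import urllib.request

# Search Console URL Inspection API (v1) endpoint.
INSPECT_ENDPOINT = "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect"


def inspect_url(url, site_url, access_token):
    """Ask Search Console how it sees a single URL.

    `site_url` is your verified property (e.g. "https://example.com/") and
    `access_token` is an OAuth 2.0 token authorized for Search Console.
    """
    body = json.dumps({"inspectionUrl": url, "siteUrl": site_url}).encode("utf-8")
    req = urllib.request.Request(
        INSPECT_ENDPOINT,
        data=body,
        headers={
            "Authorization": f"Bearer {access_token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def is_not_found(inspection_response):
    """True when the inspected page's fetch state reports a 404 (NOT_FOUND)."""
    status = (inspection_response.get("inspectionResult", {})
              .get("indexStatusResult", {}))
    return status.get("pageFetchState") == "NOT_FOUND"
```

Note that the API is quota-limited (on the order of 2,000 inspections per property per day at the time of writing), so inspecting 17.5K URLs this way would take several days.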
There is no direct method to fetch that many URLs from GSC.
You cannot directly export all the URLs from GSC, as GSC has a limit of 1,000 URLs.
Here is what you can do:
First, send a validation request in GSC and wait for the validation. After that, crawl your website with a tool like Screaming Frog to find 404 pages, and fix them.
Then send the validation request again and repeat the process.
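If you go the Screaming Frog route, the crawler can export its results as CSV, and pulling out just the 404 rows is a short script. This is a minimal sketch; the "Address" and "Status Code" column names below match a typical Screaming Frog internal export, but check the header row of your own export and adjust if needed:

```python
import csv


def extract_404s(csv_path):
    """Return the URLs whose crawl status code was 404.

    Assumes a Screaming Frog-style CSV export with "Address" and
    "Status Code" columns; rename these to match your export if they differ.
    """
    with open(csv_path, newline="", encoding="utf-8") as fh:
        reader = csv.DictReader(fh)
        return [row["Address"] for row in reader
                if row.get("Status Code", "").strip() == "404"]
```

Running this over the crawl export gives you the full 404 list in one go, with no 1,000-URL cap, since the data comes from your own crawl rather than the GSC interface.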
@karansaxena92, thanks for your suggestion. I have one question in my mind: what should I crawl using Screaming Frog? Those 1,000 URLs that I sent for validation?