Handling a Huge Amount of Crawl Errors
-
Hi all,
I am facing a crawl-errors issue on a huge site (>1 million pages) for which I am doing an on-page audit.
- 404 errors: >80,000
- Soft 404 errors: 300
- 500 errors: 1,600
All of the above are reported in GWT (Google Webmaster Tools).
Many of the error links are simply not present on the pages they are reportedly "linked from". I investigated a sample of those pages (and their source) looking for any footprint of the error links, and found nothing.
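In case it helps, this is the kind of scripted check that could be extended to a much larger sample. A rough Python sketch follows (the CSV filename and column names are just placeholders for however the GWT export is structured, and it assumes the requests library):

```python
# Rough sketch: given a CSV export from GWT with "broken_url" and "linked_from"
# columns (placeholder names), fetch each referring page and check whether the
# broken URL actually appears anywhere in its HTML source.
import csv
import requests

def link_present(referrer: str, broken_url: str) -> bool:
    """Fetch the referring page and report whether broken_url appears in its HTML."""
    try:
        resp = requests.get(referrer, timeout=10)
    except requests.RequestException:
        return False  # referring page itself unreachable
    return broken_url in resp.text

with open("gwt_404_sample.csv", newline="") as f:  # placeholder filename
    for row in csv.DictReader(f):
        found = link_present(row["linked_from"], row["broken_url"])
        status = "present" if found else "NOT in source"
        print(f"{row['broken_url']} <- {row['linked_from']}: {status}")
```

This only confirms whether the reported links exist in the rendered HTML; it still won't show what in the back end generated them in the first place.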
What would be the right way to address this issue from an SEO perspective anyway? Clearly, I am not able to investigate the root causes myself, since I only see the generated HTML and not what's behind it.
So my question is: Generally, what is the appropriate way of handling this?
- Telling the client that he has to investigate it (I have done my best to at least report the errors)?
- Engaging my firm further and getting a developer from my side to investigate?
Thanks in advance!!
-
Usually an on-page audit lists all of the problems and the possible reasons they are happening, not in-depth information on how to fix every issue. That is usually the next phase: "Do you want me to work on the site, or do you want your dev team to track down the cause of the issues and fix them?"
It also depends on what type of contract you have with the client, of course.