Identifying 404s in Google Search Console

So we have recently been working with a few American agencies, doing technical analysis and reports for some of their clients, and I wanted to share a handy strategy we have been using.

One of the key questions they asked was why one client was getting over 14,000 404 errors in Google Search Console.

The first thing we did was chuck the website into Deepcrawl to identify any onsite broken links (I wasn't expecting to find many, and there weren't).

I asked the agency to send me just a week's worth of server logs. I thought I had worked on some big websites in the UK, but these arrived as six large files.

I really didn't want to boot up the Windows machine to use the Exact Log Analyser tool. My Mac is always open and I have the Screaming Frog Log File Analyser, so I thought I would use that (it's becoming my go-to tool for server log analysis on small files anyway, so why not test it on a larger one).

I put in the first of the files, which had just short of 300,000 URLs and, more importantly, 2,374 404 errors. Not the 14,000 being shown in Search Console, but then again I was only looking at one sixth of a week's worth of server log data.
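If you want to sanity-check those numbers without a log analyser, the same first pass is easy to script. Here's a minimal sketch, assuming the logs are in the standard Apache/Nginx combined format (the file name is just a placeholder):

```python
import re
from collections import Counter

# Matches the request path and status code in a combined-format access log line,
# e.g. '1.2.3.4 - - [10/Oct/2023:13:55:36 +0000] "GET /old-page HTTP/1.1" 404 153'
LINE_RE = re.compile(r'"(?:GET|POST|HEAD) (\S+) [^"]*" (\d{3}) ')

not_found = Counter()
with open("access.log.1") as log:  # hypothetical file name
    for line in log:
        match = LINE_RE.search(line)
        if match and match.group(2) == "404":
            not_found[match.group(1)] += 1

# The URLs that 404ed most often are the ones worth redirecting first
for url, hits in not_found.most_common(20):
    print(f"{hits:6d}  {url}")
```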

I started manually checking these, and found the client had already sorted some of them out with 301 redirects.

After I had manually checked about 50 or so, I got a little bored of seeing 301s already in place. So I simply exported the list from the Log File Analyser, pasted it into Screaming Frog's SEO Spider in list mode, and got the current response codes rather than what Google saw at crawl time. That got me down to 68 genuine 404s. I then just needed to find the right place to redirect these to, and repeat the process for the other five files.
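If you'd rather script that re-check than use the Spider's list mode, something like this does the same job. It's only a sketch, assuming the exported 404 URLs sit one path per line in a text file (the domain and file name are placeholders):

```python
import requests

BASE = "https://www.example.com"  # placeholder for the client's domain

still_404 = []
with open("gsc_404_urls.txt") as f:  # hypothetical export, one path per line
    for path in (line.strip() for line in f if line.strip()):
        try:
            # HEAD is cheap; allow_redirects=False so an existing 301 shows up as a 301
            status = requests.head(BASE + path, allow_redirects=False, timeout=10).status_code
        except requests.RequestException:
            status = None
        if status == 404:
            still_404.append(path)

print(f"{len(still_404)} URLs still return a 404")
```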

Then it was just a case of sending the new redirect file across the pond and getting Google to come and recrawl the website, then watching the 404s drop over time. As these aren't internal links, Google is unlikely to notice any difference immediately.
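As for the redirect file itself, if the old-to-new mappings live in a simple two-column CSV it's quicker to generate the rules than to hand-write them. A sketch that emits Apache `Redirect 301` directives (the CSV layout and file names are my assumptions; adapt for Nginx or whatever the client runs):

```python
import csv

# redirects.csv is assumed to hold two columns: old_path,new_path
with open("redirects.csv", newline="") as src, open("redirects.conf", "w") as out:
    for old_path, new_path in csv.reader(src):
        # One mod_alias directive per mapping, e.g. 'Redirect 301 /old-page /new-page'
        out.write(f"Redirect 301 {old_path} {new_path}\n")
```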

So just by using both of Screaming Frog's products I was able to save several hours' work and be more efficient.

 
