We have a client right now whose ranking took quite a dip lately. As we continue to help them fix errors documented in Google Search Console, one of the glaring issues is 404 Page Not Found errors. When companies migrate sites, they often put new URL structures into place, and pages that used to exist no longer do.
This is a HUGE problem for search engine optimization. Your authority with search engines is determined in large part by how many sites link to yours, and links that land on 404 pages squander that authority. You also lose all the referring traffic from links across the web that still point to those missing pages.
An additional problem is that, since these pages don’t exist, there’s often no Google Analytics code executed when they’re requested, so you have no statistics on how much traffic you’re losing to 404 errors. Some content management systems (like WordPress) serve a themed 404 page, but others just return the web server’s bare 404 page.
To test your site, just go to a page that does not exist, view the source, and look for your analytics script. If it isn’t there, you’re going to have to add it. When you add it, keep in mind that the page tracked by Google Analytics will ONLY be the 404 page itself, not the actual page that’s missing. We can fix that, too.
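If you prefer a scripted check over eyeballing the source, the standard asynchronous Analytics snippet defines a global `_gaq` queue. A small sketch (wrapped as a function so it’s easy to test; `global` stands in for `window`):

```javascript
// Run on a page that returns a 404: with the standard asynchronous
// Google Analytics snippet loaded, a global _gaq queue is defined.
function checkAnalytics(global) {
  return typeof global._gaq !== 'undefined' ? 'analytics present' : 'analytics missing';
}

// In the browser console you would call: checkAnalytics(window);
```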
You can replace the tracked page URL with a virtual URL that records both the page requested and the referring page. If you’re using Google Analytics (the asynchronous snippet), replace the standard pageview call:
_gaq.push(['_trackPageview']);
With this code:
_gaq.push(['_trackPageview', '/404.html?page=' + document.location.pathname + document.location.search + '&from=' + document.referrer]);
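To see what that produces, here’s the same URL-building logic pulled out as a helper, with hypothetical example values (the path, query string, and referrer are made up for illustration):

```javascript
// Builds the virtual pageview path sent to Google Analytics for a 404.
// Example (hypothetical values):
//   pathname -> "/old-products"
//   search   -> "?color=red"
//   referrer -> "http://example.com/links"
// yields:
//   /404.html?page=/old-products?color=red&from=http://example.com/links
function virtual404Path(pathname, search, referrer) {
  return '/404.html?page=' + pathname + search + '&from=' + referrer;
}
```

In the live snippet, `document.location.pathname`, `document.location.search`, and `document.referrer` supply those three values.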
If you’re using WordPress, you can add the script to your theme’s footer (or directly to your 404 template).
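One way to sketch that footer approach, assuming your theme prints `body_class()` (WordPress adds an `error404` class to the `<body>` element when the 404 template renders) and that you’re using the asynchronous `_gaq` snippet:

```javascript
// Fire the virtual 404 pageview only when WordPress has rendered the
// 404 template; body_class() adds "error404" to <body> there.
function is404Page(bodyClassName) {
  return bodyClassName.split(/\s+/).indexOf('error404') !== -1;
}

// In the footer, after the Analytics snippet:
// if (is404Page(document.body.className)) {
//   _gaq.push(['_trackPageview', '/404.html?page=' + document.location.pathname +
//     document.location.search + '&from=' + document.referrer]);
// }
```

Putting it in the 404 template instead avoids the class check entirely, since that template only runs on 404s.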
Now you’ll be able to go into your Google Analytics account and see the new page with all of the URLs that were not found – as well as the referring sites. If it’s a huge problem, I’d also recommend writing a server-side script that sends you a message with these errors. Be careful, though! Hackers and crackers are always crawling sites looking for files that may have security holes, and every probe for a nonexistent file is another 404.
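That caution matters because a single scanner can generate thousands of 404s in minutes. A minimal sketch of a guard for your notification script (names and the time window are hypothetical): remember which missing URLs you’ve already reported so each one triggers at most one message per window.

```javascript
// Throttle 404 alerts: report each missing URL at most once per window,
// so a crawler probing for vulnerable files doesn't flood your inbox.
function createAlertThrottle(windowMs) {
  var lastSent = {}; // url -> timestamp of the last alert we sent for it

  return function shouldAlert(url, now) {
    if (lastSent[url] !== undefined && now - lastSent[url] < windowMs) {
      return false; // already reported recently; stay quiet
    }
    lastSent[url] = now;
    return true;
  };
}

// Usage: var shouldAlert = createAlertThrottle(24 * 60 * 60 * 1000);
// if (shouldAlert(requestedUrl, Date.now())) { /* send the message */ }
```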