Has this ever happened to you? You open the ‘Crawl Errors’ report in Google Search Console and see loads of 400s, 404s, and 500s.
We at the InternetDevels web development agency want to share a few tips on how to fix crawl errors. Crawl errors don’t just impact how a website ranks in Google Search. If ignored, these pesky little flaws can also prevent your site from appearing in search engine results at all, making you invisible to your target market. Let’s find out why it is so important to track, detect, and fix crawl errors in Google Search Console.
After reading this article, you’ll understand the main principles of Google Search Console and will be able to fix crawl errors yourself. However, if you don’t have the time, our web experts can do that for you.
What is Google Search Console?
Search Console is a free tool that lets users see their websites as Google sees them. Or, as Google puts it, Search Console lets users "monitor and maintain their site’s presence in Google Search results". If you don’t have an account set up on this service, it’s time to create one now.
The first thing you’ll see when you log in to Google Search Console is the main dashboard. From here, you can easily monitor the general status of your website, including crawl errors, search analytics data, and sitemaps. You can also review more specific and detailed data in each of those categories:
- In ‘Search Appearance,’ you can view structured data, rich cards, Accelerated Mobile Pages, and more.
- In ‘Search Traffic,’ you’ll find data on clicks, impressions, external and internal links, and mobile usability.
- In ‘Google Index,’ you’ll see information on indexed, blocked, and removed URLs.
- In the ‘Crawl’ section, you can review data on crawl errors, URL errors, and crawl stats.
If Google has trouble finding your website, or a specific page within it, then the people searching for it won’t be able to find it either. Therefore, you should make sure that crawl and URL errors are kept to a minimum.
Crawl Errors in Google Search Console
Crawl errors occur when a search engine tries to reach a webpage but fails to do so. Google divides crawl errors into two groups:
- Site errors (which mean your entire site can’t be crawled)
- URL errors (which relate to one specific URL per error)
Let’s take a closer look at each group.
Site Errors in Google Search Console
- DNS Errors
A DNS (Domain Name System) error means that a search engine can’t communicate with your server either because the server is down or because there's an issue with the DNS routing to your domain. Usually this is a temporary issue.
How to fix it? First of all, Google recommends using the Fetch as Google tool to view how Googlebot crawls your page. If Google can’t fetch your page properly, check with your DNS provider to see where the issue is.
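If you want to double-check DNS resolution outside of Search Console, here is a minimal Python sketch using only the standard library. The domain name is a placeholder; swap in your own. It performs the same kind of lookup a crawler’s resolver would:

```python
import socket

def check_dns(hostname: str) -> None:
    """Try to resolve a hostname the way a crawler's resolver would."""
    try:
        # getaddrinfo performs a full DNS lookup (A/AAAA records).
        results = socket.getaddrinfo(hostname, 443)
        addresses = sorted({item[4][0] for item in results})
        print(f"{hostname} resolves to: {', '.join(addresses)}")
    except socket.gaierror as exc:
        # A failure here mirrors what Googlebot reports as a DNS error.
        print(f"DNS lookup failed for {hostname}: {exc}")

check_dns("example.com")  # placeholder -- replace with your own domain
```

If the lookup fails from several networks, the problem is almost certainly on the DNS provider’s side rather than a temporary glitch.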
- Server Errors
When you see this type of error for your URLs, it means that Googlebot couldn’t access your URL, the request timed out, or your site was busy. It can also mean that your website had so many visitors that the server simply couldn’t handle all the requests. Most of these errors are reported as 5xx status codes.
Server errors are probably the only type of crawl error you should really worry about. If Googlebot frequently encounters server errors on your website, it is a strong signal to Google that something is wrong with your site, and reduced rankings are likely to follow.
How to fix it? This is Google’s official guidance for fixing server errors:
“Use Fetch as Google to check if Googlebot can currently crawl your site. If Fetch as Google returns the content of your homepage without problems, you can assume that Google is generally able to access your site properly.”
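To see for yourself which status code a URL is returning, you can run a quick check like the Python sketch below. It assumes the widely used requests library is installed; the URL is a placeholder:

```python
import requests

def check_status(url: str) -> None:
    """Fetch a URL and flag 5xx responses Googlebot would log as server errors."""
    try:
        # The timeout matters: requests that hang also count as server errors.
        response = requests.get(url, timeout=10)
        if 500 <= response.status_code < 600:
            print(f"{url} returned {response.status_code} -- server-side problem")
        else:
            print(f"{url} returned {response.status_code}")
    except requests.exceptions.Timeout:
        print(f"{url} timed out -- Googlebot would report this as a server error")
    except requests.exceptions.RequestException as exc:
        print(f"Could not reach {url}: {exc}")

check_status("https://example.com/")  # placeholder -- use a URL from your report
```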
- Robots.txt Failure
Before Googlebot crawls your website, it tries to fetch your robots.txt file to see whether there are any pages you’d rather not have crawled. If the bot can’t reach the robots.txt file (i.e., the file doesn’t return a 200 or 404 HTTP status code), Google will postpone the crawl rather than risk crawling URLs that you don’t want crawled. That’s why you always want to make sure the robots.txt file is available.
How to fix it? Double-check which pages you’re instructing Googlebot not to crawl. If your file seems to be in order and you’re still getting errors, use a server header checker tool to see whether the file is returning a 200 or 404 status code.
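As a quick sanity check outside of Search Console, the following Python sketch verifies both things at once: that robots.txt returns a 200 or 404 status code, and that its rules actually allow Googlebot to fetch the pages you care about. It assumes the requests library; the domain and sample paths are placeholders:

```python
import requests
from urllib.robotparser import RobotFileParser

SITE = "https://example.com"  # placeholder -- use your own domain

# Step 1: confirm robots.txt returns 200 or 404, as Google expects.
response = requests.get(f"{SITE}/robots.txt", timeout=10)
print(f"robots.txt returned {response.status_code}")

# Step 2: if the file exists, verify it allows the pages you want crawled.
if response.status_code == 200:
    parser = RobotFileParser()
    parser.parse(response.text.splitlines())
    for path in ["/", "/blog/"]:  # sample paths -- adjust to your site
        allowed = parser.can_fetch("Googlebot", f"{SITE}{path}")
        print(f"Googlebot may fetch {path}: {allowed}")
```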
We have experience fixing site errors. For example, we had a case involving a robots.txt failure. The customer contacted us because they kept receiving robots.txt errors. After a detailed analysis of the website, our SEO expert found mistakes in the robots.txt configuration. We fixed the issue, and the website is now visible in Google Search.
URL Errors in Google Search Console
Common URL Errors
Common URL errors occur when a search engine tries to crawl a specific web page and fails. These can include an occasional DNS error or server error for that specific URL.
Mobile-Only URL Errors
This type of error refers to crawl failures that occur on smartphones. If your website is responsive, mobile-only URL errors are unlikely to appear.
The most common mobile-only URL errors are faulty redirects. Some websites use different URLs for desktop and smartphone users. A faulty redirect occurs when a desktop page incorrectly redirects smartphone users to a page that isn’t relevant to their query, such as the mobile homepage.
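One way to spot a faulty redirect yourself is to request a desktop URL with a smartphone user agent and compare where the two land. The Python sketch below does that; it assumes the requests library, uses an older Googlebot smartphone user-agent string that Google has published (check Google’s documentation for the current one), and the URL is a placeholder:

```python
import requests

# An older documented Googlebot smartphone User-Agent string; see Google's
# crawler documentation for the current version.
MOBILE_UA = ("Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
             "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 "
             "Mobile Safari/537.36 (compatible; Googlebot/2.1; "
             "+http://www.google.com/bot.html)")

def final_destination(url: str, user_agent: str) -> str:
    """Follow redirects and return where a given user agent actually lands."""
    response = requests.get(url, headers={"User-Agent": user_agent},
                            timeout=10, allow_redirects=True)
    return response.url

page = "https://example.com/some-article/"  # placeholder URL

# A faulty redirect typically sends every smartphone visitor to the homepage
# instead of the mobile equivalent of the page they requested.
print(f"Desktop lands on: {final_destination(page, 'Mozilla/5.0')}")
print(f"Mobile lands on:  {final_destination(page, MOBILE_UA)}")
```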
Keep Calm and Fix your Crawl Errors in Google Search Console
Crawl errors are something you cannot entirely avoid. Thankfully, they don’t necessarily have an immediate negative impact on your website’s performance. If you encounter any of them, try to correct the underlying cause as soon as possible. The first step we recommend is to mark all crawl errors as ‘fixed.’ This will help you deal with them in a more structured way: irrelevant errors will not appear again, while the ones that really need fixing will soon return to the report.
Here are a few simple ways to fix your website crawl errors:
- Replace non-working external links with working ones (see the link-checking sketch after this list).
- Update invalid URLs.
- Maintain security subscriptions to prevent malware attacks.
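As a starting point for the first two items, here is a minimal Python link checker using the requests library and the standard-library HTML parser. The page URL is a placeholder; it collects every link on that page and flags the ones that return an error:

```python
import requests
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkCollector(HTMLParser):
    """Collect href values from <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

page = "https://example.com/"  # placeholder -- the page you want to audit
html = requests.get(page, timeout=10).text

collector = LinkCollector()
collector.feed(html)

for href in collector.links:
    url = urljoin(page, href)  # resolve relative links against the page URL
    if not url.startswith("http"):
        continue  # skip mailto:, javascript:, and similar schemes
    try:
        # HEAD is cheaper than GET; some servers reject it, so treat
        # failures as worth a manual look rather than proof of breakage.
        status = requests.head(url, timeout=10, allow_redirects=True).status_code
    except requests.exceptions.RequestException:
        status = None
    if status is None or status >= 400:
        print(f"Broken link on {page}: {url} ({status})")
```

Run it against your key pages whenever the crawl errors report shows new 404s, and update or replace whatever it flags.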
Make it part of your maintenance schedule to check for crawl errors regularly. Monitoring the crawl errors report ensures that your domain stays free of broken links, which will give your SEO performance a boost.
If you are not sure what to do with crawl errors in Google Search Console and need further assistance or support, our website support and maintenance team is available 24/7.