Search Console Features You Would Want to Use

You have probably already noticed how important Google’s Search Console is. It keeps you informed about what is happening inside your site, from crawl errors to traffic data. Search Console (formerly known as Google Webmaster Tools) should be one of your most frequently used tools, so let’s get familiar with the features you can take advantage of.

  1. Crawl Errors: If you are worried about what kinds of errors are affecting your site, log in to your Search Console account and open the “Crawl Errors” report. Google will show your site’s crawl issues for the past 90 days. The errors you might encounter are either site errors (DNS, robots.txt, and server errors) or URL errors (page-level issues). Some of these can cause major problems for your business, but with this report you can quickly pinpoint the cause, and Google provides tips for fixing each type of error.
  2. Search Analytics: Most of the data you’ll need to develop an effective SEO strategy can be found here, along with critical search metrics for the site. How often a keyword shows your pages, how desktop compares with mobile search, filters that let you slice the data – it’s all just a click away. (If you prefer to pull this data programmatically, see the first sketch after this list.)
  3. HTML Improvements: If you are worried about page elements like titles or meta descriptions, a quick look at this report might shed some light. If Google finds no problems in this area, you’re clear. If it does, you can click through and see exactly what the issue is. Duplicate content is strongly discouraged, and this is a good place to find out which pages on your site Google views unfavorably. (The second sketch after this list shows a rough do-it-yourself version of the duplicate-title check.)
  4. Fetch as Google: Usually, when you encounter a crawl error – say a site error like a connection timeout – Google will ask you to use this feature to find out whether Googlebot can currently crawl the website. The same goes for a connection failure. Basically, Fetch as Google reports whether or not your site can be crawled. This is crucial, because if Google cannot crawl your site, you will see a major drop in traffic. When using Fetch as Google, you can choose which type of Googlebot performs the fetch. According to Google, a webmaster can only use 500 fetches per week. (The third sketch after this list shows a rough way to approximate this check yourself.)
  5. Robots.txt Tester: If you want every page on your site to be indexed, Google suggests you don’t use a robots.txt file at all, not even an empty one. If there are pages you do want blocked, that’s when a robots.txt file comes in. Before submitting the updated version, you can open Google’s robots.txt Tester to check whether a URL is properly blocked or not. (The last sketch after this list shows how to run a similar check locally.)
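
For item 2, here is a minimal sketch of pulling Search Analytics data through Google’s API instead of the web interface. It assumes you have the google-api-python-client and google-auth libraries installed, a service account key saved as credentials.json (a placeholder filename), and that the property URL below is one you have actually verified; the field names follow the Search Analytics API as I understand it, so double-check them against the current documentation.

```python
# Sketch: query Search Analytics data via the Search Console API.
# "credentials.json" and SITE_URL are placeholders for your own setup.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
SITE_URL = "https://www.example.com/"  # your verified property

creds = service_account.Credentials.from_service_account_file(
    "credentials.json", scopes=SCOPES)
service = build("searchconsole", "v1", credentials=creds)

# Clicks and impressions broken down by query and device for one month.
body = {
    "startDate": "2024-01-01",
    "endDate": "2024-01-31",
    "dimensions": ["query", "device"],
    "rowLimit": 25,
}
response = service.searchanalytics().query(siteUrl=SITE_URL, body=body).execute()

for row in response.get("rows", []):
    query, device = row["keys"]
    print(f"{query} ({device}): {row['clicks']} clicks, "
          f"{row['impressions']} impressions")
```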
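
For item 3, this is a rough, do-it-yourself version of the duplicate-title check the HTML Improvements report performs: fetch a few of your own pages and flag any that share the same title. The URLs are placeholders; swap in pages from your sitemap.

```python
# Sketch: flag pages that share the same <title> (or have none at all).
import re
from collections import defaultdict
from urllib.request import urlopen

pages = [
    "https://www.example.com/",       # placeholder URLs - use your own pages
    "https://www.example.com/about",
]

titles = defaultdict(list)
for url in pages:
    try:
        html = urlopen(url, timeout=10).read().decode("utf-8", errors="ignore")
    except OSError as err:
        print(f"Could not fetch {url}: {err}")
        continue
    match = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
    titles[match.group(1).strip() if match else "(missing title)"].append(url)

# Any title shared by two or more URLs is a duplicate worth rewriting.
for title, urls in titles.items():
    if len(urls) > 1:
        print(f"Duplicate title {title!r} on: {', '.join(urls)}")
```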
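
For item 4, Fetch as Google itself only lives inside Search Console, but you can get a quick, rough signal on crawlability by requesting a page with a Googlebot-style User-Agent and checking the response. This only approximates what Googlebot actually sees; the user-agent string and URL below are illustrative.

```python
# Sketch: request a page the way a crawler would and report the outcome.
from urllib.request import Request, urlopen
from urllib.error import URLError, HTTPError

URL = "https://www.example.com/"  # placeholder - use your own page
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

req = Request(URL, headers={"User-Agent": GOOGLEBOT_UA})
try:
    with urlopen(req, timeout=10) as resp:
        print(f"{URL} answered {resp.status} - looks crawlable")
except HTTPError as err:
    print(f"{URL} returned HTTP {err.code} - check the page or server config")
except URLError as err:
    print(f"Could not reach {URL}: {err.reason} - possible DNS or connection issue")
```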
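
And for item 5, before you paste a new robots.txt into Google’s tester, you can sanity-check the rules locally with Python’s built-in urllib.robotparser. The rules and URLs below are examples only.

```python
# Sketch: test whether example URLs are blocked by a draft robots.txt.
from urllib.robotparser import RobotFileParser

rules = """
User-agent: *
Disallow: /private/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

for url in ("https://www.example.com/private/report.html",
            "https://www.example.com/blog/post"):
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{url} -> {'allowed' if allowed else 'blocked'}")
```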

Google may make SEO complicated from time to time, but with these free tools providing data we probably would have had to pay for in the past, the whole experience gets a lot easier.