Checking For Errors In GSC

By The Black Fin Team

Category: Random


What is it?

Google Search Console (GSC) (formerly Webmaster Tools) is a resource for business owners or webmasters to monitor the technical performance of a website over time.

Why it’s important:

GSC can detect errors on your website that you may not be able to find on your own or with other tools. It’s also a tool built directly by Google, so its data is more likely to be accurate than that of third-party tools.

Signing In To Google Search Console

You can get to Google Search Console by going to google.com/webmasters and signing in. Select the web property you want to work on. If your website uses “www,” make sure to select the “www” version of the property.

Once you log in, you’ll be looking at the dashboard for your website. The dashboard has information you can find in many other places, so we’ll ignore it for now.

Messages

Messages will appear when Google has something to tell you. Google will identify any issues that your website may have. Any penalties will appear here.

Search Appearance

The first thing we’ll look at under “Search Appearance” is “Structured Data.” You’ll want to look at the bottom of the page to see whether any of your structured data items have errors. If they do, you can click on the item with errors to see the details and which webpages are affected. Once you know that, you can fill in the missing data.

If we take a step back to the overview page for “Structured Data,” we can see a graph of errors over time. If you see a big spike in this graph, you will likely be able to correlate it with errors on specific webpages. Errors that arise from issues with your review markup, local business markup, or social media markup should cause some concern.
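For reference, here is a rough sketch of what local business structured data can look like, built with Python; every value below (the business name, address, and phone number) is a placeholder, and the exact properties your pages need depend on the markup you actually use.

```python
import json

# Minimal LocalBusiness structured data (JSON-LD) with placeholder values.
# Missing required fields are the kind of thing the Structured Data report flags.
local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Law Firm",          # placeholder
    "url": "https://www.example.com/",   # placeholder
    "telephone": "+1-555-555-5555",      # placeholder
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Anytown",
        "addressRegion": "KY",
        "postalCode": "40200",
    },
}

# This JSON would be embedded in the page inside a
# <script type="application/ld+json"> tag.
print(json.dumps(local_business, indent=2))
```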

After “Structured Data,” you’ll want to move along to “Rich Cards.” Click the link and you’ll see exactly what rich cards are. You may or may not need rich cards for your site.

Next is the “Data Highlighter.” If you can add your own schema markup to your website, we would advise against using this tool: the Data Highlighter only applies to Google, while schema markup added directly to your pages also helps other search engines, like Bing.

“HTML Improvements” will allow you to see issues with your title tags and meta descriptions. There are better tools for dealing with your title tags and meta descriptions, such as Screaming Frog, so we would recommend using them instead.

“Accelerated Mobile Pages,” otherwise known as AMP Pages, are also under “Search Appearance.” If you use AMP Pages, any info you’ll need will be in this section.

Search Traffic

“Search Analytics,” the first tab under “Search Traffic,” shows information about your website such as queries, pages, clicks, impressions, click-through rate, and more. We will dive deeper into this later on. Many of the terms you find on this page will be ranking below position 30, which means they likely don’t have a click-through rate at all, and they are probably terms you’re not trying to rank for anyway. Don’t pay too much attention to the exact numbers; pay more attention to the graph to make sure there are no major drops. The same approach applies to the Average Position.

Using “Queries,” “Pages,” “Countries,” and “Devices,” you’ll be able to see the clicks, impressions, click-through rate, and position for each of these top terms. You can combine these filters as well, in the event that you want to see queries for your homepage or other combinations. When you combine filters, you should see what you’re filtering for underneath “Queries,” “Pages,” etc., where it would have said “No filter.” Through this technique, you could, for example, view all the queries for the homepage from the United States on mobile devices.
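If you ever want to pull the same filtered data programmatically, the Search Console (Webmasters v3) API exposes this report. The sketch below is only a rough outline, assuming you have created a Google API service account and added it as a user on your property; the key file name, site URL, and dates are placeholders.

```python
from google.oauth2 import service_account        # pip install google-auth
from googleapiclient.discovery import build      # pip install google-api-python-client

# Hypothetical service-account key file; the account must be added as a
# user on the Search Console property for this query to work.
creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("webmasters", "v3", credentials=creds)

# Queries for the homepage, from the United States, on mobile devices --
# the same filter combination described above.
response = service.searchanalytics().query(
    siteUrl="https://www.example.com/",   # placeholder property
    body={
        "startDate": "2024-01-01",        # placeholder dates
        "endDate": "2024-01-31",
        "dimensions": ["query"],
        "dimensionFilterGroups": [{
            "filters": [
                {"dimension": "page", "operator": "equals",
                 "expression": "https://www.example.com/"},
                {"dimension": "country", "operator": "equals",
                 "expression": "usa"},
                {"dimension": "device", "operator": "equals",
                 "expression": "MOBILE"},
            ],
        }],
        "rowLimit": 100,
    },
).execute()

for row in response.get("rows", []):
    print(row["keys"][0], row["clicks"], row["impressions"], row["ctr"], row["position"])
```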

Next, we’ll take a look at the “Links to Your Site” section. We’ll open up all three sections of this page: “Who links the most,” “Your most linked content,” and “How your data is linked.”

Under the “Who links the most” section you will find the domains that link to your website most often. It will not include all of your backlinks; use tools such as Majestic or Ahrefs to find those.

You can see how many links you have from particular websites and how many of your pages they link to. If you download the table, you can see the actual URLs, which will allow you to determine whether some of the links are coming from bad websites. A couple of bad links are not something to worry about, though, or else you’d be worrying nonstop.

You want to make sure that your most important pages have the most links. The main practice area pages of your business or website should be the next highest in terms of links.

Anchor text is the text used when linking to your website. When reviewing it, make sure there is no fishy anchor text, like spam for products such as Viagra and Cialis. If you see these kinds of terms in your anchor text, you may want to be concerned about a negative SEO attack. Monitor the top 20 anchor text entries. You want naked anchor text, meaning the web address itself, among your top results.
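A short script can help scan a downloaded anchor text export for fishy terms. This is only a sketch; the file name and column header below are assumptions, so adjust them to match the table you actually download from GSC.

```python
import csv

# Hypothetical export file and column name -- adjust both to match your download.
EXPORT_FILE = "anchor_text.csv"
SUSPICIOUS_TERMS = {"viagra", "cialis"}  # add any other spam terms you watch for

with open(EXPORT_FILE, newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        anchor = row.get("Anchor text", "").lower()
        if any(term in anchor for term in SUSPICIOUS_TERMS):
            print("Suspicious anchor text:", anchor)
```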

Under Search Traffic you can also see an overview of the total links to your site.

Internal links are the links on your website that point to other pages on your website. This section gives you an overview of your internal link structure. You can download this table as well, which will show you more information.

See “Manual Actions” below.

International Targeting is used to target different countries and languages. For example, in regions with large Spanish-speaking populations, you may want to have a Spanish version of your website.

Mobile usability is as important as anything else for your website nowadays. This section will help you make sure the mobile version of your site is up to par.

Google Index

Index Status shows how many pages of a website are indexed by Google. Google Search Console is notoriously inaccurate with these results, so the best way to determine how many of your pages are indexed is to use a site search: go to Google, type in “site:mydomain.com,” and press enter. That will tell you how many pages of your site are indexed by Google.

If you see a dip in the number of pages that are indexed, you may want to be concerned. You can look for the cause on the Index Status/Indexed Pages page.

Blocked Resources will show you resources that Google should be able to crawl but cannot. If there are blocked resources, GSC will show you what they are and which pages link to the blocked resource.

We would not advise using Remove URLs, which only temporarily removes a URL from search results. If you actually want to remove a URL from Google permanently, you’ll want to set the page to “noindex.”
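If you go the noindex route, you can verify that a page is actually serving the directive. The sketch below is a minimal spot check using the third-party requests library, with a placeholder URL; it simply looks for “noindex” in an X-Robots-Tag header or near a robots meta tag in the HTML.

```python
import requests  # third-party: pip install requests

URL = "https://www.example.com/old-page/"  # placeholder URL

response = requests.get(URL, timeout=10)
header = response.headers.get("X-Robots-Tag", "").lower()
body = response.text.lower()

# Crude string check: good enough for a quick spot check, not a full HTML parse.
has_noindex = "noindex" in header or ('name="robots"' in body and "noindex" in body)
print(f"{URL} appears to serve a noindex directive: {has_noindex}")
```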

Crawl

Crawl Errors are pages that return an error, most commonly a 404. In the Crawl Errors report we can see when these errors occurred, and also whether they occurred on the mobile or the desktop version of your site. These won’t generally affect your rankings, but they are errors that people try to avoid and fix. For 404 errors on pages that have never existed, you can simply mark them as fixed. If an error keeps happening, redirect the people hitting it to either your homepage or the web page they are looking for.
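Before marking anything as fixed, you can re-check which of the reported URLs still return a 404. A rough sketch, assuming a plain text file of URLs (one per line) copied from the Crawl Errors export and the requests library:

```python
import requests  # third-party: pip install requests

# Hypothetical file containing URLs from the Crawl Errors export, one per line.
with open("crawl_errors.txt", encoding="utf-8") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    try:
        status = requests.head(url, allow_redirects=True, timeout=10).status_code
    except requests.RequestException as exc:
        print(f"{url}: request failed ({exc})")
        continue
    if status == 404:
        print(f"{url}: still returns 404 -- consider redirecting it")
    else:
        print(f"{url}: now returns {status}")
```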

Crawl Stats will show you information about how Google is crawling your website, such as the number of pages crawled per day, the kilobytes downloaded per day, and the time spent downloading a page. If you’re a new website and you release 500,000 web pages, it will take a very long time before Google can crawl through all of those pages, because your site does not yet carry much significance with Google. If you see spikes in the number of kilobytes it’s taking to crawl, it means your web pages contain more data, perhaps because of added images or other media. Keep the time spent downloading a page as low as possible, ideally under 1,500 milliseconds.
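You can also spot check a few representative pages yourself against that threshold. A minimal sketch with placeholder URLs, using the same 1,500 millisecond target mentioned above:

```python
import requests  # third-party: pip install requests

PAGES = [
    "https://www.example.com/",          # placeholder URLs
    "https://www.example.com/contact/",
]
THRESHOLD_MS = 1500  # target mentioned above

for url in PAGES:
    response = requests.get(url, timeout=30)
    # `elapsed` measures time from sending the request until the response
    # arrived -- a rough proxy for the download time Crawl Stats reports.
    elapsed_ms = response.elapsed.total_seconds() * 1000
    flag = "SLOW" if elapsed_ms > THRESHOLD_MS else "ok"
    print(f"{url}: {elapsed_ms:.0f} ms ({flag})")
```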

Fetch as Google allows us to see how Google views our website. If you saw issues with your website and were about to make changes, this would be the tool to use to get those changes seen as quickly as possible by requesting an index. Fetch and Render will show you both the code of your website and what a user would see. If there are discrepancies between the two, you’d know there is an issue to be fixed.

You can use the robots.txt Tester to test your robots.txt file and see which web pages on your site are accessible to Google’s crawler.
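You can run a similar check locally with Python’s built-in urllib.robotparser, which reads a robots.txt file and reports whether a given crawler is allowed to fetch a URL. The domain and paths below are placeholders.

```python
from urllib import robotparser  # standard library

# Placeholder domain -- point this at your own robots.txt.
parser = robotparser.RobotFileParser("https://www.example.com/robots.txt")
parser.read()

for path in ["/", "/blog/", "/wp-admin/"]:  # placeholder paths to test
    url = "https://www.example.com" + path
    print(f"Googlebot allowed to crawl {path}:", parser.can_fetch("Googlebot", url))
```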

Once you have your sitemap loaded, Google will keep track of how many pages were submitted in the sitemap and how many of them are indexed. If there is a large discrepancy between pages submitted and pages indexed, you may want to be concerned.
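You can count the URLs listed in your sitemap yourself and compare that number to the indexed figure GSC reports. A small sketch using the standard library, with a placeholder sitemap URL (it assumes a plain URL sitemap rather than a sitemap index):

```python
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL) as response:
    tree = ET.parse(response)

urls = [loc.text for loc in tree.findall(".//sm:url/sm:loc", NS)]
print(f"{len(urls)} URLs listed in the sitemap")
```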

If you do not know what you are doing with URL Parameters, do not mess around with this section; have a professional handle any issues that may arise. A URL parameter is signified by a “?” in the URL. Parameters are most common in e-commerce, but are used any time you want to let visitors refine or filter what a page shows.
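For reference, here is what URL parameters look like when pulled apart programmatically; the example URL is made up.

```python
from urllib.parse import urlparse, parse_qs

# Made-up e-commerce style URL with parameters after the "?".
url = "https://www.example.com/shoes?color=black&size=10&sort=price"

params = parse_qs(urlparse(url).query)
print(params)  # {'color': ['black'], 'size': ['10'], 'sort': ['price']}
```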

If there are any security issues, you can view them under Security Issues, but it is highly unlikely you will ever have to use this section.

Getting Geeky – Advanced Tip:

Manual Actions
If there are manual actions that you need to take care of, you’ll want to hire somebody who has experience resolving them. Manual actions occur when somebody from Google personally reviews your website and determines that something is wrong. This is not a situation you want to mess up.

Common Mistakes

If you’re trying to remove a page of your website from the Google index, blocking it with robots.txt (the pages reported under “Blocked by robots” on the Index Status page of GSC) will not do that. Blocking a page with robots.txt only stops Google from crawling it, not from keeping it in the index.

Additional Notes

If you are looking to learn more about the tools Google offers for the web, you can use the Webmaster Academy in Google Search Console.

Additional Resources:

https://support.google.com/webmasters/answer/6001102?hl=en


