Chapter 1: Getting Started With the Google Search Console
In this chapter I’ll show you how to use the Search Console. First, you’ll learn how to add your site to the GSC. Then, I’ll help you make sure your site settings are good to go.
Grab your sitemap URL. Then, hit the “Sitemaps” button.
Paste in your URL and click “Submit”.
And that’s it:
Told you it was easy 🙂
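One optional sanity check before you submit: make sure your sitemap URL actually serves well-formed XML. Here’s a minimal sketch using only Python’s standard library (the sample sitemap below is made up for illustration):

```python
# Validate that a sitemap parses as XML and count its <loc> entries
# before submitting it to Search Console.
import xml.etree.ElementTree as ET

# Standard sitemap protocol namespace
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def count_sitemap_urls(xml_text: str) -> int:
    """Return the number of <url><loc> entries in a sitemap document."""
    root = ET.fromstring(xml_text)  # raises ParseError if malformed
    return len(root.findall(f"{SITEMAP_NS}url/{SITEMAP_NS}loc"))

# Made-up example sitemap:
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/blog/</loc></url>
</urlset>"""

print(count_sitemap_urls(sample))  # 2
```

If this throws a `ParseError`, fix the sitemap before submitting — Google will reject malformed XML too.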
Chapter 2: How to Optimize Your Technical SEO With the GSC
In this chapter I’ll share the tactics I use to SLAM DUNK my technical SEO.
As you know, when you fix these technical SEO problems, you’ll usually find yourself with higher rankings and more traffic.
And the Google Search Console has a TON of features to help you easily spot and fix technical SEO issues.
Here’s how to use them:
Use The “Index Coverage” Report To Find (And Fix) Problems With Indexing
If everything on your website is set up right, Google will:
a) Find your page
and
b) Quickly add it to their index
But sometimes, things go wrong.
Things you NEED to fix if you want Google to index all of your pages.
And that’s where the Index Coverage report comes in.
Let’s dive in.
What is the Index Coverage Report?
The Index Coverage report lets you know which pages from your site are in Google’s index. It also lets you know about technical issues that prevent pages from getting indexed.
It’s part of the new GSC and replaces the “Index Status” report in the old Search Console.
Note: The Coverage report is pretty complicated.
And I could just hand you a list of features and wish you luck.
(In fact, that’s what most other “ultimate guides” do).
Instead, I’m going to walk you through an analysis of a REAL site (this one), step-by-step.
That way you can watch me use the Index Coverage Report to uncover problems… and fix them.
How to Find Errors With The Index Coverage Report
At the top of the Index Coverage report we’ve got 4 tabs:
Error
Valid with warnings
Valid
Excluded
Let’s focus on the “Error” tab for now.
As you can see, this site has 54 errors. The chart shows how that number has changed over time.
If you scroll down, you get deets on each of these errors:
There’s a lot to take in here.
So to help you make sense of each “reason”, here are some quick definitions:
“Submitted URL seems to be a Soft 404”
This means the page looks like a “not found” page, but returns an incorrect status code (like a 200) instead of a 404.
(I’ve found this one to be a little buggy)
“Redirect error”
There’s a redirect for this page (301/302).
But it ain’t working.
“Submitted URL not found (404)”
The page wasn’t found and the server returned the correct HTTP status code (404).
All good. (Well, if you ignore the fact that the page is broken…)
“Submitted URL has crawl issue”
This could be 100 different things.
You’ll have to visit the page to see what’s up.
“Server errors (5xx)”
Googlebot couldn’t access the server. It might have crashed, timed out, or been down when Googlebot stopped by.
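Before clicking through each error one by one, you can triage a batch of URLs by their HTTP status codes. Here’s a rough standard-library Python sketch — the user-agent string and the category labels are my own, not Google’s:

```python
# Triage a list of Coverage-report URLs by HTTP status code.
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

def fetch_status(url: str, timeout: int = 10) -> int:
    """Return the HTTP status code for a URL (0 if unreachable)."""
    req = Request(url, headers={"User-Agent": "coverage-triage-sketch"})
    try:
        return urlopen(req, timeout=timeout).status
    except HTTPError as err:
        return err.code   # 404, 410, 5xx, etc.
    except URLError:
        return 0          # DNS failure, connection refused, timeout

def classify(status: int) -> str:
    """Map a status code to a rough Coverage-report bucket."""
    if status == 0:
        return "unreachable: possible server or DNS problem"
    if status >= 500:
        return "server error (5xx): check server logs"
    if status in (404, 410):
        return "not found: correct status, redirect or leave to deindex"
    if 300 <= status < 400:
        return "redirect: check that it resolves to a working page"
    if status == 200:
        return "loads fine: check for soft 404 or rendering issues"
    return "investigate manually"
```

Note that `urlopen` follows redirects by default, so the 3xx branch only fires with a custom opener — this is a starting point for a crawl script, not a finished tool.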
And when you click on an error status, you get a list of pages with that particular problem.
404 errors should be easy to fix. So let’s start with those.
Click a URL on the list. This opens up a side panel with 4 options:
But first, let’s visit the URL in a browser. That way, we can double check that the page is really down.
Yup. It’s down.
Next, pop your URL into the URL inspection field at the top of the page.
And Googlebot will rush over to your page.
Sure enough, this page is still giving me a 404 “Not found” status.
How do we fix it?
Well, we have two options:
Leave it as is. Google will eventually deindex the page. This makes sense if the page is down for a reason (like if you don’t sell that product anymore).
Redirect the 404 page to a similar product page, category page, or blog post.
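The redirect option can be a one-liner at the server level. For example, if your site runs on Apache, a 301 in your `.htaccess` looks like this (both paths here are made-up placeholders):

```apache
# Permanently redirect the dead product page to a similar category page
Redirect 301 /old-product-page/ /product-category/
```

Nginx and most CMSs (or an SEO plugin) have an equivalent setting.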
How to Fix “Soft 404” Errors
Now it’s time to fix these pesky “Soft 404” errors.
Again, check out the URLs with that error.
Then, visit each URL in your browser.
Looks like the first page on the list is loading fine.
Let’s see if Google can access the page OK. Again, we’ll use the URL Inspection tool.
This time we’ll hit the “Test Live URL” button. This sends Googlebot to the page. It also renders the page so you can see your page like Googlebot sees it.
Looks like Google found the page this time.
Now let’s see how Google rendered the page. Click “View Tested Page”, then the “Screenshot” tab:
Looks pretty much the same as how visitors see it. That’s good.
Next, click the “More Info” tab, and check for any page resources that Google wasn’t able to load correctly.
Sometimes there’s a good reason to block certain resources from Googlebot. But sometimes these blocked resources can lead to soft 404 errors.
In this case though, these 5 things are all meant to be blocked.
Once you’ve made sure any indexing errors are resolved, click the “Request Indexing” button:
This tells Google to index the page.
The next time Googlebot stops by, the page should get indexed.
How to Fix Other Errors
You can use the same exact process I just used for “Soft 404s” to fix any error you run into:
Load up the page in your browser
Plug the URL into “URL Inspection”
Read over the specific issues that the GSC tells you about
Fix any issues that crop up
Here are a few examples:
Redirect errors
Crawl errors
Server errors
Bottom line? With a bit of work, you can fix pretty much any error that you run into in the Coverage report.
How to Fix “Warnings” In The Index Coverage Report
I don’t know about you…
…but I don’t like to leave anything to chance when it comes to SEO.
Which means I don’t mess around when I see a bright orange “Warning”.
So let’s hit the “Valid with warnings” tab in the Index Coverage Report.
This time there’s just one warning: “Indexed, though blocked by robots.txt”.
So what’s going on here?
Let’s find out.
The GSC is telling us the page is getting blocked by robots.txt. So instead of hitting “Fetch As Google”, click on “Test Robots.txt Blocking”:
This takes us to the robots.txt tester in the old Search Console.
As it turns out, this URL IS getting blocked by robots.txt.
So what’s the fix?
Well, if you want the page indexed, you should unblock it in robots.txt (duh).
But if you don’t want it indexed, you have two options:
Add the “noindex,follow” tag to the page, and unblock it in robots.txt (Google has to be able to crawl the page to see the tag)
Get rid of the page using the URL Removal Tool
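For the first option, the tag goes in the page’s `<head>`. A minimal example:

```html
<!-- Keep this page out of Google's index, but still follow its links -->
<meta name="robots" content="noindex,follow">
```

Again: if the URL stays blocked in robots.txt, Google never sees this tag, so the unblocking step isn’t optional.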
Let’s see how to use the URL Removal Tool:
How To Use The URL Removal Tool In Search Console
The URL Removal Tool is a quick and easy way to remove pages from Google’s index.
Unfortunately, this tool hasn’t moved over to the new Google Search Console yet. So you’ll need to use the old GSC to use it.
Expand the “Legacy tools and reports” tab in the new GSC sidebar, then click “Removals”, where you’ll be taken to the old GSC.
Finally, paste in the URL you want to remove:
Double (and triple) check that you entered the right URL, then click “Submit Request”.
Note: A removal is only active for 90 days. After that, Googlebot will attempt to recache the page.
But considering the page is blocked through robots.txt…
…this page will be gone for good!
Check Indexed Pages For Possible Issues
Now let’s move on to the “Valid” tab.
This tells us how many pages are indexed in Google.
What should you look for here? Two things:
1. An unexpected drop (or increase) in indexed pages
Notice a sudden drop in the number of indexed pages?
That could be a sign that something’s wrong:
Maybe a bunch of pages are blocking Googlebot.
Or maybe you added a noindex tag by mistake.
Either way:
Unless you purposely deindexed a bunch of pages, you definitely want to check this out.
On the flip side:
What if you notice a sudden increase in indexed pages?
Again, that might be a sign that something is wrong.
(For example, maybe you unblocked a bunch of pages that are supposed to be blocked).
2. An unexpectedly high number of indexed pages
There are currently 41 posts at Backlinko.
So when I take a look at the “Valid” report in Index Coverage, I’d expect to see about that many pages indexed.
But if it’s WAY higher than 41? That’s a problem. And I’m going to have to fix it.
Oh, in case you’re wondering… here’s what I do see:
So no need to worry about me 🙂
Make Sure Excluded Stuff Should Be Excluded
Now:
There are plenty of good reasons to block search engines from indexing a page (for example, low quality pages you don’t want showing up in search).
Note: When I say “low quality”, I don’t mean the page is garbage. It could be that the page is useful for users… but not for search engines.
That said:
You definitely want to make sure Google doesn’t exclude pages that you WANT indexed.
In this case, we have a lot of excluded pages…
And if you scroll down, you get a list of reasons that each page is excluded from Google’s index.
So let’s break this down…
“Page with redirect”
The page is redirecting to another URL.
This is totally fine. Unless backlinks (or internal links) still point to that URL, Google will eventually stop trying to index it.
“Alternate page with proper canonical tag”
Google found an alternative version of this page somewhere else.
That’s exactly what a canonical URL is supposed to do. So that’s A-OK.
“Crawl Anomaly”
Yikes! This could be a number of things. So we’ll need to investigate.
In this case, it looks like the pages listed are returning a 404.
“Crawled – currently not indexed”
Hmmm…
These are pages that Google has crawled, but (for some reason) hasn’t indexed.
Google doesn’t give you the exact reason they won’t index the page.
But from my experience, this error means: the page isn’t good enough to warrant a spot in the search results.
So, what should you do to fix this?
My advice: work on improving the quality of any pages listed.
For example, if it’s a category page, add some content that describes that category. If the page has lots of duplicate content, make it unique. If the page doesn’t have much content on it, beef it up.
Basically, make the page worthy of Google’s index.
“Submitted URL not selected as Canonical”
This is Google telling you:
“This page has the same content as a bunch of other pages. But we think another URL is better”
So they’ve excluded this page from the index.
My advice: if you have duplicate content on a number of pages, add the noindex meta robots tag to all duplicate pages except the one you want indexed.
“Blocked by robots.txt”
These are pages that robots.txt is blocking Google from crawling.
It’s worth double checking these pages to make sure that what you’re blocking is meant to be blocked.
If it’s all good? Then robots.txt is doing its job and there’s nothing to worry about.
“Duplicate page without canonical tag”
The page is part of a set of duplicate pages, and doesn’t include a canonical URL.
In this case it’s pretty easy to see what’s up.
We’ve got a number of PDF documents. And these PDFs contain content from other pages on the site.
Honestly, this isn’t a big deal. But to be on the safe side, you should ask your web developer to block these PDFs using robots.txt. That way, Google ONLY indexes the original content.
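If you (or your developer) do block those PDFs, a couple of lines in robots.txt will do it. The `$` anchors the match to the end of the URL (Googlebot supports this pattern, though not every crawler does):

```text
User-agent: Googlebot
Disallow: /*.pdf$
```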
“Discovered – currently not indexed”
Google has discovered these pages, but hasn’t crawled (or indexed) them yet.
“Excluded by ‘noindex’ tag”
All good. The noindex tag is doing its job.
So that’s the Index Coverage report. I’m sure you’ll agree: it’s a VERY impressive tool.
Chapter 3: Get More Organic Traffic with the Performance Report
In this chapter we’re going to deep dive into my favorite part of the GSC: the Performance Report.
Why is it my favorite?
Because I’ve used this report to increase organic traffic to Backlinko again and again.
I’ve also seen lots of other people use the Performance Report to get similar results.
So without further ado, let’s get started…
What Is The Performance Report?
The “Performance” report in Google Search Console shows you your site’s overall search performance in Google. This report not only shows you how many clicks you get, but also lets you know your CTR and average ranking position.
And this new Performance Report replaces the “Search Analytics” report in the old Search Console (and the old Google Webmaster Tools).
Yes, a lot of the data is the same as the old “Search Analytics” report. But you can now do cool stuff with the data you get (like filter to only show AMP results).
But my favorite addition to the new version is this:
In the old Search Analytics report you could only see search data from the last 90 days.
(Which sucked)
Now?
We get 16 MONTHS of data:
For an SEO junkie like me, 16 months of data is like opening presents on Christmas morning.
(In fact, I used to pay for a tool to automatically pull and save my old Google Webmaster Tools data. Now, thanks to the beta version of the new GSC, it’s a free service)
How To Supercharge Your CTR With The Performance Report
Note: Like I did in the last chapter, I’m going to walk you through a real-life case study.
Last time, we looked at an ecommerce site. Now we’re going to see how to use the GSC to get more traffic to a blog (this one).
Specifically, you’re going to see how I used the Performance Report to increase this site’s CTR by 63.2%.
So let’s fire up the Performance report in the new Search Console and get started…
1. Find Pages With a Low CTR
First, highlight the “Average CTR” and “Average Position” tabs:
You want to focus on pages that are ranking #5 or lower… and have a bad CTR.
So let’s filter out positions 1-4.
To do that, click on the filter button, and check the “Position” box.
You’ll now see a filter box above the data. So we can go ahead and set this to “Greater than” 4.9:
Now you have a list of pages that are ranking #5 or below.
According to Advanced Web Ranking, position #5 in Google should get a CTR of around 4.35%:
You want to filter out everything that’s beating that expected CTR of 4.35%. That way you can focus on pages that are underperforming.
So click the filter button again and check the “CTR” box.
(Make sure you leave the “Position” box ticked)
Then, set the CTR filter to “Smaller than” 4.35.
So what have we got?
A list of keywords that are ranking #5 or lower AND have a CTR less than 4.35%.
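If you prefer working outside the GSC interface, you can export the Performance report and apply the same two filters in a few lines of Python. The rows below are made-up examples; the 4.35% benchmark is the Advanced Web Ranking figure from above:

```python
# Filter exported Performance-report rows for underperforming keywords:
# position worse than 4.9 AND CTR below the expected CTR for position 5.
EXPECTED_CTR_POS_5 = 4.35  # percent, per Advanced Web Ranking

rows = [
    # (query, clicks, impressions, ctr_percent, avg_position) - made up
    ("best helmet brands", 43, 1504, 2.9, 5.3),
    ("bike helmet reviews", 120, 1800, 6.7, 4.2),
    ("road bike sizing", 35, 900, 3.9, 8.1),
]

underperformers = [
    r for r in rows
    if r[4] > 4.9 and r[3] < EXPECTED_CTR_POS_5
]

for query, *_ in underperformers:
    print(query)
```

This keeps “best helmet brands” and “road bike sizing”, and drops the query that’s already beating the benchmark — the same result the two GSC filters give you.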
In other words:
Keywords you could get more traffic from.
We just need to bump up their CTR.
So:
Let’s see if we can find a keyword with a lower-than-expected CTR.
When I scroll down the list… this keyword sticks out like a sore thumb.
1,504 impressions and only 43 clicks… ouch! I know that I can do better than 2.9%.
Now that we’ve found a keyword with a bad CTR, it’s time to turn things around.
2. Find the page
Next, you want to see which page from your site ranks for the keyword you just found.
To do that, just click on the query with the bad CTR. Then, click “Pages”:
Easy.
3. Take a look at ALL the keywords this page ranks for
There’s no point improving our CTR for one keyword… only to mess it up for 10 other keywords.
So here’s something really cool:
The Performance report can show you ALL the keywords that your page ranks for.
And it’s SUPER easy to do.
Just click on “+ New” in the top bar and hit “Page…”.
Then enter the URL you want to view queries for.
Bingo! You get a list of keywords that page ranks for:
You can see that the page has shown up over 42,000 times in Google… but only got around 1,500 clicks.
So this page’s CTR is pretty bad across the board.
(Not just for this particular keyword)
4. Optimize your title and description to get more clicks
I have a few go-to tactics that I use to bump up my CTR.
But my all-time favorite is: Power Words.
What are power words?
Power words show that someone can get quick and easy results from your content.
And they’ve been proven again and again to attract clicks in the SERPs.
Here are a few of my favorite Power Words that you can include in your title and description:
Today
Right now
Fast
Works quickly
Step-by-step
Easy
Best
Quick
Definitive
Simple
So I added a few of these Power Words to the page’s title and description tag:
5. Monitor the results
Finally, wait at least 10 days. Then log back in.
Why 10 days?
It can take a few days for Google to reindex your page.
Then, the new page has to be live for about a week for you to get meaningful data.
With that, I have great news:
With the new Search Console, comparing CTR over two date ranges is a piece of cake.
Just click on the date filter:
Select the date range. I’m going to compare the 2-week period before the title change to the 2 weeks after:
Finally, filter the data to show search queries that include the keyword you found in step #1 (in this case: “best helmet brands”).
Boom!
We’ve increased our CTR by 63.2%. And just as important: we’re now beating the average CTR for position #5.
Pro tip: You’ll find that different title formats work better in different niches. So you might have to experiment to find the perfect format for YOUR industry. The good news: Search Console gives you the data you need to do just that.
How To Find “Opportunity Keywords” With GSC’s Performance Report
If the last example didn’t convince you of just how awesome the new Performance Report is, then I guarantee this one will.
What Is An Opportunity Keyword?
An opportunity keyword is a phrase that ranks between positions 8-20 AND gets a decent number of impressions.
Why is this such a big opportunity?
1. Google already considers your page to be a decent fit for the keyword (otherwise you wouldn’t be anywhere close to page 1). When you give your page some TLC, you can usually bump it up to the first page.
2. You’re not relying on iffy keyword volume data from third party SEO tools. The impression data you get from the GSC tells you EXACTLY how much traffic to expect.
Mining For Gold With Google Search Console’s Performance Report
Finding these gold nugget keywords in the Performance report is a simple, 3-step process.
1. Set the date range to the last 28 days:
2. Filter the report to show keywords ranking “Greater than” 7.9
3. Finally, sort by “Impressions”. And you get a huge list of “Opportunity Keywords”:
The best part? Many of these keywords give you a shot at ranking in the Featured Snippet.
After all: why rank #1 when you can rank #0?
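The same 3-step filter works on an exported report too. Here’s a quick sketch with made-up rows, keeping keywords at positions 8-20 and sorting by impressions:

```python
# Find "opportunity keywords": position 8-20, sorted by impressions.
rows = [
    # (query, avg_position, impressions) - made-up example data
    ("seo tools list", 9.4, 12000),
    ("what is a backlink", 3.1, 30000),
    ("anchor text guide", 14.2, 8500),
    ("link building 2019", 45.0, 400),
]

opportunities = sorted(
    (r for r in rows if 7.9 < r[1] <= 20),  # positions 8-20 only
    key=lambda r: r[2],                      # sort by impressions...
    reverse=True,                            # ...highest first
)

for query, pos, imps in opportunities:
    print(query, imps)
```

The already-ranking and barely-visible keywords fall out of the list, leaving just the ones worth a content refresh.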
Find High-Impression Keywords
I already showed you how to optimize keywords that rank 8-20.
But…
I also like to look for keywords that aren’t ranking well, yet still get some impressions. Here’s an example:
That keyword is sitting at position 50-ish… yet the page was still seen nearly 200 times.
Which tells me: if that many people are seeing my result on the 5th page, wait until I hit the first page.
It’s gonna be nuts!
Chapter 4: Cool GSC Features
In this chapter I’m going to show you some of the coolest features in the Google Search Console.
First, I’ll teach you how you can use the Search Console to fix your schema.
Then, I’ll show you one of the quickest (and EASIEST) wins in SEO.
Power Up Important Pages With Internal Links
Make no mistake:
Internal links are SUPER powerful.
Unfortunately, most people use internal linking all wrong.
That’s the bad news.
The good news?
The Search Console has an awesome feature designed to help you overcome this problem.
This report shows you the EXACT pages that need some internal link love.
To access this report, hit “Links” in the GSC sidebar.
And you’ll get a report that shows you the number of internal links pointing to every page on your site.
This report is already a goldmine.
But it gets better…
You can find the EXACT pages that internally link to a specific page. Just click on one of the URLs under the “Internal Links” section:
And you’ll get a list of all the internal links pointing to that page:
In this case, we only have 6 internal links pointing to our Local SEO Guide. That’s not good.
So:
Once you find a page that doesn’t have enough internal link juice, add some internal links that point to that page.
Time spent: under a minute.
Assessment: Win!
Pro Tip: Supercharge Key Posts With Internal Links From Powerhouse Pages
What’s a Powerhouse Page?
It’s a page on your site with lots of quality backlinks.
More backlinks = more link juice to pass on through internal links.
You can easily find Powerhouse Pages in the Google Search Console.
Just hit the “Links” button again. And you’ll see a section titled “Top linked pages”.
Click “More” for a full list.
By default, the report is ordered by the total number of backlinks. But I prefer to sort by the number of linking sites:
These are your Powerhouse Pages.
And all you need to do is add some internal links FROM those pages TO the ones you want to boost.
Easy, right?
Chapter 5: Advanced Tips and Strategies
Now it’s time for some advanced tips and strategies.
In this chapter you’ll learn how to use Google Search Console to optimize crawl budget, fix issues with mobile usability, and improve your mobile CTR.
Mastering Crawl Stats
If you have a small site (under 1,000 pages), you probably don’t need to worry about crawl stats.
But if you have a huge site… that’s a different story.
In that case, it’s worth looking into your crawl budget.
What Is Crawl Budget?
Your Crawl Budget is the number of pages on your site that Google crawls every day.
You can still see this number in the old “Crawl Stats” report.
In this case, Google crawls an average of 22,257 pages per day. So that’s this site’s Crawl Budget.
Why Is Crawl Budget Important For SEO?
Say you have:
200,000 pages on your website
and
A crawl budget of 2,000 pages per day
It could take Google 100 days to crawl your site.
So if you change something on one of your pages, it might take MONTHS before Google processes the change.
Or, if you add a new page to your site, Google’s going to take forever to index it.
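The math behind that estimate is simple division:

```python
# Rough estimate: days for a full recrawl = total pages / daily crawl budget
total_pages = 200_000
crawl_budget_per_day = 2_000

days_for_full_crawl = total_pages / crawl_budget_per_day
print(days_for_full_crawl)  # 100.0
```

In practice Google recrawls important pages more often than obscure ones, so treat this as a worst case for your least-crawled pages.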
So what can you do to get the most out of your Crawl Budget?
Three things…
1. First, stop wasting Crawl Budget on unnecessary pages
This is a biggie for ecommerce sites.
Most ecommerce sites let their users filter through products… and search for things.
This is great for sales.
But if you’re not careful, you can find yourself with THOUSANDS of extra pages that look like this:
Unless you take action, Google will happily waste your crawl budget on these junk pages.
What’s the solution?
URL Parameters.
To set these up, click the “URL Parameters” link in the old GSC. Then hit “Add Parameter”.
Let’s say that you let users filter products by color. And each color has its own URL.
For example, the color URLs look like this:
yourstore.com/product-category/?color=red
You can easily tell Google not to crawl any URLs with that color parameter:
Repeat this for ALL parameters you don’t want Google to crawl.
And if you’re somewhat new to SEO, check in with an SEO specialist to make sure this is implemented correctly. When it comes to parameters, it’s easy to do more harm than good!
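An alternative (or complement) to the URL Parameters setting is blocking parameterized URLs in robots.txt. For the color example above, that might look like this — and the same caution applies, since an overly broad pattern can block real pages:

```text
User-agent: *
Disallow: /*?color=
```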
2. See how long it takes Google to download your page
The crawl report in Search Console shows you the average time it takes Google to download your pages:
See that spike? It means that it suddenly took Google A LOT longer to download everything.
And this can KILL your Crawl Budget.
In fact, we have this quote straight from the horse’s mouth…
“Making a site faster improves the users’ experience while also increasing crawl rate. For Googlebot a speedy site is a sign of healthy servers, so it can get more content over the same number of connections. On the flip side, a significant number of 5xx errors or connection timeouts signal the opposite, and crawling slows down.”
Bottom line? Make sure your site loads SUPER fast. You already know that this can help your rankings.
As it turns out, a fast-loading site squeezes more out of your crawl budget too.
3. Get more backlinks to your site
As if backlinks couldn’t be any more awesome, it turns out that they also help with your crawl budget.
“The best way to think about it is that the number of pages that we crawl is roughly proportional to your PageRank. So if you have a lot of incoming links on your root page, we’ll definitely crawl that. Then your root page may link to other pages, and those will get PageRank and we’ll crawl those as well. As you get deeper and deeper in your site, however, PageRank tends to decline.”
The takeaway:
More backlinks = bigger crawl budget.
Get The Most Out of “URL Inspection”
I already covered the URL Inspection tool in Chapter 2.
But that was one part of a big process. So let’s take a look at URL Inspection as a standalone tool.
Specifically, I’m going to show you 3 cool things you can do with the URL Inspection tool.
1. Get new content indexed (in minutes)
URL Inspection is the FASTEST way to get new pages indexed.
Just published a new page?
Just pop the URL into the box and press Enter.
Then hit “Request Indexing”…
…and Google will normally index your page within a few minutes.
2. Use “URL Inspection” to reindex updated content
If you’re a regular Backlinko reader, you know that I LOVE updating old content.
I do it to keep my content fresh. But I also do it because it increases organic traffic (FAST).
For example, in this case study, I reveal how relaunching an old post got me 260.7% more organic traffic in just 14 days.
And you better believe I always use the “URL Inspection” tool to get my new content indexed ASAP.
Otherwise, I have to wait around for Google to recrawl the page on its own.
As Sweet Brown famously said: “Ain’t nobody got time for that!”.
3. Identify Problems With Rendering
So what else can the “URL Inspection” tool do?
“Test Live URL” shows you how Google and users see your page.
You just need to hit the “View Tested Page” button.
Then hit “Screenshot”. And you’ll see exactly how Google sees your page.
Make Sure Your Site Is Optimized For Mobile (Unless You Like Losing Traffic)
As you might have heard, more people now search on mobile devices than on desktops.
Bottom line? Your site’s content and UX have to be 100% optimized for mobile.
But how do you know if Google considers your site mobile optimized?
Well, the Google Search Console has an excellent report called “Mobile Usability”. This report tells you if mobile users have trouble using your site.
Here’s an example:
As you can see, the report is telling us about two mobile usability issues: “Text too small to read” and “Clickable elements too close together”.
All you need to do is click on one of the issues. And the GSC will show you:
1. Pages with this issue
2. How to fix the problem
Then, it’s just a matter of taking care of that issue.
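As one example, the “Text too small to read” issue is often caused by a missing viewport meta tag, which tells mobile browsers to scale the page to the device’s screen. The standard fix goes in the page’s `<head>`:

```html
<!-- Scale the page to the device width on mobile -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```

After fixing the underlying issue, use the “Validate Fix” flow in the report so Google rechecks the affected pages.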