The 2017 Edition
Have you checked your Web analytics, only to find that your site has lost a large share of its organic search traffic in the last few days? Don’t panic! Here’s a checklist that will help you identify the cause:
A tracking problem?
Verify whether your site is suffering a real, general traffic loss across all of your channels and pages, or one specific to an area or device caused by configuration issues, by checking that:
- Your site is (and has been) online and working correctly, and hasn’t been down due to technical issues. You can be alerted when this happens by using an “uptime service” such as Pingdom Uptime Monitoring, UptimeRobot, or Little Warden.
- Your Google Analytics Web tracking setup is correct, since the drop could also be caused by a configuration error, or by launching new areas, pages, or functionalities -such as AMP or an app- without tracking them correctly.
Make sure that your site is running smoothly and that all your Web (and app) properties, including new ones, are correctly tracked before moving on.
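If you don’t use an uptime service, a minimal availability check can also be scripted. This is just a sketch -the URLs are hypothetical placeholders for your own critical pages- not a replacement for a proper monitoring service:

```python
from urllib.request import Request, urlopen
from urllib.error import URLError


def is_healthy(status):
    """Treat 2xx and 3xx responses as 'up'; 4xx/5xx as problems."""
    return status is not None and 200 <= status < 400


def check_url(url, timeout=10):
    """Return the HTTP status code for a URL, or None if unreachable."""
    try:
        req = Request(url, headers={"User-Agent": "uptime-check/0.1"})
        with urlopen(req, timeout=timeout) as resp:
            return resp.getcode()
    except URLError:
        return None


if __name__ == "__main__":
    # Hypothetical pages to monitor -- replace with your own critical URLs.
    for url in ["https://www.example.com/", "https://www.example.com/blog/"]:
        status = check_url(url)
        if not is_healthy(status):
            print(f"ALERT: {url} returned {status}")
```

Run it on a schedule (e.g. cron) and you have a crude alert when a key page stops responding.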
A search behavior change?
Verify whether there has been a loss or a change in user search behavior due to seasonality, usage trends, or changes in search results features. Check it by:
- Comparing your current traffic trend with those of previous years using your Web analytics, to see if they coincide. If you don’t have historical data, use the Google Keyword Planner “Search volume trends” feature or the trend shown in most keyword tools, such as KWFinder.
- Seeing if your site pages’ CTR in search results has changed because of a loss of SERP features visibility -although not necessarily of rankings- by taking a look at your Google Search Console “Search Analytics” CTR trend over time, as well as the rich results metrics under the “Search Appearance” option. Compare yours with your competitors’ by using SEMrush SERP features.
Before panicking over a potential rankings loss, verify whether the negative traffic trend is due to fewer user clicks in search results caused by a change in search behavior.
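To separate seasonality from a genuine loss, compare the same periods year over year. A minimal sketch, assuming you’ve exported weekly organic sessions for this year and last year (the numbers below are hypothetical; your Web analytics export would supply the real series):

```python
def yoy_change(current, previous):
    """Percent change per period between two equal-length series of
    organic sessions (e.g. weekly totals, aligned by week number)."""
    if len(current) != len(previous):
        raise ValueError("series must cover the same periods")
    return [
        round((cur - prev) / prev * 100, 1) if prev else None
        for cur, prev in zip(current, previous)
    ]


# Hypothetical weekly sessions: this year vs. the same weeks last year.
this_year = [1200, 1150, 600, 580]
last_year = [1000, 1050, 1100, 1080]
print(yoy_change(this_year, last_year))
# [20.0, 9.5, -45.5, -46.3] -- the drop is concentrated in weeks 3-4
```

If the same dip shows up at the same point last year, you’re likely looking at seasonality rather than a rankings problem.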
A real drop in rankings?
If it hasn’t been a tracking issue or a change in search behavior, it’s time to check whether the traffic drop is due to a real negative change in rankings and, if so, where and how it has affected your site, by:
- Verifying your rankings per query, page, and device type with the Google Search Console “Search Analytics” section, SEMrush, Searchmetrics, or Sistrix Visibility, in case you don’t specifically monitor your rankings with tools like SEOmonitor (which is also recommended).
- Comparing your rankings with those of your competitors to identify whether they have lost rankings too, and to see which of them have gained what you lost.
If this analysis confirms a real drop in rankings, you can move on to analyze and fix its cause.
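A quick way to see where the rankings damage is concentrated is to diff two ranking exports. The keyword→position dictionaries below are hypothetical; in practice your rank-tracking tool’s CSV export would supply them:

```python
def ranking_drops(before, after, min_drop=3):
    """Return keywords whose position worsened by at least `min_drop`
    places (a higher position number means a worse ranking).
    A keyword missing from `after` fell out of the tracked results."""
    drops = {}
    for kw, old_pos in before.items():
        new_pos = after.get(kw)
        if new_pos is None or new_pos - old_pos >= min_drop:
            drops[kw] = (old_pos, new_pos)
    return drops


# Hypothetical positions before and after the traffic drop.
before = {"red shoes": 3, "blue shoes": 5, "buy sneakers": 2}
after = {"red shoes": 12, "blue shoes": 6, "buy sneakers": 2}
print(ranking_drops(before, after))
# {'red shoes': (3, 12)}
```

Grouping the dropped keywords by the page or site section they rank for then tells you which areas to audit first.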
A content accessibility & crawlability issue?
Have you updated your site’s Web structure, or migrated it to HTTPS or another domain, without updating your site configuration accordingly (here’s a checklist to help you with HTTPS migrations and steps to follow if a Web migration hasn’t gone well), so that Google can no longer find your pages?
Verify whether the loss is due to accessibility issues affecting users and/or Google’s bots, and whether it is specific to certain areas of your site or devices or general to all of it. Check your organic search traffic activity per Website area (categories, sub-categories, etc.) and device (mobile, tablet, desktop) with Google Analytics reports and segments, to see if the loss is specific to any of them or equal across all.
Then check the following accordingly:
- That your critical site pages are accessible to your desktop and mobile users, checking whether they return errors or are redirected to other pages. You can use Chrome’s DevTools to simulate different devices and check each page’s HTTP status, or the Link Redirect Trace extension.
- If you’re blocking your site to searchbots:
- Verifying your robots.txt configuration, for which you can use the Google Search Console Robots Testing Tool.
- Checking whether there’s a higher number of crawl errors in Google Search Console’s “Crawl Errors” reports, for desktop as well as smartphones.
- Checking for issues fetching and rendering your pages’ content by validating your top pages with Google Search Console’s “Fetch as Google”, selecting the “Fetch and Render” functionality with both the desktop and smartphone bots.
- Simulating Google’s search bots (desktop & smartphone) crawling your site with tools like Screaming Frog, DeepCrawl, Botify, Ryte, or OnCrawl.
- Checking your Web server logs to look for issues and to identify correlations between the drop and specific times, areas, and devices -and/or gaps compared with your own crawling and traffic data- with tools like Screaming Frog Log File Analyser, Botify Log Analyzer, OnCrawl, or Loggly.
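Besides the testing tools above, Python’s standard library can verify robots.txt rules directly. A sketch with a hypothetical robots.txt (in practice you would fetch your live file, e.g. with `rp.set_url("https://www.example.com/robots.txt"); rp.read()`):

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt with a stray Googlebot-specific block.
SAMPLE_ROBOTS = """\
User-agent: *
Disallow: /private/

User-agent: Googlebot
Disallow: /category/
"""

rp = RobotFileParser()
rp.parse(SAMPLE_ROBOTS.splitlines())

# Check whether the pages that lost traffic are blocked for Googlebot.
# Note: when a Googlebot-specific group exists, Googlebot obeys ONLY
# that group, not the "*" rules -- a common source of surprises.
for path in ["/category/shoes", "/blog/post"]:
    verdict = "allowed" if rp.can_fetch("Googlebot", path) else "BLOCKED"
    print(path, "->", verdict)
# /category/shoes -> BLOCKED
# /blog/post -> allowed
```

Running each dropped URL through a check like this quickly confirms or rules out a robots.txt misconfiguration.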
A content indexation issue?
- If your pages are not indexed:
- Verify whether your site pages have been erroneously noindexed -via a noindex in the robots meta tag or in an x-robots-tag HTTP header- or erroneously canonicalized to other pages by not including the original URLs in the rel=”canonical” link element in your HTML <head> section or in the HTTP headers. You can check this by crawling your site with SEO crawlers such as Screaming Frog, DeepCrawl, Botify, Ryte, or OnCrawl.
- Make sure you haven’t requested the removal of your pages with the Google Search Console “Remove URLs” functionality.
- If your pages are still indexed:
- Check their content by verifying their cached versions (including the “text-only” version) and see if there’s a gap between the content shown there and the rendered content, using tools like Fetch & Render as any bot and Google Search Console’s Fetch and Render, as well as by inspecting the DOM with Chrome DevTools.
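Detecting an accidental noindex can also be scripted with the standard library. This sketch scans a page’s HTML source for robots meta directives (the sample page below is hypothetical; in practice you would fetch each URL’s HTML, and remember that the X-Robots-Tag HTTP header can carry a noindex too):

```python
from html.parser import HTMLParser


class RobotsMetaScanner(HTMLParser):
    """Collect the directives from any <meta name="robots"> tags."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            content = attrs.get("content") or ""
            self.directives += [d.strip().lower() for d in content.split(",")]


def has_noindex(html):
    scanner = RobotsMetaScanner()
    scanner.feed(html)
    return "noindex" in scanner.directives


# Hypothetical page source with an accidental noindex.
page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(has_noindex(page))  # True
```

Looping this over your top pages flags any that are quietly telling Google not to index them.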
A Google Search Console misconfiguration?
Check that you haven’t changed your Google Search Console settings, as some features, if misconfigured, could negatively affect your existing organic search visibility, such as:
- Setting an incorrect preference for URL parameters.
- Disavowing a high volume of popular backlinks.
- Setting an incorrect country in International Targeting.
A Web security problem?
Verify whether Google has detected any security problem (due to malware or spam) by checking the Google Search Console “Security Issues” area. If so, you should follow the process specified there to fix the issues and request a review.
A manual penalization?
If the traffic loss coincides with a rankings drop, check whether your site has been manually penalized for not following Google’s Webmaster Guidelines -the reasons range from unnatural incoming links and keyword stuffing to cloaking- by revising the “Manual Actions” section of Google Search Console.
If so, the type of action, the reason, and the areas of the site affected will be listed there, and you’ll need to take the appropriate clean-up or optimization steps before submitting a reconsideration request. If it’s a link-related penalization, you can use tools like Kerboo, Link Research Tools, and CognitiveSEO to identify and clean up the offending links.
A Google Search Algorithm Update?
Again, if the traffic loss coincides with a rankings drop, check whether it correlates with a Google algorithm update by comparing the drop date against a public timeline of confirmed Google updates.
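The correlation check itself is simple once you have the dates. A sketch -the update dates below are placeholders only; substitute real, documented update dates from a maintained public timeline:

```python
from datetime import date, timedelta


def updates_near(drop_date, update_dates, window_days=7):
    """Return the known update dates within `window_days` of the drop."""
    window = timedelta(days=window_days)
    return [d for d in update_dates if abs(d - drop_date) <= window]


# Placeholder update timeline -- replace with real, documented dates.
known_updates = [date(2017, 2, 1), date(2017, 3, 8), date(2017, 5, 17)]
print(updates_near(date(2017, 3, 10), known_updates))
# [datetime.date(2017, 3, 8)]
```

A match within the window doesn’t prove causation, but it tells you which update’s documented focus (links, content quality, mobile, etc.) to investigate first.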
Have you followed the checklist?
If you have, you should have found the cause of your organic search traffic loss. Congrats!
Be prepared next time by bookmarking this page.