Have you checked your Web analytics only to find that your site has lost a large share of its organic search traffic in the last few days? Don’t panic! Here’s a checklist that will help you identify the cause:
A Web analytics tracking problem?
Verify whether the loss is generalized across all of your traffic channels and site areas. If it is a generalized drop, first check that your site is (and has been) online and working correctly. If so, validate your Google Analytics tracking setup, as the drop could be due to an error there.
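As a quick first sanity check before digging into the Google Analytics admin, you can verify that the tracking snippet is still present in your pages’ HTML. A minimal sketch in Python -the marker strings below cover common classic Google Analytics setups (ga.js / analytics.js) and are assumptions; adapt them to the snippet your site actually uses:

```python
# Check whether an HTML page still contains a Google Analytics tracking snippet.
# Markers are illustrative; adjust them to match your own tracking code.
GA_MARKERS = (
    "google-analytics.com/analytics.js",
    "google-analytics.com/ga.js",
    "_gaq.push",
    "ga('create'",
)

def has_ga_snippet(html: str) -> bool:
    """Return True if any known tracking marker appears in the page source."""
    return any(marker in html for marker in GA_MARKERS)

# Hypothetical page source for illustration
page = """<html><head>
<script>ga('create', 'UA-12345-1', 'auto');</script>
</head><body>...</body></html>"""

print(has_ga_snippet(page))  # True: the tracking call is present
```

Run this against the source of a few key pages (fetched however you prefer); a False on pages that previously tracked fine points to a tracking problem rather than a real traffic loss.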
A seasonality or search trend change?
Check whether the loss is due to seasonality or a change in search trends by:
- Verifying that the organic search traffic drop doesn’t coincide with a loss in your organic search rankings, but rather with a lower volume of relevant searches. If you don’t directly track your Google rankings, use the Google Webmaster Tools “Search Queries” section, SEMrush or the Sistrix Visibility Index.
- Comparing your current traffic trend with those of previous years in your Web analytics, to see if they coincide. If you don’t have historical data, use the Google Keyword Planner “Search volume trends” feature or Google Trends.
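If you can export weekly session counts from your analytics, a rough year-over-year comparison helps separate seasonality from a genuine loss: a seasonal dip should show a similar change in the same weeks of the previous year. A sketch with invented figures:

```python
# Compare this year's weekly organic sessions against the same weeks last year.
# The session counts below are made up for illustration.
def yoy_change(current, previous):
    """Percentage change per week vs. the same week of the previous year."""
    return [round((c - p) / p * 100, 1) for c, p in zip(current, previous)]

this_year = [900, 880, 600, 590]    # hypothetical weekly sessions
last_year = [1000, 980, 950, 940]

print(yoy_change(this_year, last_year))  # [-10.0, -10.2, -36.8, -37.2]
```

A mild, uniform negative change suggests seasonality; a sharp break starting in a specific week (as in weeks 3–4 above) points to an event worth investigating with the rest of this checklist.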
A content accessibility & crawlability issue?
Verify that your site’s pages & relevant content are accessible to both users and Google’s search bots -and if the traffic & rankings loss is specific to certain areas of your site, prioritize them for this validation- by:
- Checking whether your robots.txt is blocking Google’s search bots from your site (or from the areas that have lost traffic & rankings). You can use the Google Webmaster Tools robots.txt Testing Tool.
- Verifying in the Google Webmaster Tools “Crawl Errors” area whether Google’s search bots have recently found a high number of site errors (server, DNS) or URL errors.
- Browsing directly to your site’s pages to verify that they’re accessible, are not redirected to other addresses, and return the correct 200 OK HTTP status code. You can use the “Redirect Path” Chrome extension for this.
- Simulating Google’s search bots (desktop & smartphone) crawling & rendering your most important pages -and those that have specifically lost traffic & rankings- with the Google Webmaster Tools “Fetch as Google” feature, or with crawlers such as Screaming Frog or DeepCrawl.
- Analyzing your server logs to detect potential crawling issues in the areas suffering the traffic & rankings drop. You can use Kibana, Splunk or the Botify Log Analyzer.
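Two of the checks above can be scripted with the Python standard library: testing robots.txt rules offline against the URLs that lost traffic, and tallying Googlebot hits per path from an access log in combined log format. The robots.txt rules and log lines below are invented samples; substitute your own:

```python
from collections import Counter
from urllib.robotparser import RobotFileParser

# 1) Test robots.txt rules offline: paste your robots.txt content and check
#    whether Googlebot may fetch the URLs that lost traffic. Sample rules below.
robots_txt = """\
User-agent: Googlebot
Disallow: /private/
"""
parser = RobotFileParser()
parser.parse(robots_txt.splitlines())
print(parser.can_fetch("Googlebot", "https://example.com/private/page"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/blog/post"))     # True

# 2) Tally Googlebot hits per path from access-log lines, to spot sections the
#    bot has stopped crawling. Note: matching on the user-agent string alone
#    can be spoofed; verify suspicious IPs separately.
def googlebot_hits(log_lines):
    hits = Counter()
    for line in log_lines:
        if "Googlebot" in line:
            # the request target sits inside the quoted "GET /path HTTP/1.1"
            path = line.split('"')[1].split()[1]
            hits[path] += 1
    return hits

log = [
    '66.249.66.1 - - [01/Jan/2015:00:00:01 +0000] "GET /blog/post HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '10.0.0.5 - - [01/Jan/2015:00:00:02 +0000] "GET /blog/post HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]
print(googlebot_hits(log))  # Counter({'/blog/post': 1})
```

Comparing per-path Googlebot hit counts between the period before and after the drop will quickly show whether the bot has stopped visiting the affected areas.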
A content indexation issue?
Check whether Google is correctly indexing your pages: search Google with the “site:” operator to see if your site -and specifically the pages or areas with traffic & rankings loss- is still indexed. You can also check whether your site’s number of indexed URLs has decreased with the Google Webmaster Tools “Index Status” report.
- If your pages are not indexed:
- Verify whether you have accidentally no-indexed your site’s pages by including a noindex value in the robots meta tag or in an X-Robots-Tag HTTP header. You can check this with crawlers such as Screaming Frog and DeepCrawl.
- Check whether your pages have been wrongly canonicalized, with the rel="canonical" link element (in your HTML <head> section, XML sitemaps or HTTP headers) pointing to URLs other than the originals.
- Make sure you haven’t requested the removal of your pages with Google Webmaster Tools “Remove URLs” functionality.
- If your pages are still indexed:
- Validate their cached versions (also checking the “text-only” version) to verify that Google is correctly identifying your content.
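Both indexability signals -the robots meta tag and the rel="canonical" target- can be extracted from a page’s HTML with the standard-library parser. A minimal sketch (the sample page below is invented, showing the worst case: an accidental noindex plus a canonical pointing at a different URL):

```python
from html.parser import HTMLParser

# Pull the robots meta directives and the rel="canonical" target out of a
# page's <head>, to spot accidental noindex tags or wrong canonicalization.
class HeadAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.robots = None     # content of <meta name="robots">
        self.canonical = None  # href of <link rel="canonical">

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.robots = a.get("content", "")
        if tag == "link" and a.get("rel", "").lower() == "canonical":
            self.canonical = a.get("href")

# Hypothetical page source illustrating both problems at once
page = """<html><head>
<meta name="robots" content="noindex, nofollow">
<link rel="canonical" href="https://example.com/other-page">
</head><body></body></html>"""

audit = HeadAudit()
audit.feed(page)
print(audit.robots)     # noindex, nofollow -> the page asks to be de-indexed
print(audit.canonical)  # https://example.com/other-page
```

Run this over the pages that lost rankings: a non-None robots value containing “noindex”, or a canonical URL that isn’t the page’s own address, explains a de-indexation directly. (This checks the HTML only; remember the same directives can also arrive via HTTP headers.)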
A Google Webmaster Tools misconfiguration?
Check that you haven’t changed your Google Webmaster Tools settings, as some features, if misconfigured, can negatively affect your existing organic search visibility, such as:
- Setting an incorrect preference for URL parameters.
- Disavowing a high volume of popular backlinks.
- Setting an incorrect International Country Targeting.
A Web security problem?
Verify whether Google has detected any security problems (due to malware or spam) by checking the Google Webmaster Tools “Security Issues” area. If so, you should follow the process specified here to fix the issues and request a review.
A manual penalization?
If the traffic loss coincides with a rankings drop, check in the “Manual Actions” section of Google Webmaster Tools whether your site has been manually penalized for not following Google’s Webmaster Guidelines -the reasons can range from unnatural incoming links, keyword stuffing and thin content to cloaking.
If so, the type of action, the reason and the affected areas of the site will be shown there, and you’ll need to take the appropriate “clean-up” or optimization steps before submitting a reconsideration request.
A Google Search Algorithm Update?
Check whether Google’s algorithm updates correlate with your organic search traffic & rankings loss. You can use the Moz Google Algorithm Change History, AWR Google Algorithm Changes or the Panguin Tool. If your loss coincides with:
- A Panda update -focused on content quality & relevance- follow the process described in these analysis and audit guides, along with this case study. You can also use tools like DeepCrawl & Siteliner to facilitate the assessment of your content.
- A Penguin update -focused on incoming links’ quality & relevance- follow the process described in these analysis and audit guides, along with this case study. You can also use tools like LinkRisk to facilitate the assessment of your links.
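The correlation check above can be scripted once you have daily session counts and a list of update dates (taken, for example, from the change-history sources mentioned): compare the average traffic in the week before each update with the week after. The update name, date and traffic figures below are invented for illustration:

```python
from datetime import date, timedelta

# Flag algorithm updates after which traffic dropped sharply: compare the
# 7 days before each update date with the 7 days after. All data is invented.
def updates_matching_drop(daily_traffic, updates, threshold=-0.2):
    """daily_traffic: {date: sessions}; updates: {name: date}.
    Returns updates whose following week lost more than `threshold` (-0.2 = -20%)."""
    flagged = {}
    for name, day in updates.items():
        before = sum(daily_traffic[day - timedelta(d)] for d in range(1, 8))
        after = sum(daily_traffic[day + timedelta(d)] for d in range(1, 8))
        change = (after - before) / before
        if change <= threshold:
            flagged[name] = round(change, 2)
    return flagged

# 30 days of hypothetical traffic: 1000 sessions/day, falling to 600 mid-month
traffic = {date(2015, 3, 1) + timedelta(d): (1000 if d < 15 else 600)
           for d in range(30)}
updates = {"Update X (hypothetical)": date(2015, 3, 16)}

print(updates_matching_drop(traffic, updates))  # {'Update X (hypothetical)': -0.4}
```

A flagged update is only a correlation, not proof -but it tells you whether to start with a content (Panda) or a link (Penguin) audit.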
Have you followed the checklist?
By now you should have found the cause of your organic search traffic loss. Congrats!
Be prepared for next time by bookmarking this checklist & downloading it as a PDF.