Why Are My Site Indexed Pages Being Dropped from Search?

If your pages are not indexed, it could mean that Google doesn’t “like” them or is unable to crawl them easily. Here are some possible reasons why your indexed page count has started to decrease:

  • Google doesn’t find your pages relevant.
  • Google has penalized you.
  • Google is unable to crawl your pages.

Here are some helpful tips on how to find and fix the problem of decreasing numbers of indexed pages.

  1. Are your pages loading properly?

Your pages need to return a proper 200 HTTP header status. Did the server experience frequent or prolonged downtime? Did the domain recently expire, and you renewed it late? Use the free HTTP Header Status tool to check any URL’s status. You can also use Screaming Frog, DeepCrawl, Xenu, or Botify for huge sites.
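If you’d rather script the check yourself, here is a minimal sketch using Python’s `requests` library; the URL list is a placeholder for your own pages.

```python
# Spot-check HTTP status codes for a handful of URLs.
# Assumes the third-party `requests` library; the URLs are placeholders.
import requests

urls = [
    "https://example.com/",
    "https://example.com/services/",
]

for url in urls:
    try:
        # A HEAD request reads the status line without downloading the body.
        response = requests.head(url, allow_redirects=False, timeout=10)
        print(f"{url} -> {response.status_code}")
    except requests.RequestException as exc:
        print(f"{url} -> request failed: {exc}")
```

Anything other than 200 (or an intentional 301) on a page you want indexed is worth investigating.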

  2. Did you recently change your URLs?

Sometimes a site’s URLs change because of a switch in CMS, backend programming, or server settings, resulting in a new domain, subdomain, or folder structure.

Google can remember the old URLs, but if they don’t redirect, many of your pages can get de-indexed. Hopefully, a copy of your old site still exists in some form with all the old URLs, so you can redirect each one to its corresponding new URL.
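If you still have the list of old URLs, a short script can confirm that each one 301-redirects to its new home. This is a minimal sketch assuming Python’s `requests` library; the mapping of old to new URLs is hypothetical.

```python
# Verify that each old URL returns a 301 pointing at the expected new URL.
# Assumes `requests`; the redirect map is a stand-in for your own.
import requests

redirect_map = {
    "https://example.com/old-page/": "https://example.com/new-page/",
    "https://example.com/old-blog/post/": "https://example.com/blog/post/",
}

for old_url, expected in redirect_map.items():
    # Don't follow the redirect; we want to inspect the first response itself.
    response = requests.get(old_url, allow_redirects=False, timeout=10)
    status = response.status_code
    location = response.headers.get("Location", "")
    if status == 301 and location == expected:
        print(f"OK   {old_url} -> {location}")
    else:
        print(f"FIX  {old_url} returned {status}, Location: {location!r}")
```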

  3. Did you try to fix duplicate content?

Fixing duplicate content often involves canonical tags, 301 redirects, noindex meta tags, or disallow rules in robots.txt. This is actually one case where a decrease in indexed URLs may be a good thing. Since this is good for your website, the only thing you need to do is double-check that this is in fact what happened.
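One way to double-check is to look at the canonical and robots directives a page actually serves. Here’s a minimal sketch assuming `requests` and a deliberately naive regex parse; the URL is hypothetical.

```python
# Report the canonical tag and robots meta directive a page serves.
# Assumes `requests`; the regexes are naive (they expect rel/name to come
# before href/content) and are fine for a quick spot check, not a crawler.
import re
import requests

url = "https://example.com/some-page/"
html = requests.get(url, timeout=10).text

canonical = re.search(
    r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)', html, re.I)
robots = re.search(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)', html, re.I)

print("canonical:", canonical.group(1) if canonical else "none found")
print("robots meta:", robots.group(1) if robots else "none found")
```

If a page you deduplicated now canonicalizes elsewhere or carries noindex, the drop in indexed URLs is expected behavior, not a problem.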

  4. Are the pages timing out?

Some servers have bandwidth restrictions because higher bandwidth comes at a cost, and the hosting plan may need to be upgraded. Sometimes the issue is hardware, and it can be resolved by upgrading your server’s processing power or memory. Some sites block IP addresses that request too many pages too quickly. This is a way to fend off DDoS attacks, but it can also hurt your site’s indexing.

If this is a server bandwidth issue, it might be time to upgrade your server services and packages. If it is a processing or memory problem, besides upgrading the hardware, check whether you have any server-side caching in place. If anti-DDoS software is the issue, you can loosen its settings or whitelist Googlebot, as shown below.
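Google documents a two-step way to verify that a visitor claiming to be Googlebot really is one before you whitelist it: reverse-DNS the IP, check that the hostname ends in googlebot.com or google.com, then forward-resolve that hostname and confirm it maps back to the same IP. A minimal sketch using only Python’s standard library (the sample IP is a placeholder; use one from your server logs):

```python
# Two-step Googlebot verification: reverse DNS, then forward-confirm.
# Uses only the standard library.
import socket

def is_real_googlebot(ip: str) -> bool:
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)  # reverse DNS lookup
        if not hostname.endswith((".googlebot.com", ".google.com")):
            return False
        # Forward-resolve the hostname and confirm the original IP is listed.
        return ip in socket.gethostbyname_ex(hostname)[2]
    except socket.error:
        return False

# Placeholder IP; substitute one pulled from your access logs.
print(is_real_googlebot("66.249.66.1"))
```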

  5. Does Googlebot see your site differently?

Sometimes developers build sites without an SEO strategy in mind. At times, a preferred CMS is used even though it is not search-engine friendly. Sometimes an SEO services agency has tried content cloaking to game the search engines. Other times, the website has been tampered with by hackers, who can serve a different page to Google to promote their own links.

The worst-case scenario is pages infected with some form of malware, which can cause Google to deindex them automatically. To see whether Googlebot is viewing the same page you are, use Google Search Console’s Fetch and Render feature (superseded by the URL Inspection tool in the current Search Console). You can also try translating the page in Google Translate or checking Google’s cached copy, but there are ways to cloak content behind these as well.
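One quick, though imperfect, check for user-agent cloaking is to fetch the page twice, once as a browser and once as Googlebot, and diff the responses. A minimal sketch assuming `requests` and a hypothetical URL; note that sites can also cloak by IP address, which this won’t catch, so Search Console remains the authoritative check.

```python
# Fetch a page as a regular browser and as Googlebot, then diff the HTML.
# Assumes `requests`; the URL is a placeholder for your own page.
import difflib
import requests

url = "https://example.com/"
browser_ua = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
              "AppleWebKit/537.36 (KHTML, like Gecko) "
              "Chrome/120.0 Safari/537.36")
googlebot_ua = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

as_browser = requests.get(url, headers={"User-Agent": browser_ua},
                          timeout=10).text
as_googlebot = requests.get(url, headers={"User-Agent": googlebot_ua},
                            timeout=10).text

diff = list(difflib.unified_diff(
    as_browser.splitlines(), as_googlebot.splitlines(), lineterm=""))
# Show the first 40 differing lines, or confirm the responses match.
print("\n".join(diff[:40]) or "Both user agents received identical HTML.")
```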

The point of looking at indexed pages is usually to confirm that Google can crawl and index your pages properly, not to serve as a key performance indicator (KPI) for measuring the success of your search engine optimization services agency.

Fixing duplicate or poor-quality content may also reduce the number of indexed pages, but that’s still a good thing.

If you want the best search engine optimization services, don’t hesitate to contact me. I’d love to discuss your marketing strategy and help in any way I can.

Rob Dunford is a Marketing Consultant in the Greater Toronto Area with over 25 years of experience in implementing marketing plans for small businesses.