Possible Reasons Why Your Website Pages Are Not Being Correctly Indexed by Google or Other Search Engines


Having your website as high up in the Google rankings as possible is important, but pages that Google has not indexed cannot rank at all. If you find yourself asking, “why are my pages not indexed by Google?”, the first step is to understand the basics of how Google indexes pages.

 Crawling and Indexing

First of all, Google uses ‘Googlebot’ to ‘crawl’ billions of pages on the Internet and then indexes them. Search engine indexing is the process of collecting data and organising it so that only relevant information is displayed when a user searches for a phrase or term. This is what makes Google so powerful. For you, it is vital that Google can index your pages so that when users search for terms related to your business, your pages are displayed.

 Now that you know how indexing works and why it matters, it is time to look at what could prevent your pages from being indexed correctly by Google.

Your Website

 Site Content

Content is king, as it is fundamental to the user having a positive experience. Google rewards fresh, original content and punishes duplicate, old or poor-quality content, so if some of your pages are not being indexed, this is a possible reason. This applies not just to visible content but also to other important SEO factors such as title and description meta tags and brand and keyword URLs. These also attract search engine bots to your web pages and encourage them to be indexed more frequently, so it is important to get them right.

 Links

Having links on your page that redirect incorrectly will cause indexing errors, so it is important to check that the links on your page work correctly. Another tip is to have popular sites linking back to you. Google frequently crawls and indexes these sites, and by having your links on them, Google will be able to find, crawl and index your site more quickly. It also adds further weight to your page ranking.

 Load Time

Having a slow website will affect the speed at which Google indexes your pages. Check your load times with a free online tool like http://tools.pingdom.com/.
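
If you prefer the command line, curl can also give a rough timing breakdown (example.com stands in for your own domain):

 # Prints DNS lookup, connection and total load time for one request
 curl -s -o /dev/null -w "DNS: %{time_namelookup}s  Connect: %{time_connect}s  Total: %{time_total}s\n" https://www.example.com/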

 Password Protection

If any of your pages are password protected, search engine bots cannot gain access. This means the pages can’t be crawled and ultimately won’t be indexed. So if a lot of your site is a paid members’ area, consider bulking out your site with pages NOT exclusive to paid members, to increase the number of pages that can be indexed.

Coding Issues

 One of the most common causes of pages not being indexed is coding issues within the page. Below are some of the most common.

 Robots.txt

The robots.txt file, located in the root of your website (the www or public_html folder), is a common source of indexing errors. In it you can add guidelines for bots, such as any files or directories that you want to exclude from crawling. It is, therefore, essential to double-check that no pages you want indexed are disallowed to Google’s bots.
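
As an illustration, a robots.txt along these lines (the /admin/ path is a hypothetical example) excludes one directory while leaving the rest of the site crawlable:

 # Applies to all crawlers, including Googlebot
 User-agent: *
 # Keep bots out of the admin area only
 Disallow: /admin/
 # A stray "Disallow: /" here would block the entire site from being crawled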

 .htaccess File

This file needs to be configured correctly: a misconfigured rewrite or redirect rule can create an infinite loop, resulting in loading errors and the page not being indexed.
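
A classic example is forcing HTTPS without a condition, which can redirect a page to itself forever. A sketch, assuming Apache with mod_rewrite and example.com as a placeholder:

 RewriteEngine On
 # Loop-prone: fires on every request, even ones already on HTTPS
 # RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]

 # Safer: only redirect requests that are not already on HTTPS
 RewriteCond %{HTTPS} off
 RewriteRule ^(.*)$ https://www.example.com%{REQUEST_URI} [R=301,L]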

 Cookies or Javascript

Cookies that need accepting and complex JavaScript are difficult for bots to read. If a bot can’t read the page and the links on it, then it won’t be able to index the page, or the pages it links to, either.
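
As a simple illustration (the /products/ URL is hypothetical), a plain HTML anchor is easy for a bot to follow, while a link that only works once JavaScript runs may never be discovered:

 <!-- Crawlable: a standard HTML link -->
 <a href="/products/">Products</a>

 <!-- Risky: this "link" only exists if the bot executes the JavaScript -->
 <span onclick="window.location='/products/'">Products</span>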

 Used Domain

If you have bought a domain that has previously been used, your web pages may have been de-indexed, for example because of a history of spam. If this is the case, you will need to send a reconsideration request to Google.

 Server

If no new pages of your website have been indexed, then the time has come to check your server. This is because there is a chance the server may be blocking the search engine bots from accessing your website content. Things to consider here are whether your server is currently under maintenance or if the DNS delegations are affecting availability.
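
Two quick command-line checks, with example.com standing in for your own domain:

 # Does the domain resolve to an IP address?
 dig +short www.example.com
 # Does the server respond, and with what status code?
 curl -I https://www.example.com/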

 Meta Tags

While meta tags can add quality SEO information to your website, make sure none of them are blocking search engine bots and thereby causing indexing errors. For example, a page using the meta robots noindex tag will not be indexed:

 <meta name="robots" content="noindex, nofollow">
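
If you find this tag on a page you want indexed, simply remove it, or state the default explicitly:

 <meta name="robots" content="index, follow">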

 Google Search Console: Configuration

Google Search Console lets you restrict how certain URL parameters are crawled, for example to avoid duplicate content. Configured incorrectly, though, these settings can lead to crawling and indexing errors, including pages being de-indexed, so check your parameter settings in Google Search Console.
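
If duplicate content is the underlying issue, one common fix (sketched here with a placeholder URL) is a rel=canonical tag on the duplicate pages pointing at the preferred version:

 <link rel="canonical" href="https://www.example.com/preferred-page/">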

HTTP Status Code

 When Google crawls a page, the server responds with an HTTP status code. A quick and dirty rule of thumb is that most pages should return a status code of 200. If this is the case, Google can proceed and index the page accordingly. On some occasions, however, a page will return another code, such as 404, which means the link leads to a page that cannot be found. If it can't be found, it won't be indexed.
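
If you want to check a page yourself, curl can print the status code it returns (the URL here is a placeholder):

 # Prints just the HTTP status code: 200 is good, 404 means not found
 curl -s -o /dev/null -w "%{http_code}\n" https://www.example.com/some-page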

Not Using Web Tools to Help Identify Indexing Issues

 It is all very well knowing how to ensure pages are indexed by Google, but if you aren't aware that they are currently not being indexed, then it's all a bit pointless. Here are a couple of tips to help you.

 A Quick Check

A quick way to see how many pages Google has indexed is to type site:yoursite.com into Google. You can then compare this figure to the actual number of pages on your site. If there is a difference, it is worth investigating, as there is clearly an issue preventing some pages from being indexed (probably one of those above!).

 Get a Google Search Console (Formerly Webmaster Tools) Account

Submit your site and you will be able to look at the crawl stats and the indexing status. You can also submit your XML sitemap, and it will inform you of any errors. You also have the option to ‘Fetch as Google’, which asks Google to crawl pages with new or updated content. Read more about why sitemaps are important for SEO.
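
For reference, a minimal XML sitemap looks something like this (the URL and date are placeholders):

 <?xml version="1.0" encoding="UTF-8"?>
 <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
   <url>
     <loc>https://www.example.com/</loc>
     <lastmod>2016-06-01</lastmod>
   </url>
 </urlset>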

 More SEO Tips...

The list above covers the vast majority of indexing errors you are likely to encounter. Now go ahead and check all of these to ensure that crawling and indexing errors are kept to a minimum, and your pages get the search engine recognition they deserve.

Alternatively, get in touch with Kent SEO agency - Whitefish Marketing - and let us handle all your SEO requirements professionally. Call us today on 01303 720 288 and let us show you how we can help improve your website's SEO.


About Chris Surridge

Chris Surridge is an experienced Digital Marketing Director with a wealth of knowledge on Search Marketing Strategies and Conversion Analysis. His value is in strategic planning for client accounts, and his consultative services.