How to Fix Indexing Issues On Your Company’s Website


The internet is a vast library of websites and associated webpages. Google is the number one search engine, with a 92.47 per cent share of search traffic as of June 2021. If your website isn’t being found by Google, chances are your business is missing out on visibility online.

Your website is perhaps your organisation’s most vital asset. However, what’s the point of investing in web design services and top-notch SEO-optimised content if your customers can’t find it? 

While other search engines are available, if your site doesn’t appear in Google, it’s fair to say that it may as well not exist. What that means for you is that you need to be contained within their “index.” In other words, their repository of all valid and qualifying web pages on the internet. 

The problem? Sometimes a simple mistake can leave your website outside of their index and metaphorically nowhere to be found. We recently encountered this issue for a client, who, unbeknownst to them, had made a fairly common error that prevented their company website from appearing in Google’s search engine results. 

So, with that in mind, let’s walk you through how to ensure that your website is indexed correctly and show you how to fix some of the most common indexing errors. 

How to Check if Your Site Is Indexed

The first step in resolving any issues is checking that your site is indexed. This process is straightforward: type “site:yourdomain.com” (replacing yourdomain.com with your own domain) into the Google search bar, and it will return all of the pages of your site currently listed in the index. If no results show up, your site isn’t indexed. 

You can also check your site’s indexation status with the Google Search Console by looking at the number of valid pages (with and without warnings). If the two numbers total anything but zero, you have some pages indexed. If it reads zero for both, then your entire site is unindexed. 

How to Index a Page or Piece of Content

If you carried out either of the above checks and found that pages, blog posts, or other vital pieces of site content are missing, you can use the URL inspection tool within Google Search Console to submit them for indexing.

Follow these steps to index something you think is missing from the index:

  • Go to Google Search Console
  • Navigate to the URL inspection tool
  • Paste the URL you’d like Google to index into the search bar
  • Wait for Google to check the URL
  • Click the “Request indexing” button (if not already indexed)
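For anyone who prefers to script this check, Google also offers a Search Console URL Inspection API that mirrors the manual steps above. The sketch below only builds the request payload; the endpoint and field names are assumptions based on Google’s public API documentation, and actually sending the request would also require OAuth credentials for a verified Search Console property, which are omitted here.

```python
import json

# Assumed endpoint of Google's URL Inspection API (per public docs);
# calling it requires OAuth credentials, which this sketch omits.
INSPECT_ENDPOINT = "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect"

def build_inspection_request(page_url, property_url):
    """Build the JSON payload naming the URL to inspect and the
    Search Console property it belongs to."""
    return json.dumps({
        "inspectionUrl": page_url,
        "siteUrl": property_url,
    })

payload = build_inspection_request(
    "https://example.com/blog/post", "https://example.com/")
print(payload)
```

For a one-off check on a handful of pages, the Search Console interface is quicker; the API route only pays off when you have many URLs to monitor.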

But what if your entire site is missing? As was the case with our client, it’s likely a result of a setting within your site relating to the robots.txt file or meta tags. So let’s quickly cover what you need to do if that’s the case. 

Entire Site Missing from Google Index: Robots.txt Fix 

If Google is not indexing your entire website, it could be due to a crawl block in something called a robots.txt file.

To check for this issue, go to yourdomain.com/robots.txt (the file always sits at the root of your domain).

Look for either of these two snippets of code:

User-agent: Googlebot

Disallow: /

User-agent: *

Disallow: /

Both of these snippets tell Google’s crawl bots that they’re not allowed to crawl any pages on your site. To fix the issue, simply remove them. A crawl block may also be why specific pages aren’t being crawled, so check via the URL inspection tool for the “Crawl allowed? No: blocked by robots.txt” error message.
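If you’d rather test a robots.txt file than eyeball it, Python’s standard library can parse one and report whether a given crawler is allowed to fetch a URL. This is a minimal sketch using a hypothetical robots.txt containing the blanket block described above, not any real site’s file.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content containing the blanket crawl block
# described above.
robots_txt = """\
User-agent: Googlebot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# can_fetch() reports whether the named crawler may request the URL.
blocked = not parser.can_fetch("Googlebot", "https://example.com/any-page")
print("Googlebot blocked?", blocked)  # → Googlebot blocked? True
```

Swapping the string for your live file (fetched from yourdomain.com/robots.txt) lets you confirm a fix before waiting for Google to re-crawl.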

Fixing Site Indexing Issues with Tags 

Another common issue is specific tags on pages that actively tell Google’s crawl bots not to bother with certain pages. While this is useful for pages you don’t want to appear in search, it can be catastrophic if accidentally applied to the entire site, as was the case with our client. 

Pages with either of these meta tags in their <head> section won’t be indexed by Google:

<meta name="robots" content="noindex">

<meta name="googlebot" content="noindex">

These are meta robots tags, and they tell search engines whether they can or can’t index the page.

You can use SEO tools to uncover pages with these tags automatically, or you can check manually by revisiting the URL inspection tool, looking at the “Indexing allowed?” result and seeing if it returns an error such as “No: ‘noindex’ detected in ‘robots’ meta tag” (or, where the directive was sent as an HTTP header, “No: ‘noindex’ detected in ‘X‑Robots-Tag’ HTTP header”).

Once again, simply remove these lines of code from the page’s <head> section to fix the issue, or use an SEO plugin such as Yoast to implement the change site-wide.
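For site owners comfortable with a little scripting, both signals can be checked in one pass: scan the page’s HTML for robots/googlebot meta tags and look at the X-Robots-Tag response header. The sketch below uses only Python’s standard library; `find_noindex` is a hypothetical helper name, and you would feed it the HTML and headers of a page you’ve fetched yourself.

```python
from html.parser import HTMLParser

class NoindexScanner(HTMLParser):
    """Collects robots/googlebot meta tags whose content includes 'noindex'."""
    def __init__(self):
        super().__init__()
        self.noindex_tags = []

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        name = (attrs.get("name") or "").lower()
        content = (attrs.get("content") or "").lower()
        if name in ("robots", "googlebot") and "noindex" in content:
            self.noindex_tags.append(name)

def find_noindex(html, headers=None):
    """Return the sources of any noindex directives found in the page's
    meta tags or its X-Robots-Tag HTTP header."""
    scanner = NoindexScanner()
    scanner.feed(html)
    found = [f"meta[{name}]" for name in scanner.noindex_tags]
    header_value = (headers or {}).get("X-Robots-Tag", "")
    if "noindex" in header_value.lower():
        found.append("X-Robots-Tag header")
    return found

sample = '<html><head><meta name="robots" content="noindex"></head></html>'
print(find_noindex(sample, {"X-Robots-Tag": "noindex"}))
```

An empty result means neither blocking mechanism is present on that page; anything else tells you exactly which directive to remove.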

Ensure Your Website Can Be Seen By Checking its Indexation Status 

After investing so much into your website, the last thing you want is for it to remain hidden from the eyes of your potential customers. Having an unindexed site is akin to not having a site at all. Such is the influence of Google on search results these days. 

If you’re worried that critical pages of your site may be missing from the index, or that your entire site is yet to be indexed, then make sure to contact us at McGinn & Dolphin.

We can conduct a thorough website audit and deliver a detailed report regarding your current website status. We can then immediately remediate any issues. Whether it’s problems with your robots.txt file, incorrect use of meta tags, or something else entirely, we can solve it for you. 

So why not book a free, no-obligation discovery call today to discuss the issues you are experiencing with your website?


Ali Dolphin is a UK-based marketing expert specialising in digital marketing. His expertise includes content writing, website design, and technology. Ali regularly provides insights and blogs on various aspects of digital marketing.
