Google Indexing is vital because if you are not indexed, you can’t be found. It doesn’t matter how good your optimization efforts are if Google isn’t even looking at your content!
But how do we persuade Google to index our site? And what do we do when part of our site is indexed, but not everything?
Google Dropped My Site
This is a screenshot of chrisg.com after I started repairing the damage. Yeah, not pretty!
Unless you regularly check Google Search Console, however, you might not even be aware of what your index coverage looks like, because a partial problem means you still get found. You still show up in Google searches when you do a site: or brand search; you are just not getting the full visibility you could be getting.
Why Google Drops You From Their Index
Google can stop or slow indexing for multiple reasons, but the main ones are:
- Hosting problems – If your server or website has speed or downtime issues then Google basically gives up trying.
- Broken/missing sitemap – In one of the cases for a Blog SEO Tactics member, their sitemap was actually altered by a hacker to point to spam.
- Technical issues – This comes under “technical SEO” but don’t let the name put you off, a lot of it is general website maintenance such as broken links, bad code, errors in redirects and so on.
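The hacked-sitemap scenario above is easy to check for yourself. Here is a minimal Python sketch (the domain and URLs are made-up examples) that parses a sitemap and flags any entries pointing away from your own domain, which is exactly what a tampered sitemap would contain:

```python
from urllib.parse import urlparse
from xml.etree import ElementTree

# Standard sitemap XML namespace, per the sitemaps.org protocol.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def foreign_urls(sitemap_xml: str, my_domain: str) -> list[str]:
    """Return sitemap <loc> entries that do not point at my_domain."""
    root = ElementTree.fromstring(sitemap_xml)
    locs = [el.text.strip() for el in root.iter(f"{SITEMAP_NS}loc") if el.text]
    return [u for u in locs if urlparse(u).hostname != my_domain]

# Example sitemap with one legitimate entry and one spam entry.
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/post-1</loc></url>
  <url><loc>https://spam-site.test/casino</loc></url>
</urlset>"""

print(foreign_urls(sample, "example.com"))  # flags the spam entry
```

In a real audit you would fetch your live sitemap URL and run it through the same check; any result other than an empty list deserves investigation.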
In my case I had a mixture of all three issues, but also my site here is hugely bloated because of how old it is. I registered my domain in 1998!
Having extra cruft on your site means that Google might attempt to index it but run out of the resources allocated to your site (your “crawl budget”).
How Indexing Works
Indexing is older than search engines, but outside of database engineering and computer science, nobody cared much about it until the World Wide Web took off. You can think of an index as the card catalog in a library, or the index at the back of a textbook.
Back in the day “getting indexed” was essentially registering your site with a web directory and then hoping they accepted you, and put you in a prominent category.
For Google, indexing is the process by which the search engine bot collects, processes, and stores data for later use by users requesting data from its search engine front end.
Search engines find your content primarily through links, though submitting a sitemap is an important step too.
Here’s a breakdown of how Google works:
- Crawling: Google uses “crawlers” or “spiders” to discover publicly available webpages. Crawlers are software that go and look at web pages and follow links on those pages, much like you would if you were browsing content on the web, clicking on links.
- Indexing: After a fresh page is discovered, Google tries to understand what the page is about. This is the process that is called indexing. Google analyzes the content of the page, cataloguing images and other files embedded on the page, trying to understand the page’s content. It also processes “meta” information, such as Title tags and image ALT attributes.
- Ranking: When someone searches, Google tries to find the most relevant answers from within its index. Factors such as the user’s geographical location, language, device, and previous search behavior are all considered when selecting the best possible results, so a “search ranking” is not universal. It has also become more and more apparent that click data on search results is used as well, so appealing to the search user and delivering on your promises matters as much as getting ranked in the first place.
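To make the crawling step concrete, here is a toy Python sketch of the core thing a crawler does: read a page, extract its links, and resolve them against the page’s URL so they can be queued for the next visit. Real crawlers add politeness delays, robots.txt handling, and deduplication at massive scale; this just shows the loop on a canned snippet of HTML:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collect href values from <a> tags, resolved against a base URL."""
    def __init__(self, base_url: str):
        super().__init__()
        self.base_url = base_url
        self.links: list[str] = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Relative links become absolute, ready for the crawl queue.
                    self.links.append(urljoin(self.base_url, value))

def extract_links(html: str, base_url: str) -> list[str]:
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links

page = '<p>See <a href="/about">about</a> and <a href="https://other.test/">this</a>.</p>'
print(extract_links(page, "https://example.com/"))
# ['https://example.com/about', 'https://other.test/']
```

This is also why internal links matter so much: every link you place is a path the crawler can follow to discover more of your content.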
How to Help Google Index
Google regularly updates its algorithms and methods for indexing and ranking, so it’s a good idea to stay informed about the latest search engine optimization (SEO) practices.
To help Google find, index, and rank your site, here are some tips:
- Provide clear navigation: Help users quickly find what they’re looking for and help search engines understand what the page is about. Every single important page on your site should be reachable within as few clicks as possible, with zero “orphan pages”. One of the key ways I have been able to help people turn their traffic around is an internal linking spreadsheet report I generate using my Python scripts.
- Responsiveness: Pages that load reliably and quickly have a better chance of being indexed, and rank higher.
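The internal-linking report I mentioned ultimately boils down to a reachability check. As a simplified sketch (my actual report covers much more, and the page paths here are invented), given a map of each page to the pages it links to, any page that nothing else links to is an orphan:

```python
def find_orphans(link_graph: dict[str, set[str]], home: str = "/") -> set[str]:
    """Pages that no other page links to (the homepage is exempt)."""
    linked_to: set[str] = set()
    for source, targets in link_graph.items():
        linked_to |= targets - {source}  # a self-link doesn't count
    return set(link_graph) - linked_to - {home}

# Example site: one old landing page has no inbound links at all.
site = {
    "/": {"/blog", "/about"},
    "/blog": {"/blog/post-1"},
    "/blog/post-1": {"/"},
    "/about": set(),
    "/old-landing-page": set(),  # nothing links here
}
print(find_orphans(site))  # {'/old-landing-page'}
```

Pages surfaced this way are prime candidates for new internal links, since the crawler otherwise has no path to reach them.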
Optimize for Crawling
When the bot comes visiting, you want not only to get indexed but also to have your content seen as valuable rather than placed in the junk pile:
- Create rich content: This is not just about “quality content” but ensuring you clearly and expansively answer the questions that your topic demands. This helps search engines understand the value of your site and helps your pages rank for the right search queries.
- Use keywords smartly: Place keywords where they are most relevant and in a natural way. Also refer to your own pages using those keywords.
- Optimize title tags and meta descriptions: This helps with click-through rates because as mentioned earlier, search results pages track clicks to determine which results are attractive to users and satisfy the search intent.
Fixing Your Google Indexing
Google will tell you if they have issues indexing your site in a dedicated section of Google Search Console:
As you can see, I still have some work to do if I am going to fix my site.
While a valid sitemap will help you get discovered, as will generating more links pointing to your website, you need to work through any issues preventing the bots from going to the next step and analyzing your content.
Those last two line items caught my attention, however – it seems I have a bunch of pages still queued up, not yet indexed.
Submit Pages for Indexing
Google Search Console has a field where you can enter a URL to get that page’s status, and then optionally request that it be considered for the index. In my experience that is a tedious process, and Google’s response times (if they act at all) are extremely slow.
Instead I wrote my own software that bulk submits URLs using the Google API. This is something I can do for you too, though it does mean granting me temporary access to your Google Search Console, which people are understandably not always willing to do.
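If you would rather roll your own, the Google Indexing API accepts one URL per request, so “bulk” submission just means looping. Here is a rough Python sketch, assuming you have already obtained an OAuth access token for a service account verified in Search Console (my actual script handles authentication and quota limits differently, and note that Google officially scopes this API to job-posting and livestream pages):

```python
import json
import os
import urllib.request

# Official publish endpoint for the Google Indexing API (v3).
ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

def build_notification(url: str) -> dict:
    """Request body telling Google that a URL was added or updated."""
    return {"url": url, "type": "URL_UPDATED"}

def submit_urls(urls: list[str], access_token: str) -> None:
    """POST each URL to the Indexing API, one request per URL."""
    for url in urls:
        req = urllib.request.Request(
            ENDPOINT,
            data=json.dumps(build_notification(url)).encode(),
            headers={
                "Content-Type": "application/json",
                "Authorization": f"Bearer {access_token}",
            },
        )
        with urllib.request.urlopen(req) as resp:
            print(url, resp.status)

# Only hits the network if you export a token first (hypothetical env var).
token = os.environ.get("INDEXING_API_TOKEN")
if token:
    submit_urls(["https://example.com/new-post"], token)
```

The daily quota on this API is limited, which is one reason the dedicated tools spread submissions out over time.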
If you would rather use a third-party tool, URL Monitor is dedicated to the indexing cause (not an affiliate link, not associated with the service), uses the same API, and if all you are concerned about is indexing would be cheaper than going with me too.
So many people put the SEO topic to bed after building and launching their site, but traffic and audience growth are moving targets and require ongoing effort.
Indexing issues are one element of the whole “traffic decay” problem plaguing my site, and likely yours too.
Make sure you are using Search Console and that you at least keep on top of the issues reported there. While Google does send email notifications when it discovers problems, you need to regularly review where you are, where you have been, and anything new that is occurring if you are going to keep traffic not just coming in, but growing.