Checking the number of webpages of this blog that are indexed by Google is one of my daily activities. It’s like I’m already addicted to checking the results of a “site:” search on Google. And I am really happy to learn that more or less 11,000 webpages of this blog, “I Make Money by Blogging”, are already indexed by Google.
You know, I am always working hard so that Google will index every webpage of this blog. And having more than ten thousand webpages already indexed by Google is quite a success.
But why am I doing this? Why am I trying to get Google to index every webpage of this blog?
I’ve been thinking about it these past few days, and I realized that the more webpages of this blog Google indexes, the better this blog’s chances of topping Google’s SERPs.
Just imagine if your blog has 100 posts but only 10 of them are indexed by Google. That means only those 10 posts are capable of bringing traffic to your blog from the Google SERPs, and the other 90 posts are useless.
Now, what if all 100 posts of your blog are indexed? If 10 indexed posts bring 1,000 visitors to your blog from the Google SERPs, then at the same rate, 100 indexed posts could bring around 10,000 visitors.
Getting a high ranking in Google’s SERPs is really a competition between the webpages of all the indexed blogs and websites. And I understand that the better-optimized webpages are the ones that will be ranked at the top of the SERPs.
To let you know, I targeted a lot of keywords with large search volumes. At first I successfully reached the top position of Google’s SERPs, but as time went by the competition for the top spots got tougher, and my blog posts failed to stay there.
I haven’t lost hope, though. I keep optimizing those important posts, hoping that one day they will rank high in the SERPs again.
Webpages are deindexed by Google
But here’s the problem: while I am doing the SEO, other webpages of this blog are being dropped by Google from its index.
There was a time when Google reported around 12,000 webpages of this blog in its index. This June 2009, however, I noticed that only more or less 8,000 pages of this blog remained in the Google index.
SEO experts suggest these possible causes for Google deindexing webpages:
- Duplicate content within the blog. Google found several pages and indexed them even though, in reality, they were just duplicates of each other. That’s why the number of webpages in the index was high at first. But once Google realized they were duplicate content, it immediately dropped those pages from the index.
- Google penalty. SEO should be done in moderation, because over-optimizing a webpage will only lead to it being deindexed, or even to the entire website or blog being deindexed.
- Google considers the pages untrusted. There are lots of websites and blogs out there that contain malicious code, which is automatically downloaded and installed on your computer every time you visit one of them. Other websites and blogs are used by blackhat SEOs as link farms for the sites they are optimizing.
Now, if one of your blog’s webpages looks similar to those sites, there is a possibility that Google will deindex that webpage, or even your entire website or blog.
- Server problems. Some experts say that when Google tries to access certain webpages of your blog but fails because of server problems, those webpages are marked “unreachable” and dropped from the index.
This is what I’ve actually noticed too: in Google Webmaster Tools, the webpages of my blog listed in the “Unreachable” section are also missing from the index.
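On the duplicate content point above: on WordPress-style blogs, a lot of those duplicates come from tag, category, and author archives that repeat the same posts under different URLs. As a rough sketch (the paths below are common WordPress examples, not taken from this blog; adjust them to your own URL structure), a robots.txt like this keeps crawlers away from the archive pages that duplicate your posts:

```
User-agent: *
# Archive pages that repeat post content under different URLs
Disallow: /tag/
Disallow: /author/
# Comment-reply URLs that duplicate the post page
Disallow: /*?replytocom=
```

This doesn’t remove pages already in the index, but it stops Google from picking up fresh duplicates going forward.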
Now, our problem is this: how can we avoid Google’s deindexing process? And how can we get Google to index all the webpages of our blog or website?
Here’s what we need to do:
- Install a sitemap on your site and make use of Google Webmaster Tools. Errors in indexing your blog’s webpages are reported as soon as Google finds them, which helps a lot in fixing the problems.
- Make your content unique, even if the topic you’re talking about on every webpage of your blog is a common one. The title and meta tags of every webpage should be unique too, so that Google won’t treat your pages as duplicate content.
- Avoid being penalized by Google. Keep your linking strategy natural and don’t over-optimize your blog. You’ll never succeed in blogging and making money online once a search engine has penalized your blog, except of course if you can duplicate the ability of John Chow.
- Get more quality backlinks and avoid getting links from link farms. Backlinks from relevant webpages and high-PR domains are the best thing for a blog, but links from a link farm will only harm it.
- Find a better web host for your blog. This is to make sure that every time Google visits your blog, it can access every webpage, which reduces the chance of any of your pages ending up in the “Unreachable” list.
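Two of the steps above can be combined into a simple self-check: read the URLs out of your own sitemap and make sure each one answers with HTTP 200, so you catch “unreachable” pages before Googlebot does. Here’s a rough sketch in Python (not Google’s own tooling; the example.com URLs are placeholders, so point it at your real sitemap):

```python
# Sketch: parse a sitemap and check each listed URL is reachable,
# so broken pages can be fixed before Google marks them "Unreachable".
import urllib.request
import xml.etree.ElementTree as ET

# Namespace used by the standard sitemap protocol (sitemaps.org)
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def parse_sitemap(xml_text):
    """Return the list of <loc> URLs found in sitemap XML."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

def check_url(url, timeout=10):
    """Return the HTTP status code, or None if the server can't be reached."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except Exception:
        return None

if __name__ == "__main__":
    # Demo on an inline sample; in practice, download your real sitemap
    # (e.g. your-blog.com/sitemap.xml) and feed its XML in here.
    sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/post-1/</loc></url>
  <url><loc>http://example.com/post-2/</loc></url>
</urlset>"""
    for url in parse_sitemap(sample):
        print(url)  # for a live check, also call check_url(url)
```

Run it against your own sitemap every so often; any URL that doesn’t come back with status 200 is one to fix (or to take up with your web host) before Google drops it from the index.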
I think these five steps can help us a lot in getting Google to index 100% of our blog’s webpages. If that happens, then I’m pretty sure our blog’s webpages will have a better chance of ranking well in the Google SERPs for various searches.
And just like what I am always saying… the more webpages of your blog that rank well in the Google SERPs, the more visitors your blog will get… and the more income you’ll gain from your blog.