Using Google Webmaster Tools for Page Ranking

Google holds roughly 80% of the web search market, so when it offers a tool to help improve your rankings, it is worth trying out. That is precisely what Google Webmaster Tools provides. With it, you can add up to 1,000 sites, including news and mobile sites, to a single account.

When you create an account with Google Webmaster Tools, you will be asked to verify your site to prove that you own it before Google grants you access to all the information and tools. Verification does not affect the site’s page rank in any way. Moreover, if you blog on Blogger, your site can be added and verified automatically by enabling Webmaster Tools from the Blogger dashboard.

Now for the technical part. Once your account is active, add your site and verify it by placing the verification tag Google gives you in the HTML head section of your home page.
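As a rough sketch, the verification tag looks something like the snippet below (the content value here is a made-up placeholder; copy the exact tag Google generates for your account):

    <head>
      <!-- Google site verification tag; the content token is unique per site -->
      <meta name="google-site-verification" content="YOUR-TOKEN-HERE" />
      <title>My Blog</title>
    </head>

Once the tag is live on your home page, use the verify option in your account and Google will fetch the page to confirm it.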

The first thing to do after this is to go to ‘Site Configuration’ and then ‘Settings’, where you can select your ‘Preferred Domain.’ This ensures that Google will index only one version of your site (either the one with the www prefix or the one without it). The choice is purely a matter of preference and has no impact on ranking; just make sure you pick the version that actually serves your site, not one that redirects to the other version.
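To keep the non-preferred version from being indexed separately, many site owners also add a permanent (301) redirect from one host to the other. A minimal sketch for an Apache server with mod_rewrite enabled, assuming www.example.com is the preferred domain (the domain is hypothetical):

    RewriteEngine On
    # Send requests for the bare domain to the preferred www version
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    # R=301 marks the redirect as permanent, so crawlers update their index
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]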

At this point Google will not yet have gathered much information about your site, so there is little more to do. Log in again after a few weeks and explore the other features available to you.

One section that demands attention is ‘Crawl Errors’, under the ‘Diagnostics’ link. It identifies any problems Google is facing while crawling your site. It is a good early-warning indicator, so remedy issues as soon as they are found, before they affect your search rankings.

After exploring ‘Crawl Errors’, take some time to look at ‘Crawl Stats’ and ‘HTML Suggestions’ under the same tab. The stats are a breakdown of how frequently the Googlebot is visiting your site. Ideally the graph should be stable, if not growing; a decline means something is going wrong. The more often the bot visits your site, the stronger the signal of your site’s authority for ranking.

‘HTML Suggestions’ reports problems with your meta tags. A large number of duplicate title tags is a common problem, and resolving it means revising your blog structure so that each page carries a distinct title.
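For illustration, if every post in a blog emits the same generic title, the report flags them all as duplicates; giving each page its own title clears the warning. A hypothetical before and after (the titles are made up):

    <!-- Flagged as duplicates: every page carries the same title -->
    <title>My Blog</title>

    <!-- Fixed: each page describes its own content -->
    <title>My Blog - Using Webmaster Tools for Page Ranking</title>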

Not all errors are equally serious. The ‘Not Found’ error is a common one that appears when the site owner links to a wrong URL within the site. ‘Restricted by robots.txt’ errors are reported when your robots.txt file disallows crawling of parts of your site, which may well be intentional. ‘Timed out’ and ‘Unreachable’ are the ones to look out for: they mean the Googlebot is not reaching your pages the way it should, and if they come up frequently your site could end up de-indexed.
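To tell an intentional restriction from an accidental one, check your robots.txt file. A minimal sketch with hypothetical paths: blocking an admin area on purpose is fine, but a rule like the commented-out last line would block the entire site and trigger these errors on every page:

    # Applies to all crawlers, including the Googlebot
    User-agent: *
    # Intentional: keep the admin area out of the index
    Disallow: /admin/
    # Dangerous if uncommented: this single line would block the whole site
    # Disallow: /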

These are the key sections of Google Webmaster Tools related to your site’s performance on Google, so it is important to monitor them at least once a month. There are other interesting sections to explore and experiment with once you get the hang of it.
