One of the areas we have not touched upon so far in this series is Sitemaps. When we talk about Sitemaps with reference to SEO, we are referring to Google’s Sitemap, as opposed to the regular static page found on many websites that shows the site’s navigation structure.
Google introduced its Sitemap concept in 2005 as one of its experiments. This is what Google had in mind when it first announced the concept: “Initially, we plan to use the URL information webmasters supply to further improve the coverage and freshness of our index.”
A Google Sitemap tells Google’s search bot how often it should visit your site and how frequently each page’s content is updated; in effect, the Sitemap schedules Googlebot’s visits. This helps websites whose content is updated regularly get the latest version indexed. Without a Sitemap, Google performs its regular crawling, which may leave a gap of several months between two successive visits. Meanwhile you might have made several updates to your site, and online users will miss those updates in the search results because Google’s cache will hold only the older version of your site’s material. You may then needlessly lose a percentage of the visitors who decide whether to click through to your website based on the meta description shown in the search results page.
Moreover, when search engine bots crawl your website, they may not find some of the pages for various reasons. Some pages may not be reachable through the site’s navigation menu and can be accessed only through the site’s search feature. This is a typical situation with ecommerce sites. In such cases, the product pages will never get indexed, and your website will not reach its maximum potential. On the other hand, when you supply Google with a Sitemap that lists all the pages on your website, including hidden product pages, Google’s search bot will be able to do a better job.
A Google Sitemap is an XML file that contains a number of fields, such as the list of URLs, the change frequency, the date of last modification, and so on. Googlebot uses all this information to pick up the latest content from your site by visiting it at the right frequency.
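As a minimal sketch, here is what such a Sitemap file looks like under the sitemaps.org protocol that Google uses (the URLs and dates below are placeholders, not real pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- the page's full URL -->
    <loc>https://www.example.com/</loc>
    <!-- date of last modification -->
    <lastmod>2005-06-01</lastmod>
    <!-- how often the content changes: always, hourly, daily,
         weekly, monthly, yearly, or never -->
    <changefreq>daily</changefreq>
  </url>
  <url>
    <loc>https://www.example.com/products/widget</loc>
    <lastmod>2005-05-20</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```

Only `<loc>` is required for each URL; the other elements are optional hints for the crawler.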
If your website’s content is generated dynamically, it is imperative that you keep an XML Sitemap with the latest list of URLs; without it, your website will be poorly indexed.
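For a dynamic site, the Sitemap itself is usually generated from the same data that drives the pages. The sketch below shows one way to do this in Python, assuming a hypothetical list of pages; a real site would pull its URLs from a database or CMS instead:

```python
# Sketch: build a Sitemap XML document from a list of pages.
# The page list here is hypothetical, for illustration only.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    """pages: iterable of (url, lastmod, changefreq) tuples."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for url, lastmod, changefreq in pages:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
        ET.SubElement(entry, "lastmod").text = lastmod
        ET.SubElement(entry, "changefreq").text = changefreq
    return ET.tostring(urlset, encoding="unicode")

pages = [
    ("https://www.example.com/", "2005-06-01", "daily"),
    ("https://www.example.com/products/widget", "2005-05-20", "weekly"),
]
print(build_sitemap(pages))
```

Regenerating and re-submitting this file whenever pages are added or removed keeps Google’s view of the site current.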
Another important piece of information that Googlebot gets from the Sitemap is the relative importance of your pages. Only you know which pages are most important and which are less so; no one else can judge this better. When you supply this information, Google makes a note of it for the next crawl, though it does not necessarily affect the ranking of the pages.
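This relative importance is expressed through the Sitemap protocol’s optional `<priority>` element, which takes values from 0.0 to 1.0 (the default is 0.5) and is relative only to other pages on your own site. A sketch, with placeholder URLs:

```xml
<url>
  <loc>https://www.example.com/flagship-product</loc>
  <priority>1.0</priority>
</url>
<url>
  <loc>https://www.example.com/archive/old-news</loc>
  <priority>0.2</priority>
</url>
```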