Creating XML sitemaps for blogs is not that difficult. Almost all content management systems have built-in support for generating site-specific XML sitemaps; all you have to do is tweak the settings in the backend of your CMS and the sitemap is generated dynamically. Content management systems such as WordPress, Joomla, Drupal and bbPress have plugins and scripts which let you create XML sitemaps in just a few mouse clicks. No coding or HTML skills are required, so this is ideal for novice webmasters.
But creating a complete XML sitemap for a static website is not so easy and requires some legwork. There are basically three problems with hand-coded custom sitemaps:
1. You have to manually update the sitemap.xml file whenever new pages are added to your website, blog or forum.
2. You have to remove redundant entries, duplicate pages and noindexed content to keep the sitemap.xml file accurate.
3. You have to remove pages that were once part of your site but no longer exist on your domain.
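To see why manual maintenance gets tedious, here is a minimal sketch in Python of what hand-building a sitemap.xml amounts to. The page list and dates below are made up for illustration; every entry in it has to be kept in sync with the live site by hand.

```python
from xml.sax.saxutils import escape

# Hypothetical list of live pages; every page added to or removed from
# the site must be mirrored here manually.
pages = [
    ("https://example.com/", "2024-01-15"),
    ("https://example.com/about.html", "2023-11-02"),
    ("https://example.com/contact.html", "2023-11-02"),
]

def build_sitemap(pages):
    """Return a minimal sitemap.xml string for (url, lastmod) pairs."""
    entries = "\n".join(
        "  <url>\n"
        f"    <loc>{escape(url)}</loc>\n"
        f"    <lastmod>{lastmod}</lastmod>\n"
        "  </url>"
        for url, lastmod in pages
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )

print(build_sitemap(pages))
```

The `<urlset>` namespace and the `<loc>`/`<lastmod>` tags follow the standard sitemap protocol; a crawler automates exactly this list-keeping step.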
Managing the XML sitemap of a static website can get really messy once the site grows beyond 100 pages. You can always hand-code it on your own, but I would recommend using a sitemap crawler, because it will regenerate the XML sitemap for your entire domain whenever you want.
If your website has fewer than 500 pages, you can create the sitemap at XML-Sitemaps.com. Go to the website, type in the address of your domain’s home page and choose the following settings:
1. Change frequency: Daily (blogs), weekly (corporate websites)
2. Last modification: Use server’s response
3. Priority: None
Click “Start” and you are done. The application will automatically scan your entire website, list all the available URLs in the sitemap.xml file and let you download it to your computer.
Once you have downloaded the sitemap.xml file, use any FTP program (e.g. FileZilla) to upload it to the root directory of your domain (example.com/sitemap.xml).
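If you prefer scripting the upload instead of using a GUI client, the same step can be done with Python's standard ftplib. The host name and credentials below are placeholders; substitute your own hosting details.

```python
from ftplib import FTP

def upload_sitemap(host, user, password, local_path="sitemap.xml"):
    """Upload sitemap.xml to the root directory of the domain via FTP."""
    with FTP(host) as ftp:
        ftp.login(user=user, passwd=password)
        with open(local_path, "rb") as f:
            # STOR places the file in the login directory, which on most
            # shared hosts maps to the web root.
            ftp.storbinary("STOR sitemap.xml", f)

# Usage (hypothetical credentials):
# upload_sitemap("ftp.example.com", "username", "password")
```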
On the other hand, if your website has thousands of pages, you will need a dedicated sitemap crawler application which can scan all the available URLs of your website and create the sitemap.xml file. Hand-coding all the URLs on your own would take weeks, so a desktop crawler is a much better option.
GSiteCrawler is an excellent sitemap generator for Windows users, well suited to large websites that run without a content management system. The application was developed by John Mueller (JohnMu), a Googler on the Webmaster Tools team, and it is without doubt one of the best programs for creating a Google-compliant XML sitemap for any website, blog or forum.
Using the tool is simple: create a new project, enter the domain address and choose the following settings:
1. URL(s): Case sensitive
2. Remove HTML comments before parsing pages: Yes
3. File extensions: asp,aspx,cfm,cgi,do,htm,html,jsp,mv,mvc,php,php5,phtml,pl,py,shtml
4. Include priority.
5. Include frequency.
6. Include last modification date and timestamp.
Additionally, you can create a gzipped version of the sitemap and generate vital statistics for all the pages of your website. This includes filtering URLs in a specific directory, sorting URLs that have changed since last month, listing subdomains and exporting all URLs as a text file.
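If your tool doesn't produce the gzipped variant for you, making one yourself is a few lines with Python's standard gzip module; the file name and XML content below are illustrative. Search engines accept sitemap.xml.gz directly, which saves bandwidth on large sitemaps.

```python
import gzip

# Stand-in sitemap content; in practice, read your generated sitemap.xml.
sitemap_xml = b'<?xml version="1.0" encoding="UTF-8"?><urlset></urlset>'

# Write the compressed sitemap next to the plain one.
with gzip.open("sitemap.xml.gz", "wb") as f:
    f.write(sitemap_xml)

# Verify the round trip before uploading.
with gzip.open("sitemap.xml.gz", "rb") as f:
    assert f.read() == sitemap_xml
```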
The biggest advantage of GSiteCrawler is its “Recrawl” feature, which resumes crawling your website from the point where it left off the last time you ran the project. This helps generate the most recent version of your website’s sitemap in seconds, without starting from scratch.
About Author
This guest post is written by Lior Levin, a marketing consultant for a Tel Aviv University program in political communication, who also works for a center that provides new cancer treatments.