
ABC’s of SEO – X = How To Create An XML Sitemap

What is an XML sitemap?

An XML sitemap allows search engines such as Google to understand what is on your site and crawl it accurately.

The sitemap tells search engines which URLs exist on a website and allows them to be crawled. This is an important step for your website, as sitemaps enable search engines to find pages on your site that might otherwise be missed when indexing.


Why are XML sitemaps useful?

There are several key pieces of information which your sitemap allows you to specify for each of your URLs (a short example follows this list):

  • The time they were last updated – this is useful as search engines can tell how fresh the content is.
  • How frequently each page changes – this tells search engines how often to come back to the content.
  • The importance of the page compared to other pages on the site.
  • Finally, they can provide links to areas of your site which are not otherwise linked (but that you want indexed).
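
To make this concrete, below is a minimal, hypothetical sitemap.xml containing a single URL and showing those fields – the address, date and values are placeholders to swap for your own:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.yourwebsite.com/</loc>
    <lastmod>2012-05-17</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>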

With this additional information, search engines can crawl your site in a more intelligent way. Sitemaps are also known as an inclusion protocol, whereas a robots.txt file is exclusionary.

The robots.txt file is viewed by search engines before anything else on your site (http://www.yourwebsite.com/robots.txt). It can be used to disallow pages on your site, meaning that search engines will not crawl them – the example below blocks all crawlers from the entire site.

User-agent: *

Disallow: /

This file should be placed in the top level of your domain, i.e. within the public_html folder (http://www.yourwebsite.com/robots.txt). It is a plain text file which can contain one or more records, so you will need a Disallow line for every path you wish to exclude search engines from crawling (example below).

User-agent: *

Disallow: /tmp/

Disallow: /test/

 

How do I create an XML sitemap?

This is a simple process which can be done via several websites, with Google recommending http://www.xml-sitemaps.com/.

There are only four fields to complete on that particular site: your URL, change frequency, last modification date and crawl priority. Once this is done, you’ll need to download the generated file onto your computer and then upload it to the root of your domain.
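
Once it is in place, the sitemap is conventionally reachable at an address like the one below (using the same placeholder domain as earlier) – this is the URL you will submit in the next step:

http://www.yourwebsite.com/sitemap.xml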

Uploading your XML sitemap

After you have uploaded your sitemap to the server, you will also have to submit its full URL to the search engines. Once you have submitted your sitemap to the main search engines, your site will be crawled more appropriately, enabling all of your pages to be found.

Google – you’ll have to use Webmaster Tools in order for the search engine to know the full URL of your sitemap.

Bing – here you will need a Windows Live account, and you’ll find the sitemap section under the Webmaster Centre area.

Yahoo – on this search engine you will have to sign in to Yahoo and insert the URL within the ‘Submit Site Feed’ field.
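
As an aside, Google has also offered a simple ‘ping’ address for notifying it of a sitemap: visiting a URL of the form below (with your own sitemap address substituted for the placeholder) asks Google to fetch it.

http://www.google.com/ping?sitemap=http://www.yourwebsite.com/sitemap.xml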

2 Comments
  • David Lockie on May 17, 2012

    Nice overview Simon.

    A few things to consider too:

    – a sitemap index (a sitemap sitemap) is necessary if you have more than 50K URLs, so it kinda makes sense to start out with that structure to allow for growth – it’s not a complicated system.

    – you should declare your xml sitemap (or index) in your robots file using the syntax:

    sitemap: http://www.yoursite.com/sitemap.xml

    – WordPress and other CMS will have plugins available to do all this for you – SEO for WordPress by Yoast includes this if you’re running WordPress.

    Hope that’s useful.

    David

  • Simon on May 17, 2012

    Thanks David for the info.

    We always encourage some debate around our blog posts so if anyone else has some points to raise – feel free!
