
Sitemaps: no point unless they’re accurate

There’s an argument that a sitemap isn’t necessary if the site is well designed, and it’s true that search engines don’t penalise us for not having one. However, a sitemap can improve search engine coverage if the site isn’t easy for crawlers to navigate. Most importantly, if a sitemap does exist, it needs to be up to date and accurate.

Most sites should have their sitemap at [domain name]/sitemap.xml – which is where most search engines will look for it. Google Search Console and Bing Webmaster Tools allow us to specify the location, which is useful if we want to put it somewhere non-standard, but I wouldn’t recommend that. Some sitemap creation tools produce a whole list of sitemap files; these should all be listed in a sitemap index served as the sitemap.xml file.
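For illustration, a minimal sitemap index at /sitemap.xml might look like the sketch below – the child sitemap filenames, domain and dates are made up for the example:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each <sitemap> entry points to one child sitemap file -->
  <sitemap>
    <loc>https://www.example.com/sitemap-pages.xml</loc>
    <lastmod>2024-01-15</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-posts.xml</loc>
    <lastmod>2024-02-03</lastmod>
  </sitemap>
</sitemapindex>
```

Search engines fetch the index first, then each child sitemap it lists, so the index only ever needs to name the files rather than every URL on the site.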

Any decent content management system (CMS) should maintain the sitemap in real time. However, I have seen sites where someone created a ‘manual’ sitemap a long time ago, and it’s not been updated since.
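As a quick sanity check on whether a sitemap has been kept current, you can parse it and flag URLs whose lastmod dates look stale. Here’s a rough sketch using only Python’s standard library – the example sitemap content and the 180-day threshold are assumptions, not anything from a real site:

```python
import xml.etree.ElementTree as ET
from datetime import date, timedelta

# Namespace used by the sitemaps.org protocol
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def stale_urls(sitemap_xml: str, max_age_days: int = 180, today: date = None) -> list:
    """Return URLs whose <lastmod> date is older than max_age_days."""
    today = today or date.today()
    cutoff = today - timedelta(days=max_age_days)
    root = ET.fromstring(sitemap_xml)
    stale = []
    for url in root.findall("sm:url", NS):
        loc = url.findtext("sm:loc", namespaces=NS)
        lastmod = url.findtext("sm:lastmod", namespaces=NS)
        # A missing <lastmod> is treated as unknown rather than stale
        if lastmod and date.fromisoformat(lastmod[:10]) < cutoff:
            stale.append(loc)
    return stale

# Hypothetical sitemap with one fresh and one long-neglected page
example = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/</loc><lastmod>2024-05-01</lastmod></url>
  <url><loc>https://www.example.com/old-page</loc><lastmod>2019-01-01</lastmod></url>
</urlset>"""

print(stale_urls(example, today=date(2024, 6, 1)))
# → ['https://www.example.com/old-page']
```

An old lastmod isn’t proof of a problem – some pages genuinely never change – but a sitemap where nothing has been modified in years is a strong hint it was hand-built once and forgotten.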

Keeping all site content under the CMS’s control is therefore very important if it’s to appear in the sitemap. People do upload files to their sites outside the CMS, which is fine, but remember that those files won’t then appear in the sitemap.

That won’t matter if the content is clearly linked from the site – no search engine crawler treats a sitemap as the definitive list of what’s on a site – but crawlers do need real links. I once had someone tell me they’d manually uploaded a white paper to a documents folder on their site and promoted it via an email to customers, then wondered why search engines hadn’t picked it up. You can probably work out why.