What are sitemaps.xml and robots.txt?

Sitemaps

A sitemap is a file, defined by the Sitemaps protocol, that lets a webmaster inform search engines of a website's available URLs so they can be crawled and indexed automatically. It is literally a map of your website that can be read by indexing robots (such as Googlebot, Bingbot, etc.).

 

Written in XML or plain-text format, a sitemap lists a website's URLs along with information such as each page's last modification date, its update frequency, and its importance relative to the other pages of the same site (the plain-text format only lists URLs; the extra metadata requires XML).
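
As an illustration, here is a minimal XML sitemap containing a single entry; the URL and metadata values are placeholders:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/page.html</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>

Only the <loc> element is required; <lastmod>, <changefreq> and <priority> are optional hints for the crawlers.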

 

Even though there are different sitemap formats, each sitemap file is limited to 50 MB (uncompressed) and must not contain more than 50,000 URLs. If your file is larger than 50 MB or lists more than 50,000 URLs, you will have to split your list into several sitemaps.
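
For that case the Sitemaps protocol provides a sitemap index file, which references the individual sitemaps and can be submitted to the search engines in their place. A minimal sketch, with placeholder file names:

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-1.xml</loc>
    <lastmod>2024-01-15</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-2.xml</loc>
  </sitemap>
</sitemapindex>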
