Why do you limit the number of URLs in a sitemap?
We get asked this quite a lot.
The main reason is that XmlSitemapGenerator is a free tool, but generating sitemaps is not a free process.
Our spider indexes thousands of pages a day, consuming significant server resources (memory, CPU, and bandwidth) and racking up gigabytes of data as it indexes pages and builds up profiles.
The overhead of the spidering process is quite large, especially at busy times when many people are generating sitemaps. To ease the pressure we use a queuing mechanism to avoid a logjam on the server, but that means people then have to wait.
Therefore, to control resource utilization and minimize wait times, we limit the number of URLs our spider will crawl for a given website.
Over time we have increased the number of URLs we accept from 50 to 100, and at the time of writing we support 250. We constantly review this limit and may raise or lower it in the future.
Our statistics show that 99% of users of our free XML sitemap generator never hit this limit, and many don't even come close, so for now we are happy that the tool is fit for purpose.
What if I have more URLs?
Generally, if your website comprises more than 100 pages, you are probably using some sort of content management system (CMS) or an ecommerce system. With such systems there are much better ways to create sitemaps faster and more effectively, directly from the database. Many systems support adapters and plugins to help you generate your sitemap.
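As a rough illustration of the database-driven approach, here is a minimal Python sketch that builds a sitemap XML document from a list of page URLs (in practice the list would come from your CMS or ecommerce database; the URLs below are placeholders):

```python
# Hypothetical sketch: build sitemap.xml directly from a list of page URLs
# (e.g. queried from a CMS database) instead of crawling the site.
from xml.etree import ElementTree as ET

# Namespace required by the sitemaps.org protocol.
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return a sitemap XML string for the given iterable of page URLs."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for url in urls:
        url_el = ET.SubElement(urlset, "url")
        loc = ET.SubElement(url_el, "loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")

# Placeholder usage; a real site would pull these rows from its database.
pages = ["https://example.com/", "https://example.com/about"]
print(build_sitemap(pages))
```

Because the URL list comes straight from the database, this approach scales to any number of pages and avoids the crawling overhead entirely.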