Sitemap Generator

URLs processed locally — never sent to a server

Paste a list of URLs, get a valid sitemap.xml. No URL cap, no upload, runs entirely in your browser.

URLs

One per line. Each URL is validated as you type.

0 valid · 0 invalid · 0 duplicates

Default values

Applied to every URL unless overridden.

priority: A hint, not a directive; Google may ignore it. Use it to order pages by importance.

changefreq: A hint to crawlers about how often the page changes.

lastmod: W3C Datetime format (YYYY-MM-DD). Google reads this most reliably.

Runs entirely in your browser. Nothing is uploaded; your URL list stays private.

Sitemap.xml: What It Is And When You Need One

A sitemap is a structured list of every URL on your site you want search engines to know about. It's a hint, not a guarantee — Google still decides what to crawl and index — but a well-formed sitemap reliably accelerates discovery, especially for new pages and pages that aren't well-linked from elsewhere on your site. The format was standardised by the Sitemaps protocol in 2005 and is now respected by every major search engine.
The XML structure is simple: a single <urlset> root with <url> entries. Each entry has a required <loc> (the URL) and three optional fields — <lastmod>, <changefreq>, and <priority>. The optional fields are conventional but Google has publicly stated it largely ignores changefreq and priority. lastmod is the most-respected hint and arguably the only one worth setting carefully.
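For reference, a minimal single-entry sitemap in that format looks like the following (the domain, date, and hint values are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/about</loc>
    <lastmod>2024-05-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>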
Per the spec, a single sitemap file can contain up to 50,000 URLs and must not exceed 50 MB uncompressed. For larger sites, you split content into multiple sitemap files and reference them from a sitemap index — for example sitemap-blog.xml, sitemap-products.xml, and sitemap-pages.xml all listed in sitemap.xml at the root. The generator here flags when you cross the 50,000 limit so you can plan the split.
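A sitemap index uses the same XML conventions but wraps <sitemap> entries instead of <url> entries. A sketch using the file names from the example above (domain and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/sitemap-blog.xml</loc>
    <lastmod>2024-05-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap-products.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap-pages.xml</loc>
  </sitemap>
</sitemapindex>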
URLs in a sitemap must all share the same origin (protocol + host) as the sitemap itself. A sitemap at https://example.com/sitemap.xml cannot reference https://blog.example.com/post or http://example.com/legacy. The generator detects mixed origins in your input list and surfaces a warning before you ship something Google will silently reject.
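The same-origin rule is easy to check yourself with the browser's URL API. The sketch below only illustrates the idea; it is not the generator's actual source, and findCrossOriginUrls is a name made up for this example:

// Return the URLs whose origin (protocol + host) differs from the sitemap's.
function findCrossOriginUrls(sitemapUrl: string, urls: string[]): string[] {
  const expected = new URL(sitemapUrl).origin;
  return urls.filter((u) => new URL(u).origin !== expected);
}

// Using the examples from the paragraph above:
findCrossOriginUrls("https://example.com/sitemap.xml", [
  "https://example.com/page",
  "https://blog.example.com/post",
  "http://example.com/legacy",
]);
// → ["https://blog.example.com/post", "http://example.com/legacy"]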
Special characters in URLs need XML entity encoding inside <loc>. The generator applies the five-character escape (&, ', ", <, >) automatically — but if you're hand-writing sitemaps elsewhere, that's a common gotcha. URLs with query strings (utm_source=...) get encoded correctly here.
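If you do write sitemap XML by hand or in a build script, the escape is only a few lines. Here is one way to do it in TypeScript (a sketch, not the generator's implementation):

// Escape the five characters that must be encoded inside XML text such as <loc>.
function escapeXml(value: string): string {
  return value
    .replace(/&/g, "&amp;") // must run first so it doesn't double-escape the entities below
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;")
    .replace(/'/g, "&apos;");
}

escapeXml("https://example.com/shoes?utm_source=newsletter&size=9");
// → "https://example.com/shoes?utm_source=newsletter&amp;size=9"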
Sitemaps belong in your site root by convention (yourdomain.com/sitemap.xml) and should be referenced from robots.txt with a Sitemap: line. After deployment, submit the sitemap URL via Google Search Console and Bing Webmaster Tools. Most CMS platforms (WordPress, Hugo, Next.js) generate sitemaps automatically — this generator is for the cases that don't, like a hand-rolled site, a CMS that lacks a sitemap plugin, or a one-off curated list of important URLs.
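The robots.txt reference is a single line; a minimal example (yourdomain.com is a placeholder):

User-agent: *
Disallow:

Sitemap: https://yourdomain.com/sitemap.xml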
All URL processing happens in your browser. The site list, validation results, and final XML never leave your device. There's no upload step because there's nothing to upload — the generator is an XML string template plus a Blob download.
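A browser-side download of a generated string really is that small. A rough sketch of that step, where xml is the already-built <urlset> string:

// Trigger a local download of the XML; nothing is sent anywhere.
function downloadSitemap(xml: string): void {
  const blob = new Blob([xml], { type: "application/xml" });
  const url = URL.createObjectURL(blob);
  const link = document.createElement("a");
  link.href = url;
  link.download = "sitemap.xml";
  link.click();
  URL.revokeObjectURL(url);
}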

Common Use Cases

01

Hand-rolled static sites

Hugo, Eleventy, or pure HTML projects without a sitemap plugin can paste their URL list and get a valid sitemap.xml in seconds.

02

Migrating to a new CMS

Export the URL list from the old site, generate a sitemap, drop it on the new site so Google's index transition is faster.

03

Shopify / Wix sitemap supplement

Most platforms auto-generate one sitemap; for landing pages or campaign URLs the platform doesn't cover, generate a supplemental sitemap and reference it from your main sitemap index.

04

Audit existing sitemaps

Paste the URLs from your CMS-generated sitemap into the input to validate them; invalid entries are flagged.

Frequently Asked Questions

Do I need this if my CMS already generates a sitemap?
Most CMSes (WordPress, Shopify, Wix, Webflow) generate a sitemap automatically. Use this generator if (a) your platform doesn't, (b) you want to add curated URLs the platform missed, (c) you need a per-language or per-section split, or (d) you're hand-rolling a site.
What's the difference between a sitemap and a sitemap index?
A sitemap (urlset) lists actual URLs, capped at 50,000 entries. A sitemap index lists other sitemaps and is used when you exceed the per-sitemap limit. Large sites typically have one sitemap.xml at the root that's actually a sitemap index, pointing to several urlset files.
Do priority and changefreq affect rankings?
Almost certainly not. Google has stated publicly that priority and changefreq are largely ignored. They're sometimes useful for self-documentation (which pages do you consider important?) but don't expect ranking impact. Focus your effort on content quality, internal linking, and Core Web Vitals scores.
What format should lastmod use?
W3C Datetime; YYYY-MM-DD is the simplest correct form. Google specifically recommends this format. The generator outputs it that way; manually written sitemaps sometimes use other formats that get parsed inconsistently across crawlers.
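If you script your own dates, one quick way to get that format in the browser (UTC; a sketch):

// Today's date as W3C YYYY-MM-DD, in UTC.
const lastmod = new Date().toISOString().slice(0, 10);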
Can one sitemap cover multiple subdomains?
No. Per the protocol, all URLs in a sitemap must share the sitemap's origin (protocol + host). The generator detects cross-origin URLs in your input and warns you. To handle multiple subdomains, generate one sitemap per subdomain.
How do I tell Google about my sitemap?
Two ways. Add a Sitemap: https://yourdomain.com/sitemap.xml line to robots.txt, which Google reads on crawl, and/or submit the sitemap URL directly via Google Search Console (Property → Sitemaps). Both work; Search Console also gives you submission feedback (number indexed vs. total).
Is there a size limit?
Yes. Google enforces a per-sitemap cap of 50,000 URLs and 50 MB uncompressed. Over those limits, split your URLs into multiple urlset files and list them in a sitemap index. The generator flags when you cross 50,000 so you can plan the split.
Should I include every URL on my site?
Only the ones you want indexed. If a URL is noindex'd, gated by login, or you don't care about its rankings, skip it. Including noindex'd URLs wastes Google's crawl budget on pages it shouldn't index anyway.
Does the sitemap have to live at the site root?
Conventionally yes (yourdomain.com/sitemap.xml). Technically you can host it at any path and reference it from robots.txt or Search Console, since Google reads from wherever robots.txt declares, but tooling and conventions assume root placement.
Is my URL list uploaded anywhere?
No. URL parsing, validation, deduplication, and XML generation all happen in your browser. The download is a local Blob. The site list never leaves your device.
