Robots.txt & Sitemap.xml Generators

Two clean tabs. Mobile-first controls. Copy or download in one click — 100% in your browser.

Robots.txt — Easy Mode

Pick a preset, then add simple Allow/Disallow rules. Manage user-agents with a dropdown.
URL: —
Preset
User-Agent
Add Rule
Rules for *

Preview

# Click “Generate robots.txt”

Tip: If no rules exist, we’ll emit a blank Disallow: (allow all).
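With an empty rule set, the generated file is simply:

```
User-agent: *
Disallow:
```

A blank Disallow value tells every crawler that nothing is blocked.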

Everything runs locally in your browser. We don’t upload, store, or see your files.

Clean Robots.txt

Build precise rules per User-agent, with Allow/Disallow, wildcards, and crawl-delay.
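As an illustration, a rule set like the following could be built (the paths are placeholders):

```
User-agent: Googlebot
Allow: /blog/
Disallow: /private/

User-agent: *
Disallow: /*.pdf$
Crawl-delay: 10
```

The * and $ wildcards are recognized by major crawlers such as Googlebot and Bingbot; note that Crawl-delay is honored by some bots (e.g. Bingbot) but ignored by Google.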

Full Sitemap Suite

Create standard sitemap.xml, sitemap index, plus image, video, and news sitemaps.
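A minimal standard sitemap, using example.com as a placeholder domain, looks like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
  </url>
  <url>
    <loc>https://example.com/about</loc>
  </url>
</urlset>
```

Image, video, and news sitemaps extend this same structure with their own namespaces.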

Validate & Test

Lint XML, check the spec's size limits (50,000 URLs and 50 MB uncompressed per sitemap file), and preview how different user-agents read your robots.txt.

Smart URL Handling

Auto-add lastmod, changefreq, priority, canonical HTTPS, and xhtml:link hreflang.
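A single URL entry with all of these options filled in (URLs are placeholders) looks like:

```xml
<!-- xmlns:xhtml="http://www.w3.org/1999/xhtml" must be declared on the urlset element -->
<url>
  <loc>https://example.com/page</loc>
  <lastmod>2025-01-15</lastmod>
  <changefreq>weekly</changefreq>
  <priority>0.8</priority>
  <xhtml:link rel="alternate" hreflang="de" href="https://example.com/de/page"/>
</url>
```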

Robots ↔ Sitemap Link

Insert Sitemap: directives into robots.txt automatically; build multiple entries for big sites.
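For a larger site, the generated robots.txt might end like this (URLs are placeholders):

```
User-agent: *
Disallow:

Sitemap: https://example.com/sitemap.xml
Sitemap: https://example.com/sitemap-images.xml
Sitemap: https://example.com/sitemap-news.xml
```

Sitemap: lines are independent of any User-agent group and may appear anywhere in the file.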

Private & Local

Everything runs in your browser — generate, review, and download with zero uploads.

How to Generate Robots.txt & Sitemap.xml (Quick Steps)

Build both files in minutes — optimized for SEO, safe for crawlers, and fully under your control.

  1. Click Add Rule to insert User-agent and Allow/Disallow paths for your robots.txt file.
  2. Use Presets (Blog, Store, Staging, etc.) if you want a ready-made starting point you can adjust.
  3. Scroll to the Sitemap Generator and paste or type your site’s URLs (pages, posts, or custom links).
  4. Optionally set lastmod, changefreq, and priority for each entry to help search engines understand updates.
  5. Click Generate to instantly produce both robots.txt and sitemap.xml previews.
  6. Use Copy or Download to save your files locally. Upload them to your site’s root directory when ready.
  7. All processing happens locally in your browser — nothing is uploaded or stored on our servers.

Robots.txt & Sitemap.xml — Frequently Asked Questions

Answers about SEO rules, crawl control, sitemap setup, privacy, and best practices.