SEO Tools
robots.txt & sitemap.xml
Generate an SEO-optimized robots.txt and sitemap.xml in a few clicks. Configure, preview, and download.
robots.txt
sitemap.xml
Direct download
Configuration
Select multiple to set rules per bot
One path per line
robots.txt
Click "Generate" to see the output…
Best practices
- ✓ Place robots.txt at the site root: example.com/robots.txt
- ✓ Include the absolute URL of your sitemap
- ⚠ robots.txt is not a security mechanism; malicious bots simply ignore it
- ⚠ Blocking GPTBot and CCBot asks AI crawlers not to collect your content for training (compliant bots honor the request)
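Putting these practices together, a minimal robots.txt might look like the sketch below (the domain and blocked paths are placeholders):

```
User-agent: *
Disallow: /admin/
Disallow: /search

Sitemap: https://example.com/sitemap.xml
```

The Sitemap directive must use the absolute URL, since robots.txt can reference sitemaps hosted on other paths or subdomains.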
Configuration
Path
Freq.
Priority
Leave empty for today's date
sitemap.xml
Click "Generate" to see the output…
Best practices
- ✓Submit your sitemap in Google Search Console
- ✓ Reference it in robots.txt via a Sitemap: directive
- ✓ Max 50,000 URLs or 50 MB per sitemap file
- ⚠Only include canonical pages — not duplicates or redirects
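For reference, a minimal sitemap.xml combining the fields above looks like this (URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/blog/post</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Only `<loc>` is required; the other three fields are optional hints.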
robots.txt vs sitemap.xml
robots.txt
Tells crawlers which pages to crawl or ignore. Useful for blocking admin areas, duplicate pages, or unnecessary resources.
sitemap.xml
Lists all important pages of your site to help search engines discover them. Includes URL, modification date, and priority.
SEO impact
A well-structured sitemap speeds up indexing of new pages. robots.txt prevents crawl budget waste on pages with no SEO value.
Block AI scrapers
Block GPTBot (OpenAI), CCBot (Common Crawl), anthropic-ai (Anthropic) and Google-Extended to ask those crawlers not to use your content for AI training. As with all robots.txt rules, compliance is voluntary.
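A sketch of the corresponding robots.txt rules, using the user-agent tokens published by each vendor:

```
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: anthropic-ai
Disallow: /

User-agent: Google-Extended
Disallow: /
```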
FAQ
Can a robots.txt mistake hurt my SEO?
Yes, if you accidentally block important pages. A Disallow: / directive blocks your entire site. Always verify with Google Search Console after changes.
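One way to catch such mistakes before deploying is to test paths locally. This sketch uses Python's standard-library urllib.robotparser against hypothetical rules:

```python
from urllib import robotparser

# Hypothetical draft rules: block /admin/ for all bots.
rules = """User-agent: *
Disallow: /admin/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Check which URLs a rule-abiding crawler may fetch.
print(rp.can_fetch("*", "https://example.com/admin/login"))  # False
print(rp.can_fetch("*", "https://example.com/blog/post"))    # True
```

Running the same check with `Disallow: /` shows every path blocked, which is how an accidental full-site block is caught.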
What do priority and changefreq mean?
priority (0.0 to 1.0) indicates the relative importance of a page within your site. changefreq is the estimated change frequency. Search engines treat both as hints at best; Google has stated it ignores them and relies on lastmod instead.
Should my sitemap list every page?
No. Only include canonical pages with valuable content. Exclude pagination pages, tag pages, duplicate content, and pages blocked by robots.txt.