Robots.txt Generator

Create robots.txt files for search engine crawlers with ease.


Generated robots.txt

User-agent: *
Disallow: /admin
Disallow: /private

Sitemap: https://example.com/sitemap.xml
About robots.txt: This file tells search engine crawlers which pages or files they may or may not request from your site. Place it at the root of your website (e.g. https://example.com/robots.txt). Use User-agent: * to apply rules to all crawlers, or target individual bots such as Googlebot. Note that robots.txt is advisory: well-behaved crawlers honor it, but it is not an access-control mechanism and does not prevent access to the listed paths.
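To verify how crawlers will interpret a generated file, Python's standard-library urllib.robotparser can parse the rules and answer allow/deny queries. The sketch below checks the sample rules from above against two hypothetical URLs on the placeholder domain example.com:

```python
from urllib import robotparser

# The generated robots.txt from above, as it would be served
# at https://example.com/robots.txt (placeholder domain).
rules = """\
User-agent: *
Disallow: /admin
Disallow: /private

Sitemap: https://example.com/sitemap.xml
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# Paths under /admin and /private are blocked for every crawler
# matched by "User-agent: *"; everything else is allowed.
print(parser.can_fetch("*", "https://example.com/admin/users"))  # False
print(parser.can_fetch("*", "https://example.com/blog/post"))    # True
```

The same can_fetch call works with a specific user-agent string (e.g. "Googlebot") when the file contains per-bot rule groups.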