
Robots.txt Generator - Control Search Engine Crawlers

Create robots.txt files with a visual builder. Includes presets for WordPress, Shopify, and Next.js, and lets you configure crawl rules, sitemaps, and crawl-delay for SEO.


Syntax Reference

Directive           Description
User-agent: *       Applies to all crawlers
Disallow: /path/    Blocks this path
Allow: /path/       Explicitly allows (overrides Disallow)
Sitemap: URL        Location of the XML sitemap
Crawl-delay: N      Wait N seconds between requests
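
For example, a minimal robots.txt combining the directives above (the domain and paths are placeholders):

```text
User-agent: *
Disallow: /admin/
Allow: /admin/help/
Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml
```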

Wildcard Patterns

* matches any sequence of characters

$ matches end of URL

Examples:

/*.pdf$ - blocks all URLs ending in .pdf
/private/* - blocks everything under /private/
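
To sanity-check prefix rules like these, you can use Python's standard-library parser; note that urllib.robotparser predates the wildcard extensions, so rules using * and $ are not matched by it (the URLs and paths below are illustrative):

```python
from urllib import robotparser

# Parse an in-memory robots.txt (normally you'd call rp.set_url() and rp.read()).
rp = robotparser.RobotFileParser()
rp.parse("""\
User-agent: *
Disallow: /private/
Crawl-delay: 5
""".splitlines())

print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("*", "https://example.com/blog/post"))          # True
print(rp.crawl_delay("*"))                                         # 5
```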


Privacy Notice: This site works entirely in your browser. We don't collect or store your data. Optional analytics help us improve the site; you can decline without affecting functionality.