Robots.txt Generator - Control Search Engine Crawlers
Create robots.txt files with a visual builder. Presets for WordPress, Shopify, and Next.js. Configure crawl rules, sitemaps, and crawl-delay directives for SEO.
Syntax Reference
| Directive | Description |
|---|---|
| `User-agent: *` | Applies to all crawlers |
| `Disallow: /path/` | Block this path |
| `Allow: /path/` | Explicitly allow (overrides Disallow) |
| `Sitemap: URL` | Location of the XML sitemap |
| `Crawl-delay: N` | Wait N seconds between requests |
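Putting the directives together, a WordPress-style file (the domain and paths here are illustrative placeholders) might look like:

```text
# Applies to every crawler
User-agent: *
Disallow: /wp-admin/
# Allow overrides the broader Disallow above
Allow: /wp-admin/admin-ajax.php
Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml
```

Note that `Allow` lets you re-open a single path inside a blocked directory, and `Sitemap` takes a full absolute URL rather than a relative path.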
Wildcard Patterns
- `*` matches any sequence of characters
- `$` matches the end of the URL

Examples:
- `/*.pdf$` - blocks all PDF files
- `/private/*` - blocks /private/ and its subdirectories
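The wildcard rules above can be sketched as a translation to regular expressions. This is a minimal illustration of the Google-style matching semantics, not a full robots.txt parser; the function names are our own.

```python
import re

def pattern_to_regex(pattern: str) -> re.Pattern:
    """Translate a robots.txt path pattern into a compiled regex.

    '*' matches any sequence of characters; a trailing '$'
    anchors the pattern at the end of the URL path.
    """
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Escape regex metacharacters, then restore '*' as '.*'
    body = re.escape(pattern).replace(r"\*", ".*")
    return re.compile("^" + body + ("$" if anchored else ""))

def matches(pattern: str, path: str) -> bool:
    """Return True if the robots.txt pattern matches the URL path."""
    return pattern_to_regex(pattern).match(path) is not None
```

For example, `matches("/*.pdf$", "/files/report.pdf")` is true, but the same pattern rejects `/files/report.pdf?download=1` because `$` anchors the match at the end of the URL.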