offlinewebtools.com
Robots.txt Generator
Generate robots.txt files to control search engine crawlers
Runs entirely in your browser; nothing is sent to our servers
Select which bots these rules apply to (* = all bots)
Common patterns: / (entire site), /admin/ (admin folder), /*.pdf (PDF files; * wildcards are supported by major crawlers)
Crawl-delay: time to wait between requests, in seconds (supported by Bing and Yandex; ignored by Googlebot)
User-agent: *
Disallow: /admin/
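Putting the directives together, a complete file might look like the sketch below; the domain and paths are placeholders, not recommendations:

```
User-agent: *
Allow: /admin/public/
Disallow: /admin/
Disallow: /*.pdf
Crawl-delay: 10

Sitemap: https://www.example.com/sitemap.xml
```

Blank lines separate groups of rules; the Sitemap line stands on its own and may appear anywhere in the file.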
Tips:
- Place robots.txt in your website's root directory
- Use Disallow: / to block all crawlers from your entire site
- Use Allow to override more restrictive Disallow rules
- Test your robots.txt with Google Search Console
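Besides Google Search Console, you can sanity-check rules locally with Python's standard-library parser; this is a minimal sketch using hypothetical example rules and URLs:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content to verify before deploying.
rules = """\
User-agent: *
Allow: /admin/public/
Disallow: /admin/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Check whether a given user agent may fetch each URL.
print(rp.can_fetch("*", "https://example.com/admin/secret.html"))       # False
print(rp.can_fetch("*", "https://example.com/admin/public/page.html"))  # True
print(rp.can_fetch("*", "https://example.com/blog/post"))               # True
```

One caveat: Python's parser applies rules in file order (first match wins), while Google picks the longest matching rule, so listing Allow before the broader Disallow keeps both interpretations in agreement here.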