Generate robots.txt rules for search engine crawlers.
Pick from Allow All, Block All, Standard, or Block AI Bots.
Configure user-agent, allow, and disallow paths.
Copy the generated robots.txt and upload to your site root.
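For reference, output from the Standard preset might look like the following (the disallowed paths and sitemap URL here are illustrative assumptions, not fixed output):

```
User-agent: *
Disallow: /admin/
Disallow: /private/

Sitemap: https://yourdomain.com/sitemap.xml
```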
Presets: Allow All, Block All, Standard, and Block AI Bots.
Add rules for different user agents.
Include a sitemap URL and a crawl-delay directive.
Always outputs valid robots.txt format.
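The assembly logic above (per-agent groups of Allow/Disallow paths, optional crawl delay, trailing sitemap line) can be sketched in Python. The function name, the group dictionary fields, and the example user agents are assumptions for illustration, not the tool's actual API:

```python
# Minimal sketch: build robots.txt text from a list of rule groups.
# The 'agent'/'allow'/'disallow'/'crawl_delay' field names are
# illustrative assumptions, not a real library interface.

def build_robots(groups, sitemap=None):
    """Each group is a dict with an 'agent' string and optional
    'allow', 'disallow' path lists and a 'crawl_delay' integer."""
    lines = []
    for g in groups:
        lines.append(f"User-agent: {g['agent']}")
        for path in g.get("allow", []):
            lines.append(f"Allow: {path}")
        for path in g.get("disallow", []):
            lines.append(f"Disallow: {path}")
        if g.get("crawl_delay") is not None:
            lines.append(f"Crawl-delay: {g['crawl_delay']}")
        lines.append("")  # blank line separates user-agent groups
    if sitemap:
        lines.append(f"Sitemap: {sitemap}")
    return "\n".join(lines).rstrip() + "\n"

# Example: block one AI crawler, allow everyone else with a crawl delay.
txt = build_robots(
    [
        {"agent": "GPTBot", "disallow": ["/"]},
        {"agent": "*", "allow": ["/"], "crawl_delay": 10},
    ],
    sitemap="https://yourdomain.com/sitemap.xml",
)
print(txt)
```

Keeping each user-agent group separated by a blank line and placing the Sitemap directive outside any group keeps the file valid for standard crawlers.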
Upload to your website root: https://yourdomain.com/robots.txt