Free SEO Tool
Robots.txt Generator
Create a properly formatted robots.txt file to control how search engines crawl and access your website content.
How to Use
1. Set the user-agent (default `*` for all bots)
2. Add allowed and disallowed URL paths
3. Set the sitemap URL and crawl delay if needed
4. Copy the generated robots.txt content
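A file produced by the steps above might look like this (the paths and sitemap URL are placeholders, not required values):

```
User-agent: *
Allow: /
Disallow: /admin/
Disallow: /private/
Crawl-delay: 10
Sitemap: https://example.com/sitemap.xml
```

Each `User-agent` group applies to the named crawler; directives are read top to bottom within a group.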
Common Disallow Paths
- /admin/
- /wp-admin/
- /private/
- /cgi-bin/
- /tmp/
- /*.pdf$
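The last entry above uses pattern syntax: `*` matches any sequence of characters and `$` anchors the match to the end of the URL. A sketch of how these combine (comments start with `#`):

```
User-agent: *
Disallow: /*.pdf$    # any URL ending in .pdf
Disallow: /private/  # everything under /private/
```

Major crawlers such as Googlebot and Bingbot support these wildcards, but not every crawler does, so test rules against the bots that matter to you.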
Robots.txt Explained
The robots.txt file is a standard (formalized as RFC 9309, the Robots Exclusion Protocol) that websites use to tell web crawlers which areas of the site should not be crawled or indexed. Place the file at your domain root (e.g., example.com/robots.txt). Note that it is advisory: well-behaved crawlers honor it, but it is not an access-control mechanism, so do not rely on it to keep sensitive content private.
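Before publishing a generated file, you can check how a compliant crawler would interpret it. A minimal sketch using Python's standard-library `urllib.robotparser`, parsing an example rule set locally instead of fetching it from a server (the rules and URLs are illustrative):

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt body; directives within a User-agent group
# are matched top to bottom, first match wins.
rules = """\
User-agent: *
Disallow: /admin/
Allow: /
Crawl-delay: 10
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Check whether a given user-agent may fetch specific URLs.
print(parser.can_fetch("*", "https://example.com/admin/secret.html"))  # False
print(parser.can_fetch("*", "https://example.com/blog/post.html"))     # True
print(parser.crawl_delay("*"))                                         # 10
```

The same parser is what Python tools typically use when honoring robots.txt, so its answers are a reasonable sanity check for generated rules.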