Create a robots.txt file for your website to control search engine crawlers. Free online robots.txt generator with allow and disallow rules.
A robots.txt file tells search engine crawlers which pages or files they can or can't request from your site; it is used mainly to avoid overloading your site with requests. Our robots.txt generator makes it easy to create a valid robots.txt file with User-agent directives, Allow and Disallow rules, and Sitemap references.
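For example, a generated file that allows crawling of the whole site but blocks two private directories and points to a sitemap could look like the sketch below (the /private/ and /admin/ paths and the sitemap URL are placeholders for your own):

    # Rules for all crawlers
    User-agent: *
    Allow: /
    # Block crawling of these directories
    Disallow: /private/
    Disallow: /admin/

    # Absolute URL of the XML sitemap
    Sitemap: https://example.com/sitemap.xml

The file must be served from the root of your host (for example https://example.com/robots.txt). Major crawlers apply the most specific matching rule, so the Disallow lines take precedence over Allow: / for those two directories.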