Robots.txt Generator
Default generated output (allows all crawlers to access the entire site):
User-agent: *
Disallow:
What is a robots.txt file?
A robots.txt file tells search engine crawlers (like Googlebot) which pages or files they may request from your site. It is used mainly to manage crawler traffic and avoid overloading your server with requests; it is not a reliable way to keep a page out of search results.
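To see how a crawler interprets these rules, Python's standard library ships a parser for the Robots Exclusion Protocol. This sketch feeds it a small rule set directly (the example.com URLs are placeholders) and asks whether specific URLs may be fetched:

```python
from urllib import robotparser

# Parse a robots.txt rule set directly, without a network fetch.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /admin/",
])

# Any URL under /admin/ is disallowed for all crawlers...
print(rp.can_fetch("*", "https://example.com/admin/login"))  # False
# ...while everything else remains allowed.
print(rp.can_fetch("*", "https://example.com/blog/post"))    # True
```

In production you would call `rp.set_url("https://example.com/robots.txt")` followed by `rp.read()` to fetch the live file instead of parsing lines by hand.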
Common Rules:
- User-agent: * means the rule applies to all crawlers.
- Disallow: /admin/ prevents compliant crawlers from requesting anything under your /admin/ path. Note that blocking crawling does not guarantee a page stays out of the index; use a noindex directive for that.
- Sitemap: tells crawlers where to find your XML sitemap.
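Putting the rules above together, a typical robots.txt might look like this (the paths and sitemap URL are placeholders, not recommendations):

```txt
User-agent: *
Disallow: /admin/

Sitemap: https://example.com/sitemap.xml
```

The file must live at the root of the host it applies to, e.g. https://example.com/robots.txt.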