Robots.txt Generator

Create a complete robots.txt file using preset rules for common crawlers. Control access to specific directories and add custom rules with a user-friendly interface.

Default output (a single rule, no disallow directives):

User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
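
To restrict specific directories, Disallow lines are added under a User-agent group. A minimal sketch, assuming hypothetical /admin/ and /tmp/ paths:

# Block all crawlers from two private directories (paths are illustrative)
User-agent: *
Disallow: /admin/
Disallow: /tmp/

Sitemap: https://example.com/sitemap.xml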

Frequently Asked Questions

What is robots.txt used for?

robots.txt is a plain-text file served from the root of your site (for example, https://example.com/robots.txt) that tells search engine crawlers which pages or sections they may or may not request. Compliant crawlers such as Googlebot fetch it before crawling any other URL on your site.

Does robots.txt block indexing?

No. robots.txt only blocks crawling, not indexing: a blocked page can still be indexed if other pages link to it. To keep a page out of search results, use a noindex meta tag (or an X-Robots-Tag HTTP header) and leave the page crawlable so the directive can be read.
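
For example, placing this standard tag in a page's <head> asks search engines to keep the page out of their index:

<meta name="robots" content="noindex">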

How do I block AI crawlers like GPTBot?

Use our 'Block AI bots' preset to automatically add Disallow rules for GPTBot, ChatGPT-User, CCBot, and other known AI training crawlers.
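
The preset output looks roughly like this (a sketch; the exact bot list may differ):

# Disallow known AI training crawlers (illustrative subset)
User-agent: GPTBot
Disallow: /

User-agent: ChatGPT-User
Disallow: /

User-agent: CCBot
Disallow: /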
