
Robots.txt Generator

Generate robots.txt crawl rules for search engines

🔒 100% client-side — your data never leaves this page

About this tool

Quickly generate a robots.txt file through visual inputs, with support for user-agent rules, allow/disallow paths, and sitemap declarations. Useful for pre-launch SEO configuration, blocking crawlers from sensitive directories, and keeping core pages indexable. The result can be copied directly into your site's root directory for deployment.
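
A generated file typically combines the three building blocks described above. A minimal sketch (the paths and sitemap URL below are placeholders, not output from this tool):

```txt
# Apply these rules to all crawlers
User-agent: *
# Block sensitive directories (placeholder paths)
Disallow: /admin/
Disallow: /tmp/
# Everything else remains crawlable
Allow: /

# Help crawlers discover the sitemap (placeholder URL)
Sitemap: https://example.com/sitemap.xml
```

Rules are grouped per `User-agent`; a crawler uses the most specific group that matches it and ignores the rest.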

Frequently Asked Questions

What does robots.txt control?

It tells search engine crawlers which paths on your site they may or may not crawl.

Should I include Sitemap in robots.txt?

Yes. Including a Sitemap directive helps crawlers discover your sitemap quickly.

Does robots.txt prevent indexing completely?

Not always. robots.txt controls crawling, not indexing: a disallowed URL can still appear in search results if other pages link to it. Use a noindex directive where exclusion from results is required.
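
To reliably keep a page out of search results, use noindex instead. A sketch of the two standard forms (the HTTP header variant assumes you control the server response):

```txt
<!-- Option 1: a meta tag in the page's <head> -->
<meta name="robots" content="noindex">

# Option 2: an HTTP response header
X-Robots-Tag: noindex
```

Note that a crawler must be able to fetch the page to see the noindex directive, so do not also Disallow that URL in robots.txt.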