robots.txt Generator

Create a valid robots.txt file for your website: add bots, rules, sitemaps, and crawl delays.


About robots.txt

The robots.txt file is placed at the root of your website (e.g. https://example.com/robots.txt) and tells web crawlers which pages or sections of your site they may or may not crawl. It uses the Robots Exclusion Protocol, a standard supported by all major search engines, including Google, Bing, and Yandex.

A rule block starts with a User-agent: line (a bot name, or * for all bots), followed by one or more Allow: or Disallow: directives. Disallow: / blocks the entire site, while an empty Disallow: allows everything. The Sitemap: directive tells crawlers where to find your XML sitemap. Note that robots.txt is advisory only: well-behaved crawlers respect it, but malicious bots may ignore it.
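For example, the following file (for a hypothetical site at example.com) blocks all bots from an /admin/ section, allows one subpath back in, slows down a specific bot, and points crawlers at the sitemap:

```
# Rules for all bots
User-agent: *
Disallow: /admin/
Allow: /admin/public/

# A per-bot block with a crawl delay (seconds between requests)
User-agent: Bingbot
Crawl-delay: 10

# Sitemap location (absolute URL)
Sitemap: https://example.com/sitemap.xml
```

Keep in mind that Crawl-delay is a non-standard extension: Bing and Yandex honor it, but Google ignores it (crawl rate for Googlebot is managed through Search Console instead).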
