The robots.txt file is a plain text file located at the root of your website (for example, https://example.com/robots.txt). It is the first file a well-behaved search engine bot requests, and it acts as the "Do Not Enter" sign for your website.

The robots.txt Generator helps you create valid syntax to block specific bots or folders.

Why Use It?

1. Security / Privacy

You don't want your admin panel (/wp-admin/), user profile scripts, or cart pages showing up in Google Search results. Note that Disallow only stops crawling: a blocked URL can still appear in results if other sites link to it, so robots.txt is the first line of defense, not the last.
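As an illustration, the conventional WordPress pattern blocks the admin area while leaving the AJAX endpoint reachable (the paths here match WordPress defaults; adjust them for your own site):

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```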

2. Saving Crawl Budget

Google allocates only a limited amount of crawling time to your site. Don't waste it on low-value pages like "Terms and Conditions" or internal search result pages.

3. Blocking Bad Bots

Some bots scrape your content for their own use (e.g., GPTBot, OpenAI's training-data crawler). You can explicitly Disallow them.
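To opt a specific crawler out, give it its own group. OpenAI documents GPTBot as its user-agent token, so this blocks it from the entire site:

```
User-agent: GPTBot
Disallow: /
```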

Key Commands

  • User-agent: *: Applies the rules that follow to ALL bots.
  • Disallow: /private/: Tells bots not to crawl anything inside this folder.
  • Allow: /private/image.jpg: Carves out an exception to the Disallow rule above.
  • Sitemap: https://...: Tells the bot where to find your sitemap.
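You can sanity-check rules like these locally with Python's standard urllib.robotparser before deploying (a sketch; the example.com URLs are placeholders). One caveat: Python's parser applies rules in file order rather than by longest match, so the Allow exception is listed before the Disallow here:

```python
from urllib.robotparser import RobotFileParser

# The rules under discussion, as they would appear in robots.txt.
rules = """\
User-agent: *
Allow: /private/image.jpg
Disallow: /private/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# The folder is blocked, but the single image is carved out.
print(rp.can_fetch("*", "https://example.com/private/secret.html"))  # False
print(rp.can_fetch("*", "https://example.com/private/image.jpg"))    # True
print(rp.can_fetch("*", "https://example.com/other/page.html"))      # True
```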

A Warning

robots.txt is a public request, not a firewall. A polite bot (like Googlebot) will respect it; a malicious bot (a scraper or hacker) will ignore it, and may even read it as a map of what you consider sensitive. Do not rely on it for actual security.