
robots.txt Generator

Generate a properly configured robots.txt for your adult website


# RTA content rating: RTA-5042-1996-1400-1577-RTA
#
# This file controls how search engine crawlers access this site.
# Generated by Porn-Directory.net robots.txt Generator

# Search engines
User-agent: Googlebot
Allow: /
Disallow: /admin
Disallow: /login
Disallow: /api

User-agent: Bingbot
Allow: /
Disallow: /admin
Disallow: /login
Disallow: /api

# AI crawlers - blocked
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: Bytespider
Disallow: /

User-agent: anthropic-ai
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: FacebookBot
Disallow: /

User-agent: Amazonbot
Disallow: /

# Image scrapers - blocked
User-agent: img2dataset
Disallow: /

User-agent: Scrapy
Disallow: /

User-agent: MJ12bot
Disallow: /

User-agent: AhrefsBot
Disallow: /

User-agent: SemrushBot
Disallow: /

User-agent: DotBot
Disallow: /

# All other bots
User-agent: *
Allow: /
Disallow: /admin
Disallow: /login
Disallow: /api

Save this as robots.txt in your website root directory.
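The generated file above does not include a Sitemap or Crawl-delay line. If you want those, you can append something like the following (the sitemap URL and delay value are placeholders -- substitute your own):

# Append inside the final "User-agent: *" block
# (Bing and Yandex honor this; Google ignores it):
Crawl-delay: 10

# Sitemap is a standalone directive and can go anywhere in the file:
Sitemap: https://example.com/sitemap.xml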

  • Protect admin and login pages -- Keep crawlers away from backend paths like /admin, /login, or /api that could expose your site structure or create duplicate-content issues. Note that robots.txt only blocks crawling, not indexing: a disallowed URL can still appear in search results if other sites link to it, so truly sensitive pages should also sit behind authentication or carry a noindex directive.
  • Control your crawl budget -- Adult sites often have thousands of pages. Adding a Crawl-delay directive (honored by Bing and Yandex, though ignored by Google) keeps aggressive bots from overwhelming your server, so your most important pages get crawled first.
  • Block AI scraping -- AI companies are scraping adult content to train their models. Blocking crawlers like GPTBot, CCBot, and ClaudeBot helps protect your original content from unauthorized use in AI training datasets.
  • Improve SEO performance -- By guiding crawlers to your valuable content and away from duplicate or low-value pages, you concentrate ranking signals on the pages that actually drive traffic and conversions.
  • Declare your sitemap location -- Including your sitemap URL in robots.txt ensures every crawler that reads the file also discovers your full site structure, improving indexation speed for new pages.
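After saving the file, you can sanity-check the rules with Python's standard-library robots.txt parser. This is a minimal sketch using a trimmed copy of the generated file and a placeholder host (example.com); one caveat is that Python's parser evaluates Allow/Disallow lines in file order rather than by longest path match the way Google does, so treat it as a quick check, not an exact model of Googlebot.

```python
from urllib.robotparser import RobotFileParser

# Trimmed version of the generated robots.txt (same structure, placeholder host).
ROBOTS_TXT = """\
User-agent: Googlebot
Allow: /
Disallow: /admin

User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
Disallow: /admin
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

def blocked(agent: str, url: str = "https://example.com/videos/1") -> bool:
    """True if `agent` is disallowed from fetching `url` under these rules."""
    return not rp.can_fetch(agent, url)
```

For example, `blocked("GPTBot")` returns True (the AI crawler is shut out site-wide), while `blocked("Googlebot")` returns False (search crawlers keep full access to public pages).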