Reply To: robots.txt: disallow sub directory which is a sub domain directory

  • dwchico

    Guest
    August 16, 2020 at 2:53 am

    Robots.txt is meant to be placed in the root folder of a domain to control crawler access to its folders and sub folders. Sub domains, however, are treated as separate sites and require their own robots.txt file.

    So if you want to disallow crawling of that sub domain, create a robots.txt file in the root directory of the sub domain itself and disallow the root path:

    User-agent: *
    Disallow: /
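    If you want to sanity-check that rule before deploying it, here is a minimal sketch using Python's standard `urllib.robotparser` (the `sub.example.com` URLs are just placeholders for your sub domain):

    ```python
    # Verify that "User-agent: * / Disallow: /" blocks all compliant crawlers.
    from urllib.robotparser import RobotFileParser

    rules = [
        "User-agent: *",
        "Disallow: /",
    ]

    parser = RobotFileParser()
    parser.parse(rules)  # parse the robots.txt lines directly, no fetch needed

    # No compliant crawler may fetch any path on the sub domain.
    print(parser.can_fetch("*", "https://sub.example.com/"))            # False
    print(parser.can_fetch("Googlebot", "https://sub.example.com/page"))  # False
    ```

    Note this only tells you what well-behaved bots should do; robots.txt is advisory, not an access control.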