White Hat SEO › Reply To: robots.txt: disallow sub directory which is a sub domain directory
dwchico (Guest), August 16, 2020 at 2:53 am
A robots.txt file belongs in the root folder of a domain, where it controls crawler access to that domain's folders and subfolders. Subdomains are treated as separate sites, so each subdomain needs its own robots.txt file.
So if you want to disallow crawling of that subdomain, create a robots.txt file in the subdomain's root directory and disallow the root:
User-agent: *
Disallow: /
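As a quick sanity check, Python's standard-library `urllib.robotparser` can confirm that those two lines block every path for every crawler (a minimal sketch; `sub.example.com` is a placeholder subdomain):

```python
from urllib.robotparser import RobotFileParser

# The two directives from the subdomain's robots.txt
rules = [
    "User-agent: *",
    "Disallow: /",
]

rp = RobotFileParser()
rp.parse(rules)

# With "Disallow: /", no user agent may fetch any URL on the subdomain
print(rp.can_fetch("*", "https://sub.example.com/"))           # False
print(rp.can_fetch("*", "https://sub.example.com/page.html"))  # False
```

Note that this only blocks crawling of the subdomain; the main domain's robots.txt is unaffected and keeps its own rules.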