Forums › White Hat SEO

  • How do I best prevent duplicate content/SEO dilution for replica websites with different top-level domains?

    Posted by seohelper on January 28, 2021 at 7:37 pm

    Hey guys, I have read a bunch on this and am still having a tough time figuring out the best way to solve my problem. If anyone could help I would appreciate it a ton!!

My problem is that I have a .com website, but for various reasons we have to make a .us website as well. However, it is just a formality; the .us domain doesn’t have to rank on Google. So I *think* I have two options:

    1. If I want to keep ALL of the SEO credit (or as much as possible) in the .com website and I don’t care about the .us domain being searchable on Google, is it better to use a) robots.txt to prevent crawling or b) a no-index meta tag to prevent indexing?

    2. If I still want my .us domain to be searchable on google, but give as much SEO credit back to the .com domain as possible, is it better to use a) canonical tags or b) 301 redirects?

    Ideally I would want the .us domain to be searchable on google, while passing ALL of the SEO credit to the .com domain, but if that’s not possible and it hurts the SEO rankings of the .com to a noticeable extent then I don’t mind it being unsearchable.

    I hope I am not too far off base in my thinking, but I have a very small team and am solely responsible for this – so any help would be awesome. Thanks!

  • 3 Replies
  • TheMacMan

    Guest
    January 29, 2021 at 12:32 am

Have canonicals pointing to the “original”, i.e. the place you want to get the credit. So every duplicate page on the .us site would include a canonical link to the matching .com page. Google will then give credit to the .com page, and only the .com pages should appear in search.
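For example (example.com and example.us are placeholder domains), the .us copy of a page would declare its .com counterpart in its `<head>`:

```html
<!-- In the <head> of https://www.example.us/products/widget -->
<link rel="canonical" href="https://www.example.com/products/widget" />
```

Note the canonical should point at the genuinely matching page, not at the .com homepage; Google may ignore canonicals that don’t point at an actual duplicate.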

  • pontarae

    Guest
    January 29, 2021 at 12:50 am

    TL;DR – Publishing a website full of exactly duplicated content would be ***a disaster waiting to happen*** and ***it should not be done.***
    —–

    Google will provide ranking opportunities for **unique content only.** In most market segments, **only the first published instance** of content will rank well.

    In Re Question 1:
    • If you want your .COM domain to receive ALL SEO credit and rank well then do not create or publish a .US domain full of copied content.

    • If the .US domain is not searchable/indexable by Google there will be no “SEO credit” produced and no rankings whatever for the .US website.

    • If the .US content IS or BECOMES searchable/indexable, then the ranking potential of the .COM website will be impaired, possibly to the point of the .COM pages being de-ranked entirely.
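As a sketch of the two mechanisms from Question 1 (domains and paths are placeholders): a robots.txt block prevents *crawling* but not *indexing*, so a blocked URL can still appear in results if other sites link to it, whereas a noindex meta tag prevents indexing but requires the page to stay crawlable so Google can actually see the directive:

```
# robots.txt at the root of the .us site — blocks crawling, not indexing
User-agent: *
Disallow: /
```

```html
<!-- On each .us page — allows crawling, prevents indexing -->
<meta name="robots" content="noindex" />
```

Important: don’t combine the two. If robots.txt blocks the page, Google never fetches it and never sees the noindex tag.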

    In Re Question 2:
    • Using a *canonical tag* declares the source URL for the content, and therefore where your “SEO Credit” would be applied. There can only be ONE source of the content.

    • Using a *301 redirect* sends all traffic, *and all “SEO Credit”,* to the indicated URL.

    —–

    I have a couple of questions whose answers might allow me to suggest a way forward:
    • What is your target market, i.e. where does the .COM site need to rank?

    • If your .US website doesn’t need to rank, what is the corporate justification for building it?

  • ced_narrator

    Guest
    January 29, 2021 at 2:41 pm

    Having duplicates is not a good idea, even if you’re giving Google canonical URLs pointing to the .com version.

    The way this is normally done is with a 301 redirect. Any user hitting anything on the .us domain should be automatically redirected to the same page on the .com domain. This is a super easy rule to set up with something like Cloudflare. If you’re unsure how, any web developer managing a website can do it super easily.
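For instance, a blanket rule like the following (a sketch only; example.us and example.com are placeholder domains, shown as an nginx server block rather than a Cloudflare rule, with TLS setup omitted) redirects every .us path to its .com counterpart:

```nginx
# Catch-all 301 from the .us domain to the .com domain
server {
    server_name example.us www.example.us;
    # $request_uri preserves the path and query string, so
    # /products/widget?x=1 lands on the same URL under .com
    return 301 https://www.example.com$request_uri;
}
```

In Cloudflare the equivalent would be a Redirect Rule with status 301 targeting the .com hostname while preserving the path.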

    If you ever visit a site that uses https (i.e. most of them), you’ll notice that entering the http version of the URL automatically redirects you to https. Same exact thing.
