
  • What to do with a site that blocks crawlers?

    Posted by hoac1231 on November 17, 2022 at 5:26 am

    I'm trying to audit one of our client's sites and it's giving me a 403 in Screaming Frog. Also, when I check the indexed pages report in SEMrush and filter for broken pages, I get a list of all the site's pages with the same 403 error.

  • bsasson

    Guest
    November 17, 2022 at 6:59 am

    Ask the site to whitelist your IP.

  • scarletdawnredd

    Guest
    November 17, 2022 at 7:18 am

    Either ask them to whitelist your IP or change the user agent in SF.

  • SEO-Injection

    Guest
    November 17, 2022 at 10:35 am

    Change user agent on SF to GoogleBot
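
    A 403 across every page usually means the server (or a WAF/CDN in front of it) is rejecting the crawler's user-agent string rather than the pages actually being broken. As a purely hypothetical illustration (this is a sketch, not the client's actual config), a server-side rule like this nginx snippet would produce exactly that behavior, which is why switching SF's user agent to a browser or Googlebot string often gets around it:

    ```nginx
    # Hypothetical nginx rule: return 403 to known SEO crawler user agents.
    # "Screaming Frog" matches the tool's default UA string.
    if ($http_user_agent ~* "(screaming frog|ahrefsbot|semrushbot)") {
        return 403;
    }
    ```

    Note that if the block is IP-based rather than UA-based, changing the user agent won't help and you'll need the whitelist approach instead.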
