Has anyone else recently encountered the “couldn’t fetch” issue?


    Posted by jackslovakia on August 14, 2024 at 6:30 am

    A few weeks ago, I dealt with our homepage URL disappearing from branded search results after a spammy-backlink attack. I blocked around 10,000 domains.

    Now, I’m facing a new problem. Our website has about 170,000 URLs, and I asked the FE team to split them into 24 sitemaps by language. When I submit them in Google Search Console, the submission shows as successful, but the status column says “Couldn’t fetch.”
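    For reference, the split looks roughly like the sketch below; the language codes, URL patterns, and file names are simplified placeholders, not our actual setup.

    ```python
    # Minimal sketch of a per-language sitemap split plus a sitemap index.
    # All names here (languages, URLs, output paths) are hypothetical examples.
    from xml.sax.saxutils import escape

    LANGS = ["en", "de", "fr"]  # the real site uses 24 language codes

    def write_sitemap(path, urls):
        """Write one sitemap file in the standard sitemaps.org format."""
        with open(path, "w", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for url in urls:
                f.write(f"  <url><loc>{escape(url)}</loc></url>\n")
            f.write("</urlset>\n")

    def write_index(path, sitemap_urls):
        """Write a sitemap index that points at the per-language files."""
        with open(path, "w", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for url in sitemap_urls:
                f.write(f"  <sitemap><loc>{escape(url)}</loc></sitemap>\n")
            f.write("</sitemapindex>\n")

    # Split a flat URL list by language path prefix, e.g. https://example.com/de/...
    all_urls = [
        "https://example.com/en/page-1",
        "https://example.com/de/seite-1",
        "https://example.com/fr/page-1",
    ]
    for lang in LANGS:
        lang_urls = [u for u in all_urls if f"/{lang}/" in u]
        write_sitemap(f"sitemap-{lang}.xml", lang_urls)
    write_index("sitemap-index.xml",
                [f"https://example.com/sitemap-{lang}.xml" for lang in LANGS])
    ```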

    Bing, however, has already processed them successfully. Does anyone know why this might be happening and how we can fix it?

  • Omega-marketing

    Guest
    August 14, 2024 at 9:38 am

    Man, no number of bad/toxic/unethical backlinks can damage your ranking; you’ve wasted your time and probably removed links that were influencing your SERP positions too. There is no such thing as a “spammy backlink attack.” Just forget that stupid popular BS. If it were that easy, anyone could spend $500 on 100,000 scam/porn backlinks and take any website out of search. It has never worked that way; it’s just unprofessional opinions floating around.

    170k URLs across 24 files? That seems normal; it works out to roughly 7,000 URLs per sitemap, well under the 50,000-URL limit. Have you tried opening the sitemaps at the exact URLs you submitted? If they load fine, check whether any rate-limiting or WAF rules are blocking access. Create new sitemaps with different names and resubmit them to GSC (do not remove the older ones). Then check your site’s access logs for Google’s attempts to retrieve the files and look for errors; a rough sketch of both checks is below.
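    Minimal Python sketch of those two checks, assuming the `requests` library is installed; the sitemap URLs and the log path are placeholders, adjust them to your setup:

    ```python
    # Check that each submitted sitemap is reachable and served as XML.
    import requests

    # Placeholder sitemap URLs; substitute your 24 real submitted locations.
    SITEMAPS = [f"https://example.com/sitemap-{i}.xml" for i in range(1, 25)]

    GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                    "+http://www.google.com/bot.html)")

    for url in SITEMAPS:
        # Fetch with a Googlebot-like User-Agent so UA-based WAF rules show up.
        resp = requests.get(url, headers={"User-Agent": GOOGLEBOT_UA}, timeout=10)
        ok = resp.status_code == 200 and "xml" in resp.headers.get("Content-Type", "")
        print(f"{url}: HTTP {resp.status_code}, "
              f"{resp.headers.get('Content-Type', '?')} -> {'OK' if ok else 'CHECK'}")

    # Then scan the access log for Google's fetch attempts and their status codes.
    # Placeholder log path; point this at your actual web server log.
    with open("/var/log/nginx/access.log", encoding="utf-8", errors="replace") as log:
        for line in log:
            if "sitemap" in line and "Googlebot" in line:
                print(line.rstrip())
    ```

    If the Googlebot UA gets a 403 while a normal browser UA gets a 200 on the same URL, that points straight at a WAF or bot-filtering rule.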

    If the files fetch fine with no server-side errors, then just wait. That’s normal; Google sometimes needs more time to pick up your files.
