
  • Google stopped indexing our site and removed all our products

    Posted by Tall-Carpenter499 on September 14, 2025 at 7:11 am

    Hello,

    I have been ranking number one in my industry for over 3 years, with about 1,500 products indexed. Normally, my products appear in Google within hours of publishing.

    However, just a few days ago, all our products disappeared from Google search results. Currently, only the homepage remains indexed. Google Search Console shows “Sitemap couldn’t fetch,” yet when I test with Googlebot, it returns 200 OK.

    • We use RankMath for SEO.
    • Sitemap is accessible in the browser.
    • Server responds correctly to Googlebot.

    Please, how can I resolve this? It’s urgently affecting our visibility and sales.

    Thank you.

  • [deleted]

    Guest
    September 14, 2025 at 1:57 pm

    [deleted]

  • mayazir

    Guest
    September 14, 2025 at 3:53 pm

    You can’t do anything. Google is a fcking monopolist who turned indexing into a VIP club. They removed my 3 old news sites. For nothing. Just removed them. Because they wanted to. I dedicated 8 years to my sites. Fck Google!

  • UP-SEO

    Guest
    September 14, 2025 at 4:36 pm

    This is a stressful situation, but the fact that the tests are passing is a good sign. It’s usually fixable.

    The contradictory error (“couldn’t fetch” vs. “200 OK”) points to one of two very common issues.

    1. Something Is Blocking Google’s Main Crawler. This is the most likely cause: a security plugin, a server firewall, or a CDN is blocking Google’s automated crawler while letting your manual test through, because the two come from different IP addresses.
    • What to do: Check your security plugin and firewall logs to see whether Googlebot is being blocked (see the log-check sketch after this list). Try temporarily disabling the security plugin, then ask Google to fetch the sitemap again in Search Console. If that works, you’ve found the culprit and need to whitelist Googlebot.

    2. An Accidental “noindex” Tag. A recent update to a plugin or theme might have accidentally added a noindex tag to all your products. This tells Google to remove them from search results entirely.
    • What to do: Open one of your product pages, right-click, and choose “View Page Source.” Use the search function (Ctrl+F or Cmd+F) to look for the word noindex. If you see a line like <meta name="robots" content="noindex">, that’s your problem. You’ll need to find the setting in your SEO plugin (RankMath) or theme options that is causing it (a quick checker sketch also follows this list).
    Start with these two checks. They account for the vast majority of cases of this exact problem.
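
    For the first check, here is a minimal sketch (Python, standard library only; the IP address is just an example that you would replace with IPs pulled from your firewall or security-plugin logs) of the reverse-then-forward DNS verification Google documents for confirming that a visitor claiming to be Googlebot is genuine:

    ```python
    import socket

    def is_real_googlebot(ip: str) -> bool:
        """Verify a crawler IP: reverse DNS must land in googlebot.com
        or google.com, and a forward lookup must return the same IP."""
        try:
            host, _, _ = socket.gethostbyaddr(ip)  # reverse DNS lookup
        except socket.herror:
            return False
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        try:
            _, _, addrs = socket.gethostbyname_ex(host)  # forward confirmation
        except socket.gaierror:
            return False
        return ip in addrs

    # Example IP from Google's published crawler range; swap in the
    # IPs that your logs show being blocked.
    print(is_real_googlebot("66.249.66.1"))
    ```

    If blocked requests in your logs pass this test, your firewall or security plugin is rejecting real Googlebot and you need to whitelist it.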
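
    For the second check, a similar sketch (the product URL is a placeholder) that fetches a page and flags noindex in both the HTML and the X-Robots-Tag response header (the header matters because a plugin or the server can send noindex there, where View Page Source won’t show it):

    ```python
    import urllib.request

    def check_noindex(url: str) -> None:
        """Fetch a page and report noindex in the headers or the HTML."""
        req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
        with urllib.request.urlopen(req) as resp:
            x_robots = resp.headers.get("X-Robots-Tag", "")
            html = resp.read().decode("utf-8", errors="replace").lower()
        if "noindex" in x_robots.lower():
            print(f"X-Robots-Tag header contains noindex: {x_robots!r}")
        # Crude string check; good enough to flag a page for manual review.
        if 'name="robots"' in html and "noindex" in html:
            print("HTML appears to contain a robots noindex meta tag.")
        else:
            print("No obvious noindex meta tag in the HTML.")

    check_noindex("https://example.com/product/sample-item/")  # use a real product URL
    ```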

  • AbleInvestment2866

    Guest
    September 14, 2025 at 5:21 pm

    You need someone to check your website, but you obviously have a coding or server issue. The fact that you can access files directly but Googlebot can’t is key. This “Sitemap couldn’t fetch” error means that, for some reason, Googlebot is being stopped before it even renders anything. The most likely reasons (among dozens) are listed below; a quick sanity-check sketch follows the list:

    * Corrupted sitemap.xml
    * Problems in your .htaccess (Apache) or server config (NGINX)
    * Problems in your server headers
    * Bad implementation of JS injection (this would be the most likely one, since it explains 70–80% of such cases, but it shouldn’t affect sitemap.xml)
    * Some kind of badly configured blocking, especially if you use a CDN
    * If your products are in GMC (Google Merchant Center), then it’s almost surely wrong schema, but AFAIK this happens only with GMC
    * (Add dozens more)
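
    If you want to rule out the first few of these yourself, here’s a rough sketch (Python, standard library only; swap in your real sitemap URL) that fetches the sitemap while presenting Googlebot’s user-agent string, prints the status and the headers that matter, and confirms the XML actually parses. One caveat: this runs from your own IP, so it cannot reproduce IP-based blocking by a firewall or CDN.

    ```python
    import urllib.request
    import xml.etree.ElementTree as ET

    SITEMAP_URL = "https://example.com/sitemap.xml"  # replace with your own
    GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                    "+http://www.google.com/bot.html)")

    req = urllib.request.Request(SITEMAP_URL, headers={"User-Agent": GOOGLEBOT_UA})
    with urllib.request.urlopen(req) as resp:
        print("Status:", resp.status)  # should be 200
        print("Content-Type:", resp.headers.get("Content-Type"))
        print("X-Robots-Tag:", resp.headers.get("X-Robots-Tag"))  # should not say noindex
        body = resp.read()

    # A corrupted sitemap fails right here with a ParseError.
    root = ET.fromstring(body)
    print("Root element:", root.tag)
    print("Child entries:", len(root))
    ```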

    Anyway, if you made any change at the code level, added plugins, or whatever, undo it and resubmit your sitemap; it should work. If not, you’re dealing with server responses, or a mix of both, who knows. You provided zero data, so I guess losing sales isn’t that big a deal, really.

    Bottom line: pay someone who knows, or watch your sales go bye-bye.

    PS: in response to this:

    * Sitemap is accessible in the browser.
    * Server responds correctly to Googlebot.

    It doesn’t matter. People always think this means something, but it doesn’t, because your browser and your interactions are things Googlebot does not have. The minute you move your mouse a fraction of an inch, your browser becomes the complete opposite of the Googlebot client (which is not a browser but an HTTP client built in C++ and Go), so any manual test is essentially random and could be equally wrong or right.
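
    To make that concrete, here’s a rough sketch (placeholder URL) of one way the gap shows up: the same URL fetched with a browser-like user-agent versus Googlebot’s can get different answers from a CDN or security layer, and even identical answers prove little, since real Googlebot requests come from Google’s IP ranges, not yours.

    ```python
    import urllib.request
    import urllib.error

    URL = "https://example.com/sitemap.xml"  # placeholder
    AGENTS = {
        "browser":   "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
        "googlebot": ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                      "+http://www.google.com/bot.html)"),
    }

    for name, ua in AGENTS.items():
        req = urllib.request.Request(URL, headers={"User-Agent": ua})
        try:
            with urllib.request.urlopen(req) as resp:
                print(f"{name}: HTTP {resp.status}, server={resp.headers.get('Server')}")
        except urllib.error.HTTPError as e:
            # A 403/503 for the googlebot UA only points at UA-based blocking.
            print(f"{name}: HTTP {e.code}")
    ```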

  • WebLinkr

    Guest
    September 14, 2025 at 5:47 pm

    My previous reply assumed there was no error affecting your pages.

    Is this in Google Merchant Center?

    Your sitemap won’t force Google to index pages or improve indexation unless you have lots of orphan pages, and even then it’s really only useful for news and large authority sites (say DA > 30).

    # Questions needed to ascertain the issue

    Do you get errors reading other pages, e.g. when you inspect them?

    Are they indexed?

    Did you have clicks to those pages?
