
  • Indexing Issues on Website Since a Rebuild

    Posted by knifezoid on March 25, 2026 at 5:35 pm

    Back in Sept 2025, we had our website rebuilt by converting the entire site to be built exclusively with Elementor. Previously it had used three different website builders, which made it hard to manage.

    After the rebuild, I noticed our traffic had completely stopped. For some reason, the people who rebuilt the site had left the Search Engine Visibility setting (“Discourage search engines from indexing this site”) checked, so our site was NOT discoverable. I unchecked the box and noticed traffic starting to pick up again.

    However, I feel something still isn't right. I've been publishing a lot of new articles and pages, and while our traffic is getting back to where it used to be, there's still an issue with our indexing. I think it is preventing our website from being seen as much as it should.

    We use Rank Math and have ‘Instant Indexing’ turned on, but pages are still reporting as not indexed.

    Is this affecting our web traffic? And how do we fix it? Any advice is much appreciated!

    I have a few pictures of the indexing status of many of our posts/pages.

  • ThirdEyesOfTheWorld

    Guest
    March 25, 2026 at 5:39 pm

    >For some reason, the people who rebuilt the site had left the Search Engine Visibility setting (“Discourage search engines from indexing this site”) checked, so our site was NOT discoverable. I unchecked the box and noticed traffic starting to pick up again.

    Did they rebuild it on a staging server first? It’s normal to discourage indexing on a staging site, but they clearly forgot to uncheck that setting when pushing to production. Rookie mistake.
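
    If you want to be sure no noindex residue survived the push to production, a quick Python sketch like this will surface it (the example.com URLs are placeholders – swap in a few of your own pages):

    ```python
    # Check that production pages no longer carry a noindex directive.
    # Placeholder URLs - replace with your own. Needs: pip install requests
    import re
    import requests

    URLS = [
        "https://example.com/",
        "https://example.com/blog/some-post/",
    ]

    for url in URLS:
        resp = requests.get(url, timeout=10)
        # noindex can arrive as an HTTP header...
        header = resp.headers.get("X-Robots-Tag", "none")
        # ...or as a <meta name="robots"> tag in the HTML
        # (naive regex, fine for a spot check)
        meta = re.search(r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']*)',
                         resp.text, re.I)
        print(url, "| X-Robots-Tag:", header,
              "| meta robots:", meta.group(1) if meta else "none")
    ```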

    >I’ve been publishing a lot of new articles and pages, and while our traffic is getting back to where it used to be, there’s still an issue with our indexing. I think it is preventing our website from being seen as much as it should. We use Rank Math and have ‘Instant Indexing’ turned on, but pages are still reporting as not indexed.

    Do you have Google Search Console? It will tell you the reasons for pages not being indexed. If there’s no technical reason for it, just submit the URLs for indexing through the GSC tool. I’d also check that your sitemap is up to date and functioning properly.
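
    For the sitemap, a rough check like this lists what you’re actually advertising to Google (it assumes a standard sitemap at /sitemap.xml – adjust for a sitemap index, and example.com is a placeholder):

    ```python
    # Print every URL and <lastmod> date from a sitemap,
    # to spot stale or missing entries. Needs: pip install requests
    import requests
    import xml.etree.ElementTree as ET

    SITEMAP = "https://example.com/sitemap.xml"  # placeholder - use your real sitemap URL
    NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

    root = ET.fromstring(requests.get(SITEMAP, timeout=10).content)
    for node in root.findall("sm:url", NS):
        loc = node.findtext("sm:loc", default="?", namespaces=NS)
        lastmod = node.findtext("sm:lastmod", default="(no lastmod)", namespaces=NS)
        print(lastmod, loc)
    ```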

  • bkthemes

    Guest
    March 25, 2026 at 5:50 pm

    Sounds like Google doesn’t find the content original enough. You need to follow E-E-A-T practices closely these days to get indexed. A rehash of what’s already on the internet is probably not going to get indexed. Readers need to leave your post with some real knowledge not found anywhere else.

    Google also recommends first person: we, I, me. ‘My test with this was…’ is just one example of exclusive information not found anywhere else. This is part of the Experience in E-E-A-T.

  • BoGrumpus

    Guest
    March 25, 2026 at 5:51 pm

    Unfortunately, once a site has been out of the index for more than a few weeks, it takes a while to come back. It takes a while for all the off-page math to be reconnected, recalculated, and set to ranking things properly again.

    As for Instant Indexing – turn that off. Instant Indexing is for time-sensitive things. You can use it for things like daily box score updates or a news article that is useful and of interest now, but may not be in the future.

    But if you’re just using it for regular content updates or things you want to be evergreen, one of two things happens. Either Google indexes it instantly but gives it no long-term value (which I suspect is partly where the “it has to be fresh all the time” myth comes from – everyone is telling Google that everything they publish is only important right now, not later).

    Or it realizes you’re just churning out content and want it to rank fast, so it eventually loses interest in you altogether.

    So yeah, turn that off. You’re likely not abusing it, but so many sites are, and at such a massive scale, that it doesn’t help you to make it one more thing the AI has to consider.

    Just sit back. If you post content, make it something you feel you need to tell your customers, something helpful for making them choose you or trust you more. Then just let the bots come back and figure that out without a lot of noise.

    Since “humanizing AI” is all the rage now, the TLDR here is:

    Be the person at the party the AI wants to hang out with and learn about. Don’t be that kid who runs around all day screaming, “Hey! Look at me look at me!”

    That kid is going to get nothing but a timeout and a nap.

    And never use Instant Indexing for the wrong thing. It’s a separate problem, but possibly slowing down your recovery – you just don’t want to be sending the signal that your content has an expiration date, because Google will take you at your word.

    G.

  • SEOPub

    Guest
    March 25, 2026 at 6:23 pm

    Definitely turn off Instant Indexing. I can’t confirm that Google crawls sites that abuse it less frequently, but I would suspect it does. Unless you have a news site, or something like a sports website sharing live scores and events, instant indexing is not meant for you.

    I wouldn’t even bother with the crawl request in GSC. I think that just trains the crawler not to visit your site as often because you will notify it when there is new content.

    As for the traffic and indexing issue… Elementor is kind of bloated. I don’t know what the site was using before, but that could be a slight issue, especially on a larger site.

    A more common issue I have seen when a site goes through a complete rebuild is the developer using the wrong WWW or non-WWW version of the site. In other words, it was previously http://www.somedomain.tld and now it is on somedomain.tld with the WWW dropped, or vice versa. Even with proper redirects in place, this will cause traffic to lag behind where it was for quite some time.
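
    An easy way to verify which version is canonical now: request both hostnames and compare the redirect chains. A rough sketch (somedomain.tld is the placeholder from above):

    ```python
    # Compare how the www and non-www hostnames resolve.
    # Both should land on the same final URL, ideally via a single 301.
    # Needs: pip install requests
    import requests

    for url in ("https://www.somedomain.tld/", "https://somedomain.tld/"):
        resp = requests.get(url, timeout=10, allow_redirects=True)
        hops = [r.status_code for r in resp.history]  # status code of each redirect hop
        print(url, "->", resp.url, "| hops:", hops or "none", "| final:", resp.status_code)
    ```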

    Another possibility is that things changed drastically with the rebuild. The structure of pages may have changed. Internal links dropped. Pages not migrated over. Pages purposely pruned but with no redirect put in place. And a host of other potential problems.
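
    If you still have a URL list from before the rebuild (old sitemap, analytics export), a sketch like this will flag anything that was dropped without a redirect (the URLs below are hypothetical):

    ```python
    # Recrawl pre-rebuild URLs and flag anything that now 404s - those need 301s.
    # Needs: pip install requests
    import requests

    OLD_URLS = [  # hypothetical examples - use your real pre-rebuild URL list
        "https://somedomain.tld/old-page/",
        "https://somedomain.tld/pruned-article/",
    ]

    for url in OLD_URLS:
        resp = requests.get(url, timeout=10, allow_redirects=True)
        if resp.status_code == 404:
            print("NEEDS REDIRECT:", url)
        elif resp.history:
            print("redirected:", url, "->", resp.url)
        else:
            print("ok:", url)
    ```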

  • First-Bumblebee-9600

    Guest
    March 25, 2026 at 7:31 pm

    yeah, that can absolutely suppress traffic if Google still isn’t processing the site cleanly. i’d check the page source for stray noindex tags, plus robots.txt, canonicals, and sitemap freshness, and inspect a few affected URLs in Search Console. instant indexing helps discovery a bit, but it won’t override bad technical signals.
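
    something like this covers the robots.txt and canonical part of that checklist in one pass (example.com is a placeholder):

    ```python
    # Spot-check robots.txt disallow rules and the canonical tag on an affected page.
    # Needs: pip install requests
    import re
    import requests

    DOMAIN = "https://example.com"          # placeholder
    PAGE = f"{DOMAIN}/some-affected-post/"  # placeholder

    robots = requests.get(f"{DOMAIN}/robots.txt", timeout=10).text
    print("robots.txt disallow lines:")
    for line in robots.splitlines():
        if line.lower().startswith("disallow"):
            print(" ", line)

    # naive regex (assumes rel comes before href), fine for a quick look
    html = requests.get(PAGE, timeout=10).text
    canonical = re.search(r'<link[^>]+rel=["\']canonical["\'][^>]*href=["\']([^"\']+)',
                          html, re.I)
    print("canonical:", canonical.group(1) if canonical else "none found")
    ```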

  • MinimumCode4914

    Guest
    March 25, 2026 at 7:37 pm

    inspect those urls in search console with the url inspection tool. it’ll show the exact reason google isn’t indexing them after the rebuild, like a lingering noindex or a crawl issue. that one’s probably still limiting your traffic.
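
    if there are too many urls to click through by hand, the same check can be scripted with the url inspection api (google-api-python-client plus a service account that has access to the property – the urls below are placeholders):

    ```python
    # Ask the Search Console URL Inspection API why a page isn't indexed.
    # Needs: pip install google-api-python-client google-auth
    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    creds = service_account.Credentials.from_service_account_file(
        "service-account.json",  # hypothetical credentials file
        scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
    )
    service = build("searchconsole", "v1", credentials=creds)

    result = service.urlInspection().index().inspect(body={
        "inspectionUrl": "https://example.com/some-post/",  # placeholder page
        "siteUrl": "https://example.com/",                  # or "sc-domain:example.com"
    }).execute()

    status = result["inspectionResult"]["indexStatusResult"]
    # coverageState is the human-readable reason,
    # e.g. "Crawled - currently not indexed"
    print(status.get("verdict"), "|", status.get("coverageState"))
    ```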

  • WebLinkr

    Guest
    March 25, 2026 at 8:21 pm

    Hey u/knifezoid

    Sorry to hear about this. This actually happened to a site I managed about a month ago.

    > I unchecked the box and noticed traffic starting to pick up again.

    Great. This advice is relatively high level, but here’s what I suspect is happening.

    So essentially, from a semantic PoV, your whole site was marked as NoIndex. The rate at which pages are restored to the index most likely caused another issue:

    1. Google doesn’t support “IndexNow” or “Instant Indexing” – you can turn it on or off, it won’t matter. You cannot optimize for crawling, or “de-optimize” for it – you just have no control. Even if you “build” a million inbound links, if they’re on pages in low-frequency crawl ranges, guess what – they’re in a pool with more links and fewer crawlers.

    2. I’ve found a few things with NoIndexing.

    Your pages that had clicks will be in the most frequently crawled pool – this is true of every site/page from CNN down (assuming CNN/NY Times/Microsoft are seed sites at the top of the trust/authority pyramid in Google’s current PageRank model). Effectively, those are the pages that “turned back on” and returned to normal status.

    The pages that didn’t have clicks, or frequent clicks, aren’t going to be automatically refreshed – so there’s no way to “alert” Google that the noindex switch has been turned off.

    I would recommend that each day you manually request crawling for the most important pages, based on their click/impression volume. This might also give you a chance to look over each one.
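
    If it helps, here’s a rough sketch for building that priority list from the Search Console API (placeholder property and dates, and it assumes a service account with access – rows come back sorted by clicks):

    ```python
    # Pull top pages by clicks from Search Console to prioritize manual crawl requests.
    # Needs: pip install google-api-python-client google-auth
    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    creds = service_account.Credentials.from_service_account_file(
        "service-account.json",  # hypothetical credentials file
        scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
    )
    service = build("searchconsole", "v1", credentials=creds)

    resp = service.searchanalytics().query(
        siteUrl="https://example.com/",  # placeholder property
        body={
            "startDate": "2025-06-01",   # e.g. a window from before the rebuild
            "endDate": "2025-09-01",
            "dimensions": ["page"],
            "rowLimit": 100,
        },
    ).execute()

    for row in resp.get("rows", []):
        print(int(row["clicks"]), row["keys"][0])
    ```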

    3. New cannibalization **sequencing** issues. Say you had a page at “/agentic_siem” that ranked fine for all things “Agentic SIEM”, and then you published a blog post about “AI SIEM with Agent Control” – those two might never have had a chance to block each other, because of historic CTR etc.

    However, if the original page had fewer organic clicks – and therefore wasn’t returned to the index BEFORE the blog post – the blog post might have started ranking for “agentic siem”, but in a worse position. Then, when the other page gets indexed again, it has a CTR history (the most common cause of big-C cannibalization), and two pages that never competed now do.

    4. Change in topical authority. This almost certainly happened: if you had 300 pages, and 100 of them were ranking with mild click volumes, you’ve potentially lost “passive” topical authority. Basically, those pages had weak independent authority, but together they linked to a page, passing enough authority for that page’s contextual surroundings to rank – and without them, it doesn’t. In turn, if that page was sending authority to a neighbor, a blog post, or a landing page, that page now receives less, or its topical-authority makeup has lost context, meaning its PageRank transfer is weakened. Authority is like electrical current – there’s a knock-on effect.
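
    To picture that knock-on effect, here’s a toy model (networkx, made-up page names – an illustration, not Google’s actual math): drop the weak supporting pages from a small internal-link graph and the hub’s PageRank falls, which in turn starves the landing page.

    ```python
    # Toy demo: removing weakly ranking pages from an internal-link graph
    # lowers the PageRank of the pages they supported.
    # Needs: pip install networkx
    import networkx as nx

    links = [  # hypothetical internal links: (from, to)
        ("blog-1", "hub"), ("blog-2", "hub"), ("blog-3", "hub"),
        ("home", "hub"), ("home", "landing"), ("hub", "landing"),
    ]

    full = nx.DiGraph(links)
    pruned = nx.DiGraph([l for l in links if l[0] not in {"blog-1", "blog-2", "blog-3"}])

    for name, graph in (("all pages indexed", full), ("blog posts deindexed", pruned)):
        pr = nx.pagerank(graph)
        print(f"{name}: hub={pr.get('hub', 0):.3f}, landing={pr.get('landing', 0):.3f}")
    ```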

    Hope that helps – happy to explain anything further if you need!
