Forums › White Hat SEO › Are 410s a waste of effort?

  • Are 410s a waste of effort?

    Posted by Mesmer7 on July 23, 2024 at 7:26 pm

    Crawlers keep trying to crawl URLs and images that I deleted 10 years ago. I've been giving them 410 errors for years. But they just keep trying. Should I delete the 410s and let them have 404 errors instead?

  • huestonco

    Guest
    July 23, 2024 at 7:56 pm

    Ah, the ol’ persistent crawler problem. Fun times. Here’s the scoop:

    410 vs 404:

    * 410 = “Gone forever, don’t come back”
    * 404 = “Not found, maybe check later?”

    In theory, 410 should work better. In practice? Crawlers can be dumb as rocks.

    Options:

    1. Keep 410s: You’re doing it right, crawlers are being jerks.
    2. Switch to 404s: Might help, might not. Worth a shot if you’re tired of 410s.
    3. Robots.txt: Block those specific URLs if possible.
    4. .htaccess: 301 old URLs to a relevant replacement page, if one exists (blanket redirects to the homepage tend to get treated as soft 404s).

    Personal take: I’d keep the 410s. It’s the “right” way, even if crawlers are ignoring it. But if it’s driving you nuts, try 404s for a month and see what happens.
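If you want to see the 410-vs-404 distinction in action (and sanity-check what your own server actually returns), here's a minimal sketch using only the Python standard library. The paths in `GONE_PATHS` are made-up examples, not from the thread:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
import threading
import urllib.error
import urllib.request

# Hypothetical set of permanently removed paths (illustration only).
GONE_PATHS = {"/old-page.html", "/images/deleted-logo.png"}

class GoneHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in GONE_PATHS:
            self.send_response(410)  # Gone: removed on purpose, don't retry
        else:
            self.send_response(404)  # Not Found: might exist later
        self.end_headers()

    def log_message(self, fmt, *args):
        pass  # keep the demo quiet

def status_of(url):
    """Fetch a URL and return its HTTP status code, even for 4xx errors."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code

if __name__ == "__main__":
    server = HTTPServer(("127.0.0.1", 0), GoneHandler)
    port = server.server_address[1]
    threading.Thread(target=server.serve_forever, daemon=True).start()
    print(status_of(f"http://127.0.0.1:{port}/old-page.html"))  # 410
    print(status_of(f"http://127.0.0.1:{port}/some-other-page"))  # 404
    server.shutdown()
```

The same `status_of` helper works against your live site, so you can confirm the 410s are really being served before blaming the crawlers.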

  • 805foo

    Guest
    July 23, 2024 at 9:52 pm

    Are the urls still indexed?
    Are they linked from somewhere on the site?
    I usually only serve 410s until the URLs are out of the index, then delete the 410 rule.

  • crepsucule

    Guest
    July 23, 2024 at 10:27 pm

    I’d have to assume there are links to them somewhere on your site or from backlinks; crawlers keep hitting those URLs for a reason.

  • heman1320

    Guest
    July 23, 2024 at 11:57 pm

    Where are you seeing those 4xx codes reported? GSC? Maybe it’s still fetching an old XML sitemap. That would keep Google going down paths that don’t exist.
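Checking for a stale sitemap is easy to script. This is a sketch that pulls every `<loc>` entry out of a sitemap so you can audit it for long-deleted URLs; the `example.com` URLs are placeholders:

```python
import xml.etree.ElementTree as ET

# The Sitemaps protocol puts every element in this namespace.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Return the list of <loc> URLs found in a sitemap XML document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(f"{SITEMAP_NS}loc")]

# Made-up sitemap for illustration; fetch your real one instead.
example = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/current-page</loc></url>
  <url><loc>https://example.com/deleted-10-years-ago</loc></url>
</urlset>"""

print(sitemap_urls(example))
```

Any URL that shows up here but was deleted years ago is a standing invitation for crawlers to keep coming back.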
