  • AI/LLM Search relies on classic ranking and retrieval | Google

    Posted by WebLinkr on February 18, 2026 at 4:57 pm

    Jeff Dean says Google’s AI Search still works like classic Search: narrow the web to relevant pages, rank them, then let a model generate the answer.

    In an interview on Latent Space: The AI Engineer Podcast, Google’s chief scientist explained how the company’s AI systems work and how much they rely on traditional search infrastructure.

    The architecture: filter first, reason last. Visibility still depends on clearing ranking thresholds. Content must enter the broad candidate pool, then survive deeper reranking before it can be used in an AI-generated response. Put simply, AI doesn’t replace ranking. It sits on top of it.

    Dean explained that an LLM-powered system doesn’t read the entire web at once. It starts with Google’s full index, then uses lightweight methods to identify a large candidate pool of tens of thousands of documents:

    • “You identify a subset of them that are relevant with very lightweight kinds of methods. You’re down to like 30,000 documents or something. And then you gradually refine that to apply more and more sophisticated algorithms and more and more sophisticated sort of signals of various kinds in order to get down to ultimately what you show, which is the final 10 results or 10 results plus other kinds of information.”

    Stronger ranking systems narrow that set further. Only after multiple filtering rounds does the most capable model analyze a much smaller group of documents and generate an answer. Dean said:

    • “And I think an LLM-based system is not going to be that dissimilar, right? You’re going to attend to trillions of tokens, but you’re going to want to identify what are the 30,000-ish documents that are with the maybe 30 million interesting tokens. And then how do you go from that into what are the 117 documents I really should be paying attention to in order to carry out the tasks that the user has asked me to do?”
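
    To make that staged narrowing concrete, here is a toy Python sketch of that kind of cascade: a cheap signal scores the full corpus to build a broad pool, and a costlier signal then trims the pool to a shortlist. The scoring functions, the quality field, and the cutoffs are made-up placeholders for illustration, not Google’s actual signals or thresholds.

```python
# Toy staged-narrowing cascade. All scoring functions and cutoffs are
# illustrative placeholders, not real ranking signals.

from collections import Counter


def cheap_score(query_terms, doc):
    """Stage 1: a very lightweight signal, raw query-term frequency."""
    doc_terms = Counter(doc["text"].lower().split())
    return sum(doc_terms[t] for t in query_terms)


def richer_score(query_terms, doc):
    """Stage 2 stand-in: the same overlap, weighted by a hypothetical quality score."""
    return cheap_score(query_terms, doc) * doc.get("quality", 1.0)


def cascade(query, corpus, pool_size=30_000, final_size=100):
    """Narrow the corpus in stages: cheap scoring everywhere, costly scoring on few."""
    terms = query.lower().split()

    # Stage 1: lightweight scoring over the whole corpus -> broad candidate pool.
    pool = sorted(corpus, key=lambda d: cheap_score(terms, d), reverse=True)[:pool_size]

    # Stage 2: more expensive scoring, but only over the documents that survived.
    return sorted(pool, key=lambda d: richer_score(terms, d), reverse=True)[:final_size]


corpus = [
    {"text": "Classic ranking still decides which pages enter the pool", "quality": 1.4},
    {"text": "LLMs generate answers from a reranked shortlist", "quality": 1.1},
    {"text": "A page about houseplants", "quality": 0.9},
]
top = cascade("how ai search ranking pages work", corpus, pool_size=2, final_size=1)
print(top[0]["text"])  # -> Classic ranking still decides which pages enter the pool
```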

    Dean called this the “illusion” of attending to trillions of tokens. In practice, it’s a staged pipeline: retrieve, rerank, synthesize. As he put it:

    • “Google search gives you … not the illusion, but you are searching the internet, but you’re finding a very small subset of things that are relevant.”
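
    The same idea, written out as a minimal retrieve, rerank, synthesize pipeline: retrieve() and rerank() are crude stand-ins for the cascade sketched above, and synthesize() just assembles a prompt string in place of a real model call. The only point is that the generation step sees the small reranked set, never the whole corpus.

```python
# Minimal retrieve -> rerank -> synthesize sketch. The functions below are
# placeholders to show the data flow, not a real search or generation stack.


def retrieve(query, corpus, pool_size=30_000):
    """Cheap first pass over everything: keep documents sharing any query term."""
    terms = set(query.lower().split())
    return [doc for doc in corpus if terms & set(doc.lower().split())][:pool_size]


def rerank(query, candidates, top_k=100):
    """Costlier second pass, run only on the surviving candidates."""
    terms = query.lower().split()
    return sorted(
        candidates,
        key=lambda doc: sum(doc.lower().split().count(t) for t in terms),
        reverse=True,
    )[:top_k]


def synthesize(query, documents):
    """Placeholder for the LLM step: build a prompt from the reranked handful."""
    context = "\n\n".join(documents)
    return f"Answer using only these sources:\n\n{context}\n\nQuestion: {query}"


query = "how does ai search rank pages"
corpus = [
    "AI search reranks a small candidate set before generation",
    "Classic ranking signals still decide which pages get retrieved",
    "An unrelated page about cooking pasta",
]
print(synthesize(query, rerank(query, retrieve(query, corpus))))
```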

    There's been so much disinformation from GEO tool providers, and they seem to be getting more and more aggressive, but the simple reality remains: LLMs are not search engines.
