SEO Strategies That Use Natural Language Processing



To understand just how vital NLP will be to the future of SEO, it's worth looking at how Google in particular implements NLP and just how central it is to the company's mission.

One of the issues researchers have faced in convincing businesses of NLP's value is the perception that the field itself is dated, a confusion fed by the 1970s self-help technique that shares its acronym. As Dixon Jones, CEO of Inlinks.net, recently put it, this is a huge mistake: "When people realize NLP stands for Natural Language Processing instead of some 1970s hypno-mumbo-jumbo," he said, "they'll realize that not only is it here to stay, it's the very bedrock of the mantra organizing the world's information."

Google's NLP approach is built on a ground-breaking language processing model: BERT (Bidirectional Encoder Representations from Transformers). BERT was introduced in a recent paper by researchers at Google AI Language, and its publication caused quite a stir in the machine learning community by presenting state-of-the-art results across a wide variety of NLP tasks, including question answering (SQuAD v1.1), natural language inference (MNLI), and others.

BERT's key technical innovation is applying the bidirectional training of the Transformer, a popular attention-based architecture, to language modeling: rather than reading text strictly left to right, the model learns each word's meaning from the context on both sides of it.
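A quick way to see that bidirectional behavior is the masked-word task BERT is trained on. The sketch below uses the open-source Hugging Face `transformers` library with a public BERT checkpoint; it illustrates the training objective only, not Google's production search system.

```python
# A small illustration of BERT's masked language modeling, using the
# open-source Hugging Face `transformers` library. This is a public
# BERT checkpoint, not Google's production search model.
from transformers import pipeline

# The fill-mask pipeline asks BERT to predict the hidden word using
# the context on BOTH sides of the blank -- the "bidirectional" part.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# BERT can only rank good candidates for [MASK] because it reads the
# words before AND after it ("raised", "this quarter") together.
for result in fill_mask("The bank raised interest [MASK] this quarter."):
    print(f"{result['token_str']:>12}  score={result['score']:.3f}")
```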

While that might seem technical, BERT's impact on SEO can be put pretty simply: Google no longer looks at words or phrases in isolation, the way traditional keyword matching did. Now it looks at sentences, paragraphs, and the query as a whole. In other words, the algorithm reads for sentiment and overall intent rather than scoring individual words.
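To make that contrast concrete, here is a rough sketch comparing bag-of-words keyword overlap with whole-sentence embeddings, using the open-source `sentence-transformers` library. The model name and example texts are illustrative assumptions, not Google's actual ranking pipeline.

```python
# Contrast naive keyword overlap with whole-query embeddings.
# Uses the open-source sentence-transformers library; the model and
# example texts are illustrative, not Google's actual pipeline.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

query = "can you get medicine for someone pharmacy"
pages = [
    "How to pick up a prescription for another person",
    "How to get a job at a pharmacy",
]

# Keyword overlap treats each word independently...
def overlap(a: str, b: str) -> int:
    return len(set(a.lower().split()) & set(b.lower().split()))

# ...while embeddings score the sentences as a whole.
q_emb = model.encode(query, convert_to_tensor=True)
for page in pages:
    p_emb = model.encode(page, convert_to_tensor=True)
    sim = util.cos_sim(q_emb, p_emb).item()
    print(f"overlap={overlap(query, page)}  cosine={sim:.2f}  {page}")
```

In this toy setup, raw word overlap can favor a page that merely repeats the query's words, while the sentence-level score tracks what the query as a whole is asking.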

Google's official Search Liaison account recently revealed on Twitter that BERT now helps with one out of every ten Google searches in the US in English, with plans to expand soon to searches in more countries and languages.




