What Does BERT Mean For SEO?


You’ve probably heard about BERT, Google’s new algorithm update, by now.

BERT, which stands for Bidirectional Encoder Representations from Transformers, helps Google understand and interpret natural language by improving the search engine’s comprehension of context.

As a language model, BERT was “pre-trained” on the entirety of English Wikipedia, a corpus of roughly 2.5 billion words. With BERT, the search engine can now pick up on the nuances of longer, more conversational queries, helping Google return results that match those queries more precisely.

Google estimates that BERT will affect roughly 10% of queries, with knock-on effects for organic rankings and featured snippets.

So it’s no small change for the SEO industry, and the AI world is buzzing too. After all, BERT is not only an advanced natural language processing framework but also open source, giving the entire field a chance to accelerate its own natural language research with this powerful tool.

While it was pre-trained on Wikipedia, BERT is fine-tuned on question-answering datasets, and it has performed remarkably well. In fact, BERT beats the human performance benchmark on SQuAD (the Stanford Question Answering Dataset), one of the most widely used natural language understanding benchmarks. Both Microsoft and Facebook are already developing their own extensions of BERT.
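
To get a feel for what that fine-tuning produces, here’s a minimal sketch of extractive question answering in Python, using the open-source Hugging Face transformers library and a publicly released BERT checkpoint fine-tuned on SQuAD. This is an illustration of the technique only, not Google’s production setup, and the checkpoint name is simply one public example.

```python
# A minimal sketch of extractive question answering with a BERT model
# fine-tuned on SQuAD, via the Hugging Face transformers library.
# The checkpoint below is a public example; Google's production models differ.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="bert-large-uncased-whole-word-masking-finetuned-squad",
)

result = qa(
    question="What does BERT stand for?",
    context=(
        "BERT, which stands for Bidirectional Encoder Representations "
        "from Transformers, helps Google interpret natural language."
    ),
)

print(result["answer"])
# -> "Bidirectional Encoder Representations from Transformers"
```

The model reads the question and the passage together, then points at the span of text most likely to contain the answer, which is exactly the skill SQuAD measures.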

What BERT Does

Words, when they stand alone out of the context of a sentence or conversation, can be very misleading.

Almost every word in the English language has multiple meanings. People can generally tell which meaning is intended from the context it appears in. Take the word “like”: it can be used as a verb, noun, or adjective, and on its own it has no fixed meaning; it takes its meaning from the words that surround it.

Human speakers have no problem figuring out which version of “like” is being used, but ambiguous phrasing can throw search engines and machines for a loop. And the longer the sentence, the harder it becomes for a machine to keep track of which meaning applies where.

BERT helps machines not only understand what words mean in context but also handle longer sentences and queries, leading to more accurate, more specific search results for users.

One of the most intriguing things about BERT is that it works bidirectionally. Earlier language models could only read text in one direction, which made context hard to grasp. BERT’s bidirectional language modelling means it reads the whole sentence in both directions, weighing every word against all the others at once. This can be key for unassuming words like pronouns, or connectors such as “to” or “for.”
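
To make that concrete, here’s a small Python sketch (again using the open-source Hugging Face transformers library, purely as an illustration rather than anything Google runs) showing that BERT gives the same word a different vector depending on the sentence it appears in, which a one-vector-per-word model cannot do.

```python
# A small sketch: the same word gets a different contextual vector from BERT
# depending on the surrounding sentence. Uses the public bert-base-uncased
# checkpoint from Hugging Face; this is an illustration, not Google's setup.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def vector_for(sentence: str, word: str) -> torch.Tensor:
    """Return BERT's contextual embedding for the first occurrence of `word`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # one 768-dim vector per token
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    return hidden[tokens.index(word)]

like_verb = vector_for("I really like this article.", "like")
like_prep = vector_for("She ran like the wind.", "like")

# The two "like" vectors are similar but not identical, because BERT reads
# the whole sentence in both directions before deciding what "like" means here.
print(torch.cosine_similarity(like_verb, like_prep, dim=0).item())
```

The printed similarity is below 1.0: even though the surface word is identical, BERT encodes the verb “like” and the preposition-style “like” differently because it has read the full sentence around each one.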

What Does This Mean for Your SEO?

While some sites may see an impact on their rankings, there’s not much you can do to game BERT. Unfortunately, more accurate SERPs (search engine results pages) mean some pages will lose views they previously picked up from mismatched queries. But it also means that the people who do find your page are probably actually looking for it, which should have a positive effect on your conversion rates.

It also means you don’t need to rely so heavily on repeating keywords throughout your web copy, whether that’s blog posts or pages on your website, because Google will understand your content better and be better able to serve it up to the people looking for it.