What is Google BERT?
Before we can understand what changes, we need to be clear about what lies behind the name. Even though it looks like a proper name, BERT is actually an acronym.
The initials of Bidirectional Encoder Representations from Transformers make up the famous name we are talking about. Given how unwieldy the full phrase is, it is easy to understand why a friendlier way of referring to it was chosen.
If we start from its definition, an open-source neural network for natural language processing, we can begin to understand better what we are talking about. At the very least, we are clear that we are dealing with a system that uses artificial intelligence to improve search results.
What is BERT for?
Behind BERT lies Google’s perennial goal: giving its search engine the ability to process natural language (NLP) in the same way that users interact with other people.
This, in itself, is nothing new: we have known for years that Google “trains” its algorithm to understand users. Updates such as Google Hummingbird were already aimed at making the semantic web a reality beyond simple keywords.
We also knew that Google uses machine learning through its well-known RankBrain: artificial neural networks that simulate the behavior of the human brain, learning on their own to refine searches and offer more relevant results based on that knowledge.
So… what does BERT change?
Well, as usually happens with Google, everything is connected. The arrival of this new technology brings the semantic web and neural networks together into a formula for understanding the context of a word and, thus, interpreting it more accurately.
How does Google BERT work?
The key lies in the name itself: we are talking about a bidirectional transformer encoder. When we say it is bidirectional, we mean it can establish the relationships between all the words in a sentence, taking into account how some modify the meaning of others.
This does not mean reading the whole sentence in the usual left-to-right order and then re-reading it in the opposite direction. The bidirectionality lies in the contextualization process itself: each word is interpreted in light of all the words around it, on both sides, at once.
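To make that idea concrete, here is a minimal, self-contained sketch of bidirectional self-attention, the mechanism inside a transformer encoder. This is a toy illustration, not Google's implementation: the sentence, the random embeddings, and the single attention head with identity projections are all simplifying assumptions.

```python
import numpy as np

np.random.seed(0)
tokens = ["the", "bank", "of", "the", "river"]
d = 8                                 # embedding size (arbitrary for the demo)
X = np.random.randn(len(tokens), d)   # stand-in token embeddings

# Single-head self-attention with identity projections, for brevity:
# score every pair of tokens, then normalize each row with softmax.
scores = X @ X.T / np.sqrt(d)
weights = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)

# Each contextual vector is a weighted mix of the WHOLE sentence.
contextual = weights @ X

# "bank" (index 1) receives nonzero weight from tokens on both sides,
# including "river" to its right -- that is the bidirectionality: the
# word is contextualized by its left and right neighbors in one pass.
print(weights[1].round(3))
```

A left-to-right model could only let "bank" see "the"; here the attention row for "bank" also assigns weight to "of", "the", and "river", which is what lets the encoder disambiguate the word.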
How does it affect SEO?
It is not difficult to see that it affects it decisively. In the example just mentioned, and in the others Nayak refers to in his post, we see how the keyword by itself ceases to have an isolated value.
Until now, we have assumed that users search with economy of effort in mind: better “cheap hotel Madrid” than typing “hotel that is not too expensive in Madrid capital”. Virtual assistants have changed how people search: voice search is now a reality.
Even so, a high percentage of people continue to type searches that drop articles and prepositions.
In part, this is a matter of habit, but Google points to another interesting fact: 15% of searches do not correspond to anything that has been searched before; they are entirely new. This means that not even the user can fully articulate a complex search, and the engine needs to rely on context to resolve it.
We will have to watch the impact closely. Predictably it will be significant globally, and even more so as languages beyond English are added, as is expected to happen progressively.