This morning Google announced that, starting earlier this week, it has been pushing out what it calls the “biggest leap forward in the past five years, and one of the biggest leaps forward in the history of search.” That is the BERT algorithm update, which is Google’s way of better understanding one out of ten queries, the way humans understand them.
I covered it in detail at Search Engine Land, but let me sum it up in bullet-point fashion for quick readers:
(1) BERT began rolling out earlier this week and will be fully live by the end of this week. It is live for English language queries.
(2) It also impacts featured snippets in a big way, and unlike core search, not just for English-language featured snippets but for many different languages.
(3) BERT is a lot like RankBrain, in that it is a machine learning algorithm that aims to better understand queries and the content on a page.
(4) BERT technically is a neural network-based technique for natural language processing (NLP) pre-training. BERT stands for Bidirectional Encoder Representations from Transformers. Google wrote about it in more detail last year over here.
(5) BERT allows Google to understand more human-like queries, queries and content that are more natural and conversational. BERT helps Google understand the nuances and context of words in searches and better match those queries with more relevant results.
(6) Google says this is massive: it impacts 10% of all queries, and Google said it is its biggest step forward for search in the past five years and one of the biggest steps forward in the history of search altogether.
(7) We definitely noticed changes this weekend and midweek, but did it feel as big as Panda or Penguin (I know they are totally different beasts)? No, it did not. Why? I assume because SEOs can’t optimize for it 🙂 and thus it feels smaller to SEOs.
(8) Google said it tested BERT extensively and the company is seeing significant benefits from using it.
(9) It does not replace RankBrain or other language algorithms; it can be used in conjunction with them.
(10) I assume you cannot optimize for BERT, just as you cannot optimize for RankBrain: just write for humans.
(11) Neural matching is different as well; see this story.
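To make the “pre-training” point above a bit more concrete, here is a toy sketch (in Python) of the masked-language-model idea BERT is pre-trained with: hide some words in a query and predict them from both the left and right context, which is the “bidirectional” part. This is illustrative only, not Google’s code; the real model uses WordPiece tokens, a roughly 15% mask rate, and a Transformer network, and the `mask_tokens` helper and its mask rate here are my own assumptions for the sketch.

```python
import random

def mask_tokens(tokens, mask_rate=0.3, seed=0):
    """Replace roughly mask_rate of tokens with [MASK] and record the
    positions a model would have to predict from surrounding context."""
    rng = random.Random(seed)
    masked, targets = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_rate:
            masked.append("[MASK]")
            targets[i] = tok  # the hidden token the model must recover
        else:
            masked.append(tok)
    return masked, targets

# One of the query examples Google shared for BERT:
query = "can you get medicine for someone pharmacy".split()
masked, targets = mask_tokens(query)
print(" ".join(masked))  # the sequence the model sees
print(targets)           # the words it must recover from context
```

The point of the exercise: a model trained this way has to use words on both sides of the blank (like “for someone” near “medicine”) to fill it in, which is why BERT is better at prepositions and nuance than bag-of-words matching.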
Here are some before and after examples of BERT in action; note, these results may not be live because these screenshots were taken earlier and search results change rapidly:
And a featured snippet example:
BERT Doesn’t Feel Huge
Again, the chatter and tracking tools have shown changes this weekend and midweek, but it is not as significant as a core update or other major updates, at least from an SEO’s perspective, at least not yet. This is based on the signals I track…
Bill Lambert Right Again?
The Bill Lambert character seems to have been spot on in calling this a game changer, and was spot on about when it would roll out. Even when I said the chatter wasn’t huge, he stuck to his guns and said the update was going on now. Just interesting, don’t you think?
Early SEO Reaction
Here are some tweets:
Unfortunately however, some copywriters don’t even have a point to make in their posts and pages. Focus for humans and search engines will understand. Vagueness is problematic for everyone / thing https://t.co/6GFsZ76Dls
— Dawn Anderson (@dawnieando) October 25, 2019
Bert training ⬇️⬇️ pic.twitter.com/mxXr3vBWSU
— Gianluca Fiorelli (@gfiorelli1) October 25, 2019
Upcoming publications in the SEO community: “How to get out of BERT”
— Pedro Dias: ~/pedro$ (@pedrodias) October 25, 2019
So far we have very little chatter in the community, and Googlers have not yet responded to BERT-related questions on Twitter. I did, however, ask Google numerous questions via email for this post and my Search Engine Land article, and Google replied to all of them.
Forum discussion at WebmasterWorld.