Google's BERT Update
About a week ago, Google announced an update to its search algorithm, which it has named BERT.
For the duration of this post, we will call the update by its first name: BERT.
BERT is primarily a language-understanding update centered on the intent behind a user's query. Google has noted the update is aimed at helping with longer, more conversational queries. It is Google's attempt to better understand the context of a search query rather than just the individual keywords or search terms.
BERT gets its name from the technology behind it, a relatively new neural network architecture called "Bidirectional Encoder Representations from Transformers" – or, as we have been saying, BERT. The update also marks the first time Google is using its latest Tensor Processing Unit (TPU) chips to serve search results. BERT is pretrained with a technique called masking, where a random word in a sentence is hidden. Since BERT is bidirectional, it looks at the words both before and after the hidden word to predict what the masked word is. It does this over and over again until it can reliably predict the masked words.
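To make the masking idea concrete, here is a minimal sketch in plain Python of the first half of that process: hiding one random word in a sentence. (This is only an illustration of the masking step, not Google's or BERT's actual implementation; the function name and `[MASK]` token follow the common convention from the BERT paper.)

```python
import random

def mask_random_word(sentence, mask_token="[MASK]"):
    """Hide one random word in a sentence, as in masked-language-model pretraining.

    Returns the masked sentence, the hidden word, and its position.
    """
    words = sentence.split()
    idx = random.randrange(len(words))  # pick a random position to hide
    target = words[idx]                 # the word the model must predict
    words[idx] = mask_token
    return " ".join(words), target, idx

masked, target, idx = mask_random_word("the cat sat on the mat")
print(masked)  # one word is replaced by [MASK]
print(target)  # the hidden word the model is trained to recover
# During pretraining, the model sees the words on BOTH sides of [MASK]
# (bidirectional context) and learns to predict the hidden word.
```

The prediction half – actually guessing the hidden word – is what the neural network itself learns during pretraining; the masking shown here is just the data preparation that makes that training possible.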
So, how do you optimize for BERT?
Well, at this moment, we can't. As Google has said time and time again: just write good-quality content for your users. Here are a few tips on how to do that:
- Create content for the user first (search engine second).
- Don’t create content that is clearly keyword-stuffed and over-optimized.
- Create content for every stage of your buyer's journey.
- Don’t use duplicate content.
Have you seen a fluctuation in your rankings or website traffic since the announcement of BERT? Connect with me on socials @digitalsargeant and let me know!
Special shoutout to Britney Muller for her outstanding presentation on BERT on Moz's Whiteboard Friday!