Google recently announced the rollout of its “Bidirectional Encoder Representations from Transformers,” colloquially referred to as BERT. This is arguably the most significant natural language processing (NLP) update to Search since Google released RankBrain in 2015. According to Google, it is a massive change that will have an impact on roughly 10% of all search queries.
Google’s Vice President of Search Pandu Nayak said: “We see billions of searches every day, and 15 percent of those queries are ones we haven’t seen before — so we’ve built ways to return results for queries we can’t anticipate.”
BERT is Google’s response to the complex and conversational queries comprising that 15 percent.
WHO IS BERT AND WHEN DID HE ENTER THE SCENE?
BERT is Google’s open-source technology for natural language processing that allows anyone to train their own question-answering systems.
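If you’re curious what that open-source technology looks like in practice, here is a minimal sketch using the Hugging Face transformers library (an assumed, freely available wrapper — not Google’s internal search system) with a public BERT checkpoint that has already been fine-tuned for question answering:

```python
# Minimal question-answering sketch with a publicly available BERT model.
# Assumes: pip install transformers torch. The model name below is a public
# checkpoint fine-tuned on SQuAD, used purely for illustration.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="bert-large-uncased-whole-word-masking-finetuned-squad",
)

answer = qa(
    question="Which queries does the BERT update currently affect?",
    context="The BERT update currently affects U.S. English language search queries.",
)

# Prints the extracted answer span and the model's confidence score.
print(answer["answer"], answer["score"])
```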
Essentially, Google is getting better at understanding the true nature of a user’s intent in order to serve more relevant results.
Using the BERT model, Google can now consider the full context of a word by looking at the words that come before and after it. Gone are the days of worrying about the exact combination or phrasing of the words you use in your searches.
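You can get a feel for that bidirectional behavior with a small, hedged sketch (again assuming the Hugging Face transformers library and the public bert-base-uncased checkpoint, purely for illustration). BERT’s masked-word task predicts a hidden word from the words on both sides of it, which is why a small word like “to” in Google’s example query “2019 brazil traveler to usa need a visa” can change the meaning of the whole search.

```python
# Masked-word prediction: BERT reads the words BEFORE and AFTER the blank.
# Assumes: pip install transformers torch (illustrative only).
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-uncased")

# Mask the direction word in Google's example query and let BERT guess it
# from the surrounding context on both sides.
for prediction in fill("2019 brazil traveler [MASK] usa need a visa")[:3]:
    print(prediction["token_str"], round(prediction["score"], 3))
```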
The rollout began over this past weekend and is expected to be complete by the end of this week. Currently, the update only affects English-language queries in the U.S.
WHAT TO KEEP IN MIND FOR YOUR SEO STRATEGY
Don’t panic! There is nothing you can really do to “optimize” for this update, as it analyzes search queries — not webpages.
However, here are some post-BERT takeaways to keep in mind:
- Create content for humans, not search engines
- Know your audience and how they’re searching
- Craft purposeful content with a clear focus
- Provide ample context
BERT is only in its initial stages, so continue monitoring your traffic and keyword rankings closely, identify any anomalies, and look for opportunities to improve your top landing pages.