How BERT Will Affect SEOs in 2020

Whether you're just beginning to learn about SEO or you're an active professional, it's not difficult to see that the industry changes quickly. And one of the current shifts we're witnessing is a lot of commotion around BERT. No, I'm not talking about the iconic Sesame Street character (though they might be happy about the search increase). I mean Bidirectional Encoder Representations from Transformers (BERT).

So if you are curious to learn a little bit more about BERT, or how to explain it to clients, then read on for a quick recap of what we know today.

What is BERT?

Speaking technically, BERT is a pre-trained, unsupervised Natural Language Processing (NLP) model, which is akin to a rocket booster for NLP.
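To make "pre-trained and unsupervised" a little more concrete, here is a minimal sketch of the kind of task BERT learns during pre-training: predicting a hidden word from the context on both sides of it. The Hugging Face transformers library and model name used below are illustrative choices on my part, not tools covered in this article.

```python
# Illustrative sketch only: the `transformers` library and the
# "bert-base-uncased" model are example choices, not part of this article.
from transformers import pipeline

# Load a publicly released pre-trained BERT and ask it to fill in a blank.
unmasker = pipeline("fill-mask", model="bert-base-uncased")

for prediction in unmasker("Searchers want results that match their [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 3))

# BERT scores candidate words using context from both sides of the blank,
# which is the "bidirectional" part of its name.
```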

To better understand BERT, it helps to grasp the broader context of where it came from and where it's going. Algorithm announcements like this within the SEO industry often resemble a single still frame from a movie. Without seeing the scenes before and after, it's difficult for the public to truly understand the meaning of that isolated frame.

With this BERT announcement on the table, let’s step back to better understand its significance and what events led to its development.

Natural Language Processing 101

Computers have always struggled to understand language. They conquered tasks like storing and entering text, but understanding language itself proved much harder. That is, until NLP came along. In this field, researchers developed specialized models to solve specific areas of language understanding, such as named entity recognition, classification, sentiment analysis (determining whether a statement is positive or negative), question answering, and so on.

Those language understanding problems were traditionally solved by individual models, each designed for a specific language task, much like kitchen utensils each handling a specific cooking-related job: a wide array of NLP models was built up, one for each language problem. Although each utensil/model can complete its own particular task very well, it is limited in that it can only achieve that singular task.
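As a quick illustration of this "one utensil, one task" era, here is a minimal sketch using NLTK's VADER sentiment analyzer, a single-purpose tool that is my example choice rather than anything discussed in this article:

```python
# Illustrative sketch: NLTK's VADER analyzer stands in for the classic
# single-purpose NLP model; it is not a tool discussed in this article.
import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

nltk.download("vader_lexicon")  # one-time download of the sentiment lexicon

analyzer = SentimentIntensityAnalyzer()
print(analyzer.polarity_scores("This update is a monumental advancement!"))
# -> {'neg': 0.0, 'neu': ..., 'pos': ..., 'compound': ...}

# This "utensil" handles sentiment analysis well, but it cannot answer
# questions, classify topics, or recognize named entities.
```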

Now consider BERT.

BERT combines 11 of your most frequently used utensils, or in this case NLP solutions, into one be-all, end-all kitchen utensil that can successfully execute each task.

This is why people are so excited about BERT. With this shift, they no longer have to deal with the clutter of managing individual models. Instead, BERT, as a single model, can understand language more holistically once it's fine-tuned. This exciting differentiation in the NLP space solves 11 of the most common NLP tasks, which is why Google chose to incorporate it into its algorithm.
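To show what "one model, many tasks" looks like in practice, here is a minimal sketch, assuming the Hugging Face transformers library (again an illustrative choice, not something covered in this article and certainly not Google's production setup), of the same pre-trained BERT backbone being loaded behind two different task-specific heads. Each head would then be fine-tuned on labeled data for its own task.

```python
# Illustrative sketch: the `transformers` library and model name are
# example choices; this is not how Google deploys BERT in Search.
from transformers import (
    BertTokenizer,
    BertForSequenceClassification,  # e.g. sentiment or topic classification
    BertForQuestionAnswering,       # e.g. question answering
)

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

# The same pre-trained weights sit underneath both task-specific models;
# only the small "head" on top changes from task to task.
sentiment_model = BertForSequenceClassification.from_pretrained("bert-base-uncased")
qa_model = BertForQuestionAnswering.from_pretrained("bert-base-uncased")

inputs = tokenizer("BERT reads text in both directions at once.", return_tensors="pt")
outputs = sentiment_model(**inputs)  # raw logits until the head is fine-tuned
print(outputs.logits.shape)          # -> torch.Size([1, 2])
```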

But Wait: Don't Hype This Up Just Yet. Here's What Matters.

Now that we have a better grasp of the plot that led to BERT's release, Natural Language Processing researcher and University of Chicago professor Allyson Ettinger kindly shed some light on this innovation's future:

“I think we’ll be heading on the same trajectory for a while building bigger and better variants of BERT that are stronger in the ways that BERT is strong and probably with the same fundamental limitations.”

But it’s vital to understand that this is a perspective for the future.

The primary takeaway from my discussion with Allyson was that it's essential not to over-hype BERT. A lot of commotion followed its release, but in reality, it's still far from understanding language and context the way humans do. And while that might sound deflating, it doesn't take away from the fact that BERT is a monumental advancement in NLP and Machine Learning.

So what should SEOs and Digital Marketing professionals care about today? The only way to improve your website's ranking in light of this update is to write truly great content that fulfills the intent users are searching for.

A question that keeps coming up around this announcement is how to optimize for BERT. Well, the answer is: you can't. But to keep you occupied until there's an opportunity for more action, here are some of my favorite resources on the topic of NLP and BERT.

Read More: Can AI-Fueled Digital Marketing Change Landscape and Fuel Growth?

Britney Muller

Britney Muller is a Senior SEO Scientist at Moz.
