This article is the second part of a four-part series that will address the following:
- What is BERT in understandable language?
- How BERT affects SEO and content marketing
- The future of BERT and machine learning
- Strategies for adapting alongside BERT’s changes (COMING SOON)
Back in December, Google officially launched an update to its algorithm far larger than the usual core updates. We were told that it wouldn’t change much, but in reality, the update caused shockwaves across the board in terms of SEO, page rankings and analytics.
That was BERT, and its unveiling made a big splash. Google reps have attempted to reassure us that we don’t need to change much. And to be honest, the lack of guidance around page optimization in terms of BERT has been a little disconcerting.
The bottom line is that BERT will absolutely affect how you deploy your content marketing strategy, your page-ranking priorities, your SEO relationships and your overall organic traffic.
Today I want to jump ahead and address tangible ways your marketing strategy should account for BERT. Later, I’ll get into why these effects are necessary (what BERT is doing quantitatively) and what we can expect from future versions of BERT.
BERT and Ernie?
You gotta love when Google makes you smile. No, BERT is not at all related to the lovable Sesame Street character. Instead, it’s an acronym for Google’s new search query algorithm. The new algorithm builds on decades’ worth of machine learning and AI research, but it still takes into account around 200 factors for SERPs and page prioritization.
It actually stands for Bidirectional Encoder Representations from Transformers, which in layman’s terms might not mean a lot. In essence, this algorithm has been programmed differently from other machine learning algorithms, mainly in that it learns from sentence context. If you’d like to learn more about the ins and outs of BERT’s development, I recommend reading our previous post here.
The important takeaway is that, because of BERT, our queries should shift toward contextual language as opposed to keyword language. If you’re in charge of SEO or your company’s marketing strategy, or if your blog took a hit during this update, you’ll want to make some changes.
Change Your SEO Strategy… sort of
If you’ve stayed up to date with search query trends over the last few years, or perhaps you’re a die-hard Toolbar PageRank historian, you might be familiar with the wide range of ranking factors that Google and similar search engines take into account when ordering search results.
In terms of content marketing, BERT should make decisions based on the same factors as before. You should not need to scrap your whole marketing plan, but you must make some adjustments so that your pages continue to rank in the long run. You’ll also want to take this into account when developing new plans for 2020.
I very much enjoyed Brian Dean’s article on Backlinko that clearly broke down the 200 or so Google page ranking factors for 2019. It helps that he laid out each factor by category or theme. There are nine major theme factors to consider:
- Domain Factors
- Page-Level Factors
- Site-Level Factors
- Backlink Factors
- User Interaction
- Special Google Algorithm Rules
- Brand Signals
- On-Site Webspam Factors
- Off-Site Webspam Factors
I haven’t had the chance to verify which, if any, of the factors have changed drastically for 2020, but at first glance, the list appears consistent with last year’s. Monitoring web traffic through these categories can be useful when you want to isolate a particular branch of your marketing family tree.
Page-level factors are high-level factors and the level on which BERT affects sites the most. The three I want to focus on are Latent Semantic Indexing (LSI) Keywords in Content, LSI Keywords in Title and Description Tags, and Page Covers Topic In-Depth.
Latent Semantic Indexing Keywords in Content (LSI)
Latent Semantic Indexing Keywords in Content (LSI) is the hard core of BERT. If you take nothing else away from this article, or from the effects of BERT on your SEO, it’s that LSI is super important. This is basically what BERT was intended to remedy.
LSI is a task essential to natural language processing (NLP – the field BERT comes from). LSI relies on distributional semantics, which means the AI analyzes the order, relationships and (to an extent) purpose of language sets across a wide variety of documents. For our purposes, documents can also refer to web pages. LSI creates concepts around language sets, similar to themes or tags, so that it can catalog documents based on word meaning and the meaning of the words close to those words.
So, yes, LSI can spit out a confusing matrix of relationships, and luckily we can use that. Just so you’re aware, LSI also relies on TF-IDF (term frequency–inverse document frequency), which measures how often a keyword appears on a page, weighted against how many other pages across the web talk about that term.
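To make the TF-IDF idea concrete, here is a minimal sketch in Python. The tiny three-page “web” and the keyword are hypothetical, and real search engines use far more elaborate weighting, but the core arithmetic looks like this:

```python
import math

def tf_idf(term, doc, docs):
    """Weight `term` for `doc`: frequent on this page, rare across the corpus."""
    words = doc.lower().split()
    tf = words.count(term) / len(words)                    # term frequency on this page
    pages_with_term = sum(term in d.lower().split() for d in docs)
    idf = math.log(len(docs) / (1 + pages_with_term))      # discounts terms every page uses
    return tf * idf

# Hypothetical three-page corpus: the rarer a term is across pages,
# the more it distinguishes the page that uses it.
docs = [
    "cockatiels can talk and whistle",
    "how to teach a cockatiel new tricks",
    "best cages for small parrots",
]
score = tf_idf("talk", docs[0], docs)  # positive: only one page uses "talk"
```

Note the smoothing term `1 + pages_with_term`: a word that shows up on every page (the web’s equivalent of “the”) ends up with a weight at or below zero, which is exactly the kind of spam resistance this factor is after.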
BERT is designed to take in contextual, semantic language. It builds upon LSI by taking user intention into consideration. And I know what you’re thinking – no, a machine does not have the ability to relate to intention, but it is continually being programmed to try to decipher and catalog different types of human intention.
When you run a page through the “BERT machine”, so to speak, it begins running its algorithm. For BERT, reading a sentence means categorizing words as keywords, related topics, related words in the sentence and concepts on that page, cross-referencing all of this with other web pages, and finally starting to understand common reasons why someone would ask Google about cockatiels (because they talk!).
We can start to see how changing one simple aspect of the algorithm has a trickle effect. Things as simple as keyword density can be affected by BERT just because the priority of keywords has changed.
Throwing a keyword onto your web page won’t boost your SERPs by proxy of keyword density. This is one of the things that BERT aims to change: keyword spam. Similar to backlink spam, keyword spam wastes space in our search results because we have to sift through all the (let’s be honest) worthless content.
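Keyword density itself is trivial to compute – it’s just the share of words on the page that match the keyword. This hypothetical sketch shows why the metric is so easy to game, and why BERT’s shift toward context makes raw density a weak signal on its own:

```python
def keyword_density(text, keyword):
    """Fraction of the page's words that are exactly the keyword."""
    words = text.lower().split()
    return words.count(keyword.lower()) / len(words)

# Two made-up snippets: one written for readers, one stuffed for the crawler.
natural = "cockatiels are social parrots that learn to mimic speech"
stuffed = "cockatiels cockatiels buy cockatiels cheap cockatiels sale cockatiels"

low = keyword_density(natural, "cockatiels")   # 1 of 9 words
high = keyword_density(stuffed, "cockatiels")  # 5 of 8 words
```

Stuffing inflates the number more than fivefold, but post-BERT that inflation alone no longer translates into better rankings – the surrounding context has to support the keyword.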
Now, Google is actually prioritizing good content.
LSI Keywords in Title and Description Tags
So, now your content has changed to account for semantic structure, intention and context. Let’s make that same switch in our titles and meta descriptions. As long as you understand what LSI means for your content, the switch to LSI in your metadata will happen organically.
If we’re adjusting for context, we want to make the same adjustments to most page-level factors. Title tags, H1 tags, meta descriptions and keyword frequency should all take your keyword context into account.
You’ll want to comprehensively and continually improve your metadata to make sure your strategy is aligned here.
Page Covers Topic In-Depth
I haven’t really found an authority out there who has touched this factor yet. But now that I think about it, covering a topic in-depth improves your overall value in LSI, metadata and so forth.
Okay, so maybe I should switch this up… covering a topic in-depth is the hard core of BERT. If you take nothing else away from this article, or from the effects of BERT on your SEO, it’s that in-depth topics are super important. This is basically what BERT was intended to remedy.
When you cover a topic in-depth you’re doing two really important things: you’re telling the reader that they deserve to know X amount about a topic, and you’re providing it in a comprehensive way.
There are obviously right ways and wrong ways to cover a topic in-depth. For example, too much academic jargon might make readers feel bad about themselves, or remind them of that time in 8th-grade biology when their teacher made them feel bad for not knowing about osmosis… The risk is that covering a topic too in-depth can push a negative emotional effect onto the reader, which is something marketers usually want to avoid, or at least control.
An in-depth article strikes a balance between informative, accessible and approachable. This helps readers get on the same page as you, and it may encourage them to open a conversation with your business. They may feel they can trust your company because they know what you sell and why the services or products you offer are worth owning.
You Can Relax a Little
If I’m being honest, the more I analyze BERT’s ripple effects, the calmer I become. Before, when we looked at our marketing strategies, the basic tasks of boosting SERPs sometimes clashed with what the client actually wanted. As a marketing team, we were forced to rationalize off-balance content to mitigate the way the algorithm read our pages based on numerous factors. In many ways, our content could become fragmented or simply off in tone from what a client wanted to say.
With BERT, I feel a little better knowing that high-quality content is now being respected. No longer should marketing strategists feel they need to lower content quality in order to gain respect, views and positive ROI.