Are the words and phrases in your content perfectly optimized for both Google and your visitors?
Well, if you use Surfer, your answer is probably “yes.”
But what if I told you that we came up with a solution to make our content recommendations more accurate, more intuitive, and even more likely to get you right to the top of the SERPs?
Oh, yes, that’s exactly what happened.
I'm super excited to introduce our new algorithm for content analysis.
After months of epic brainstorming, hard work, and intense testing, we’ve developed a brand new logic that incorporates AI, Machine Learning, and the page's rendering.
But how does it work, and, most importantly, how will it help YOU get better results?
How do you perform SEO research that does its job without taking hours?
There is a ridiculous belief that writing for users conflicts with SEO.
This couldn’t be further from the truth.
To create perfect content, we have to use language that appeals to:
- our users, so that we turn them into potential customers and brand advocates,
- and Google's algorithm, which evaluates the website's quality and relevance to the given keyword. If the evaluation is positive, we have a chance to climb higher in the search results.
But this requires solid research. To prepare a content outline, SEOs have had to perform complex research and analyses to determine users' intentions and requirements while pleasing the algorithm at the same time.
Such research is filled with manual work:
- verifying competitors,
- learning new topics,
- reviewing the most relevant and frequently asked questions,
- and more.
One of the essential parts of such research is preparing a list of topics to mention and exact terms to use. But it requires hours of analyzing content, an eagle eye for details, and nearly divine comparison skills.
For me, selecting prominent words from Google's SERP (Search Engine Result Page) had always been a daunting and time-consuming task. Still, it was worth the labor. The more effort I put into my research, the more compelling and algorithm-pleasing the articles I outlined were.
The rest of the team and I decided to create Surfer to do the menial tasks for us. Its goal was, and still is, to automate, enhance, and speed up the whole research process and generate top-notch guidelines.
All in all, the advantages of using Surfer for SEO research rather than your own two hands are:
- Shorter analysis time: Surfer can crunch what would take hours to do by hand in a matter of seconds.
- Foolproof results: to err is human. Automated tools help us avoid the silly mistakes we often make when deadlines are breathing down our necks. And I’m sure every SEO knows this feeling.
We want Surfer to help you do top-quality work in a matter of minutes.
Evaluating words and phrases with Surfer
It’s almost impossible for a single human to manually evaluate every word, phrase, and sentence from the top 20 Google results, especially for search queries with long-form content.
Still, this process is not as easy as counting the number of times a given phrase is used and dividing it by the total word count. Google doesn’t treat all phrases equally for all keywords. We also have to take into account additional factors, like context or sentiment analysis.
Trusting a tool like Surfer to do the calculations for you is a much better choice.
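To see why the naive approach falls short, it helps to look at what plain keyword density actually computes. The sketch below is illustrative only (it is not Surfer's code), and it implements exactly the formula described above: occurrences of a phrase divided by the total word count.

```python
def keyword_density(text: str, phrase: str) -> float:
    """Naive keyword density: phrase occurrences divided by total word count.

    This captures none of the context, sentiment, or prominence signals
    discussed in the article; it is the baseline the new algorithm improves on.
    """
    words = text.lower().split()
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or n == 0:
        return 0.0
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return hits / len(words)

text = "Keyword density is simple. Keyword density alone is not enough."
print(keyword_density(text, "keyword density"))  # → 0.2
```

Two pages with identical density scores can read completely differently to a visitor, which is precisely why density alone cannot be the crucial factor.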
Our algorithm has undergone a serious makeover. The lists of words and phrases you should use and the questions you should answer are new and improved. The suggestions you receive in Content Editor & Surfer Audit will now be even more accurate and better suited to the SERPs of today.
We considered a ton of new factors and applied cutting-edge technologies to help you take Google by storm with your content.
I will tell you a bit more about them in the next sections.
For now, just take a look at what the recommendations in the Content Editor used to look like, and what they look like now:
It means you will get:
- More recommendations,
- More accurate recommendations,
- Far fewer irrelevant terms to manually exclude.
The three components of the semantic evaluation
I've spent plenty of time developing the best possible logic to evaluate words that appear on websites.
My goal was to deliver a solution that would no longer treat the obsolete keyword density as the crucial factor.
After all, you cannot just slap your keyword into a text a few times and wait for Google to elevate you in the SERPs. Google, just like your visitors, actually cares about what you wrote. The meaning and overall impression of the text have become vastly important.
To make our new algorithm accurately evaluate the keywords by considering semantics, it had to combine the following three elements:
- The evaluation of the visual aspect of the website,
- The use of AI and Machine Learning,
- Semantic and correlation analysis.
We considered and incorporated all of them while preparing our algorithm.
What factors does Surfer consider when assessing prominence?
There are many more factors to consider when assessing word prominence than just the number of occurrences. Here’s what our algorithm takes into account after the update:
- Word or phrase placement. To put it simply, the higher a keyword appears, the more important it is. And vice versa: the lower it’s located, the fewer points for placement it gets.
- Font size. Surfer verifies the average font size and assigns higher scores to the words & phrases written in a bigger font.
- Font weight. If a word or phrase is in a bold, strong, emphasized format, it gets a higher score.
- Font decoration. A word or phrase is more visible and essential if any sort of decoration is applied to it.
- Font opacity. Applied opacity decreases visibility and readability, so it lowers the score.
- Hidden terms. If a term is hidden, its value gets reduced.
- Links. If the term is used in an anchor that leads to another website, the value drops.
- Contrast. When the contrast is lower than average (due to a change in font or background color), then the importance of a particular word or phrase is lower.
- ALL CAPS. In the case of words written as ALL CAPS, Surfer slightly increases the score.
- Headings. Phrases placed in headings are also considered more prominent.
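To make the interplay of these signals more concrete, here is a toy Python sketch that scores a single occurrence of a term. The specific weights and multipliers are my illustrative assumptions for this example, not Surfer's actual model, which combines many more factors:

```python
from dataclasses import dataclass

@dataclass
class TermOccurrence:
    # Illustrative signals only; the real algorithm's weights are not public.
    position: float        # 0.0 = top of the page, 1.0 = bottom
    font_size: float       # in px, compared against the page average
    bold: bool
    in_heading: bool
    in_external_link: bool
    hidden: bool

def prominence_score(occ: TermOccurrence, avg_font_size: float = 16.0) -> float:
    """Toy prominence score combining a few of the signals listed above."""
    if occ.hidden:
        return 0.0                             # hidden terms lose their value
    score = 1.0 - occ.position                 # higher on the page = more points
    score *= occ.font_size / avg_font_size     # bigger-than-average font = boost
    if occ.bold:
        score *= 1.2                           # bold / strong / emphasized
    if occ.in_heading:
        score *= 1.5                           # headings are more prominent
    if occ.in_external_link:
        score *= 0.8                           # anchors to other sites drop value
    return round(score, 3)

heading_hit = TermOccurrence(position=0.05, font_size=24, bold=True,
                             in_heading=True, in_external_link=False, hidden=False)
footer_hit = TermOccurrence(position=0.95, font_size=12, bold=False,
                            in_heading=False, in_external_link=True, hidden=False)
print(prominence_score(heading_hit), prominence_score(footer_hit))
```

The same term can thus earn wildly different scores depending on where and how it is rendered, which is why the page's rendering matters so much to the new algorithm.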
Which technologies have we fueled our new algorithm with?
We don’t just use basic math to assess prominence. With the Google algorithm getting more and more sophisticated, we know our efforts have to match!
Natural Language Processing
Here’s where the real magic happens. With the "NLP" option turned on, Surfer evaluates salience for every word, phrase, sentence, and segment of all websites in Google's top results.
Machine Learning components of the semantic evaluation process help enhance analysis and provide the most accurate suggestions.
Machine Learning and AI incorporated
Our algorithm evaluates salience for detected entities using Google's BERT methodology and its API designed for NLP. Google's algorithm uses the same set of tools to understand human language.
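For intuition about what "salience" means, here is a deliberately crude, self-contained Python stand-in. Real salience scores come from trained language models, such as those behind Google's Natural Language API; this toy version merely weights a term's frequency by how early it first appears, then normalizes so the scores sum to 1:

```python
from collections import Counter

def toy_salience(text: str) -> dict:
    """Very rough stand-in for entity salience.

    Frequency weighted by first-mention position, normalized to sum to 1.
    Real salience comes from a trained model, not a formula like this.
    """
    words = [w.strip(".,!?").lower() for w in text.split()]
    counts = Counter(words)
    # Record the index of each word's FIRST occurrence.
    first = {}
    for i, w in enumerate(words):
        first.setdefault(w, i)
    raw = {w: counts[w] * (1.0 - first[w] / len(words)) for w in counts}
    total = sum(raw.values())
    return {w: round(s / total, 3) for w, s in raw.items()}

sal = toy_salience("Surfer analyzes content. Surfer scores words.")
print(sal["surfer"])  # → 0.5
```

The point of the toy formula is the shape of the output, a score per entity that reflects how central it is to the text, rather than the formula itself.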
Is the keyword density obsolete?
It’s not that easy to answer this question!
On the one hand, it would be naive to believe that Google's algorithm relies on density while so many other signals are available.
On the other hand, keywords that frequently appear in the content are the most prominent in many cases.
Therefore, page optimization based on keyword density can help you rank higher. But incorporating more advanced metrics based on Big Data analysis, AI, and Machine Learning will be an extra boost to help you leave the competition in the dust and increase your chances of climbing up to the top 10.
The algorithm is available ONLY in Surfer
After months of tests and analysis, we’re proud to say that we implemented the new and more reliable algorithm that selects the most prominent words and phrases. Yup, it’s up and running!
We were finally able to eliminate some non-intuitive sections and provide better value, with even less effort on your side.
I hope that with Surfer's help, you'll get even better results and further streamline your work!