Content optimization is a complex task which, due to a lack of tools, is most often based on gut feeling and experience. We introduce Surfer, which opens the gate to optimization based solely on data and facts.
To start, just type your keyword into the main input on the user page and hit enter. You can choose the search location by clicking on the flag or country – the results will accurately match a real search from the given location. If you can't see your location, don't worry – just drop us an email at firstname.lastname@example.org and we will add it immediately.
This time I wanted to analyze pages for Black Friday Online.
The analysis takes around 30 seconds to complete. When we let you know it's ready, just click on it to open the analysis view. You will be able to compare over a hundred ranking factors across the highest-ranking competitors, including:
The first impression may be a bit overwhelming, so let me explain what’s going on.
Main application view.
The central section of the interface is occupied by the chart. It shows the ranking position on the X axis and the factor value on the Y axis.
Below the chart you will find a list of the URLs that were analyzed. Next to each URL you can see two icons: compare and exclude from chart.
The left side of the interface contains a list of factors, grouped into three main sections:
The top section shows the factors visible on the chart with their corresponding color labels. You can dismiss factors from the list or keep them at hand.
Recent factors list with color indicators.
You will also notice a red/yellow/green indicator next to each factor, showing correlation strength.
Better correlation means a stronger relation between the factor and the position in the SERP. But think of it as a suggestion rather than a definitive answer – it's just one of many hints about which factors you might want to pay attention to.
You can find out the correlation strength by looking at the indicator next to the factor name.
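Surfer doesn't say how the indicator is computed; if you're curious, a rank correlation such as Spearman's is the usual way to relate SERP position to a factor value. Here's a minimal Python sketch – an illustration of the concept only, with invented position and word-count data:

```python
# Sketch: estimate how strongly a factor tracks SERP position using
# Spearman's rank correlation. This is an assumption about what a
# correlation indicator could be based on; the data below is invented.

def rank(values):
    """Return the rank (1 = smallest) of each value, averaging ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average rank for a run of tied values
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(xs, ys):
    """Pearson correlation of the ranks of xs and ys."""
    rx, ry = rank(xs), rank(ys)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

positions = [1, 2, 3, 4, 5]                  # SERP positions
word_counts = [2100, 1900, 1700, 1400, 900]  # factor values (invented)
rho = spearman(positions, word_counts)
print(round(rho, 2))  # -1.0: higher-ranking pages have more words here
```

A value near -1 or +1 would map to a green indicator, a value near 0 to a red one.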
If you want to check how your own page is doing, you can compare it with competitors. Find your page in the search results list or, if it's not there, type its address into the custom URL input. Then just click the chart icon, select a factor, and you are ready to go. This setup is perfect for finding missing keywords or over-optimization.
Please keep in mind that, for now, comparison is enabled for a single factor only. Lifting this limitation is on our roadmap.
You can enter a custom URL here if you don't rank in the TOP 50 yet.
As you can see on the following chart, some additional partial matching keywords on the currently selected site could really help here and make the page rank higher. This kind of insight lets you quickly tweak any available factor.
Partial matching keywords count for the 20th URL in the SERP.
The first input can be used to filter URL addresses by any phrase. It will highlight matching pages on the chart and filter the URL list underneath. One example use is determining how many competitors include the keyword in their URL. It also lets you quickly find your own website when you need to compare it.
Chart view with filtered results.
The Averages switch toggles between the averaged and detailed chart. By default it groups results in tens, but you can change this number to anything you like.
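In case you're wondering what the averaged view does under the hood, it's presumably just a bucketed mean over positions. A small Python sketch with invented factor values:

```python
# Sketch: group a per-position factor series into fixed-size buckets and
# average each bucket, roughly what the Averages switch appears to do.
# The factor values below are invented for illustration.

def bucket_averages(values, size=10):
    return [
        sum(chunk) / len(chunk)
        for chunk in (values[i:i + size] for i in range(0, len(values), size))
    ]

# Factor value (e.g. word count) for positions 1-20, made up:
factor = [2000, 1900, 1850, 1800, 1700, 1650, 1600, 1500, 1450, 1400,
          1300, 1250, 1200, 1150, 1100, 1050, 1000, 950, 900, 850]
print(bucket_averages(factor, size=10))  # one average per ten positions
```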
The Outside links switch changes the way we calculate factors. When it's turned on, we don't count any words placed directly or indirectly inside <a> tags. It's useful for pages with many links, such as e-commerce sites. It's enabled by default, since we believe Google penalizes content inside anchors, but your mileage may vary on different searches.
Outside links turned on.
Outside links turned off.
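To make the idea concrete, here's my own rough reimplementation of that concept in Python using the standard library's HTML parser – counting only the words that sit outside <a> tags. This is an assumption about the metric, not Surfer's actual code:

```python
# Sketch: count words outside <a> tags, approximating what the Outside
# links switch seems to do. Hypothetical reimplementation for illustration.
from html.parser import HTMLParser

class OutsideLinkCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.anchor_depth = 0  # > 0 while inside an <a> ... </a>
        self.words = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.anchor_depth += 1

    def handle_endtag(self, tag):
        if tag == "a" and self.anchor_depth > 0:
            self.anchor_depth -= 1

    def handle_data(self, data):
        if self.anchor_depth == 0:  # skip text placed inside anchors
            self.words += len(data.split())

html = '<p>Great deals for <a href="/bf">Black Friday</a> shoppers</p>'
counter = OutsideLinkCounter()
counter.feed(html)
print(counter.words)  # 4 – "Black Friday" inside the anchor is not counted
```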
Let me walk you through a complete analysis workflow. I'll show you a use case where I created subpage content based on signals from the app.
I was working on a brand new subpage, so it was crucial to know how many words I should use, how to place keywords without over-optimization, and how to build an optimal content structure.
I started by determining the optimal structure for the content.
To do that, I checked the average factor values from the top 10 results. Based on my experience, I decided to go with word count and the number of headings, paragraphs, and images.
I wrote down the numbers and created a structure which matched them closely.
Word count and the number of headings, paragraphs, and images were chosen to determine the optimal content structure.
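The averaging step itself is simple enough to sketch. Here's how you could turn top-10 factor values into structure targets in Python – all the numbers below are made up for illustration, not the real data from this analysis:

```python
# Sketch: derive content-structure targets by averaging factor values
# across the top 10 results. The per-competitor numbers are invented.

top10 = [
    # (words, headings, paragraphs, images) per top-10 competitor
    (1800, 9, 22, 6), (1500, 8, 18, 4), (2100, 11, 25, 7),
    (1700, 9, 20, 5), (1600, 7, 19, 3), (1900, 10, 23, 6),
    (1400, 6, 16, 4), (2000, 10, 24, 5), (1750, 8, 21, 5),
    (1650, 8, 20, 5),
]

names = ("words", "headings", "paragraphs", "images")
targets = {
    name: round(sum(page[i] for page in top10) / len(top10))
    for i, name in enumerate(names)
}
print(targets)  # rough structure targets to match in the new subpage
```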
Once the structure was set, it needed to be filled with content. Thanks to Surfer, it was very easy to figure out how many exact and partial matching keywords I should use.
After a while of browsing the chart and going through many different factors, I found that for this particular search phrase, using more partial matching keywords would probably rank better than stuffing the page with exact matches.
For this particular phrase, using more partial matching keywords is better than stuffing with exact matches.
You could just check the keyword count within the body, but I decided to dig a bit deeper. Instead of placing keywords at random, I looked up which elements should contain them and determined the optimal keyword density. I checked how many partial matching keywords the top pages have in their meta titles and headings, and used the partial matching keywords ratio in paragraphs factor to find the right density.
Surfer makes it easy to see how many keywords the website should have and where to put them.
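For the curious, here's a rough Python sketch of what a "partial matching keywords ratio" metric could look like – my own guess at the calculation, with an invented phrase and paragraph:

```python
# Sketch: share of paragraph words that are words from the target phrase
# (partial matches). This is a hypothetical version of the "partial
# matching keywords ratio in paragraphs" factor, not Surfer's formula.
import re

def partial_match_ratio(text, phrase):
    """Fraction of words in `text` that appear in `phrase`, ignoring case."""
    keyword_words = set(phrase.lower().split())
    words = re.findall(r"[a-z]+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in keyword_words)
    return hits / len(words)

paragraph = ("Black Friday brings online deals every year, and online "
             "stores prepare for Friday traffic spikes.")
ratio = partial_match_ratio(paragraph, "black friday online")
print(round(ratio, 2))  # 0.33 – 5 of 15 words partially match the phrase
```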
I was very pleased with the results. The page, optimized for a highly commercial keyword, ranked at the 35th position on average right after indexing and ended up around 10th, with a lot of traffic.
A side effect of my actions was bringing up 271 visible keywords, as reported by Google Search Console.
Search Console is happy so the client is happy!
The page was created two weeks before Black Friday and ranked up rapidly. These results were possible to achieve only with the help of Surfer. Of course, I could prepare such a comparison by hand in Excel, but it would take ages to complete.