Several months ago I decided to test the long-term capacity of Surfer (a data-driven SEO tool that I co-create). My goal was to check whether the changes in rankings, which are usually observable within several days of implementing an optimisation, would hold over time, and how they would translate into real-life clicks.
The test consisted of using Surfer for on-site optimisation as well as for finding the most attractive spots for link building.
I used my finance- and education-related website, progresywni.eu. In its first iteration, the website was a landing page for my society, which organised events for young people. We did not do SEO; our content was aimed at social media users. As a result of our statutory activities, several dozen entries were created which, despite their high-quality substance, generated minimal organic traffic.
How was the experiment conducted?
1) Technical adjustments
First, I made some technical adjustments to the website:
- Photo compression;
- Internal linking between the entries;
- Installing a plug-in that optimises page loading speed.
At the end of December I also moved to hosting dedicated to WordPress, which took the website offline for several days and, in turn, slightly disrupted the end results.
The website had the Yoast SEO plug-in installed, thanks to which I reduced thin content, i.e. tag subpages, which in the case of this website added no quality but consumed the search engine's crawl budget.
Importantly, the changes* were introduced several months before the optimisation began, in order to eliminate any chance of the technical adjustments influencing the results.
*some internal links were, of course, filled in while optimising the entries
2) Choosing the entries
Out of all available entries, I chose several that had the potential to rank for high-volume phrases. Moreover, during the experiment I commissioned several additional entries, which I also optimised.
The phrases I chose (translated from Polish):
- How to learn?
- How to develop?
- How to focus?
- How does an ATM work?
- Where to exchange a damaged banknote?
- How to relax?
- Where to buy a paysafecard?
- When will the transfer arrive?
- How to calculate investment profit?
- How to organise your day?
- How to have more energy?
- How to save money?
- SEPA or SWIFT?
- Identifying a bank by the account number.
The entries were optimised in accordance with Surfer's guidelines, which boiled down to increasing or decreasing the number of words, editing headlines, adding partial-match keywords and, most importantly, eliminating the content gap by complementing the content with common elements such as words and phrases. I write more about this topic below.
3) External factors
During the experiment I deliberately acquired one dofollow link from SurferSEO.com and two nofollow links. It is worth mentioning that several education-related domains linked to progresywni.eu organically.
An example of the domains linking to progresywni.eu:
How was the optimisation of a given entry conducted?
Long story short (for readers eager to take action):
- Choosing the main phrase I wanted to rank for.
- Analysing the competition in Surfer.
- Identifying the biggest problems with my subpage.
- Implementing the optimisation.
- Submitting the page to the Google index.
- Possibly acquiring links (this happened for only one entry).
- Possibly adjusting further (i.e. adding more common elements to the content while maintaining the high quality of the page's substance; this happened three times).
- Analysing the traffic and visibility.
Read more about common elements in this comparison: TF-IDF for SEO and its alternatives - Prominent Words and Phrases.
Selecting the main phrase
When choosing the best phrase, I was guided by volume and competitiveness. By competitiveness I mean the quality of the top 10 subpages and the number of links they have. Using these two metrics, you can knowingly define your priorities.
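To make this prioritisation concrete, it can be sketched as a simple scoring heuristic. The formula, the weights, and the sample numbers below are my own illustrative assumptions, not Surfer's methodology or data from the experiment:

```python
# Hypothetical sketch: rank candidate phrases by volume vs. competitiveness.
# The scoring formula and sample numbers are illustrative assumptions only.

def priority_score(volume, avg_top10_links, content_quality):
    """Higher volume raises priority; stronger competition lowers it.

    volume           -- monthly search volume of the phrase
    avg_top10_links  -- average number of links to the top 10 results
    content_quality  -- subjective 1-10 rating of the top 10 content
    """
    competitiveness = avg_top10_links * content_quality
    return volume / (1 + competitiveness)

# (phrase, volume, avg_top10_links, content_quality) -- made-up examples
candidates = [
    ("how to relax", 4400, 35, 6),
    ("when will the transfer arrive", 1900, 12, 4),
    ("how to save money", 8100, 80, 8),
]

ranked = sorted(candidates, key=lambda c: priority_score(*c[1:]), reverse=True)
for phrase, *_ in ranked:
    print(phrase)
```

With these made-up inputs, the low-competition phrase wins despite its lower volume, which is exactly the trade-off the two metrics let you make deliberately.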
Analysing the competition in Surfer
Next, I analysed the chosen phrase in Surfer. I will show you how I did this using the example of an entry that is only just gaining traction online.
Originally, the optimised subpages did not rank in the top 50 for the chosen phrases, so I used the Custom URL feature, which enables a comparative analysis against your most successful competitors. If your page ranks in the top 50, you can use the “Filter by URL” feature.
Let’s begin with a quick audit
The Audit contains simplified optimisation guidelines with a digest of the common elements that have not yet been taken into account in the content. This is very important because it gives us information only about the words or phrases we have omitted (black text).
As you can see in the screenshot, the analysed subpage has no errors, since they were eliminated during the optimisation process.
Check out an exemplary Audit with some errors to eliminate here.
The Audit is generated on the basis of the first five visible URL addresses in a given SERP. We can adjust it to our needs, e.g. by eliminating unwanted URLs from the SERP.
It happens that the top 5 contains pages that are not our competitors and thus distort the values of the factors. Such pages are usually aggregators of ads and offers (eBay, Amazon) or strong domains (Wikipedia, The Times). To adjust the audit to your preferences, you can eliminate all pages from the analysis and then click the “eye” symbol next to the pages of your choosing.
If implementing the audit's recommendations translated into satisfactory growth in rankings, I did not make any further adjustments. In every other case (e.g. the entry about “how to relax”) I had to conduct a detailed SERP analysis.
It happened that optimising a single factor (e.g. the number of occurrences of a particular word from the key phrase) influenced the growth in rankings.
Assessing the problems and optimising
Once I had prepared the list of issues to amend, I moved on to implementation. I added the chosen words, reduced the density of the key phrases, or added more graphics. Each entry required a totally different set of changes: every SERP (results page) is different, so the optimal value of the factors differs for every phrase. That is why it is worthwhile to run a separate analysis for every phrase you want to rank for.
(“słów” is Polish for “words”)
Note that the optimal density of partial-match keywords can be completely different across four SERPs in the same niche.
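As a rough illustration of what “keyword density” means here, density is the share of keyword occurrences in the total word count. This is a generic sketch with a made-up sample text; Surfer's own counting rules (word forms, partial matches) may well differ:

```python
import re

def keyword_density(text, keyword):
    """Return keyword occurrences per 100 words.

    A simplified definition for illustration; real SEO tools may count
    inflected word forms and partial matches differently.
    """
    words = re.findall(r"\w+", text.lower())
    kw_parts = keyword.lower().split()
    n = len(kw_parts)
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == kw_parts)
    return 100.0 * hits / len(words) if words else 0.0

# Made-up sample: 3 occurrences of "relax" in 12 words -> 25 per 100 words.
sample = "How to relax? To relax well, schedule time to relax every day."
print(round(keyword_density(sample, "relax"), 1))
```

The point of the paragraph above is that there is no universally “correct” value of this number; the target comes from measuring the pages that already rank in the given SERP.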
The last step, which I took just after publishing the new versions of the entries, was submitting a request in Google Search Console to index the subpages again.
As I mentioned before, for one of the entries I acquired two nofollow links from pages ranking in the same SERP.
The plan was clear: find the pages from which a large number of external links originate. I reasoned that if others had succeeded there, I would also be able to acquire a reference.
In the case of the “when will the transfer arrive” phrase, such links could be obtained on a financial forum and under an article posted on one of the local websites, in a comment with the optimal anchor.
Finding such places with Surfer is super easy. Just go to the “Links” tab and check “External” or “External Nofollow”.
Commissioning a new entry
For entries created during the experiment, it was important to me that the content was optimised upfront. Before commissioning an entry, I prepared a draft list of guidelines, which included the content length, the recommended topic, keywords, etc.
I published the content and, using the Yoast SEO plug-in, disabled indexing of the particular entries by search engines, so I could make final content adjustments before submitting them to Google Search Console and re-enabling indexing.
What are the end results of the experiment?
(46,500 × 100%) / 1,560 ≈ 2,980%
As you can see, the blog, which previously generated almost no clicks, began generating more and more organic traffic from May 2018 onwards.
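The growth figure above can be verified with a quick calculation. The click counts are taken from the article; which exact periods the "before" and "after" counts cover is my assumption:

```python
# Verify the article's growth figure: 46,500 clicks against a 1,560-click
# baseline. The interpretation of the two periods is an assumption.
clicks_after = 46_500
clicks_before = 1_560

growth_pct = clicks_after * 100 / clicks_before
print(int(growth_pct))  # truncated to whole percent
```

Truncated to a whole percent this gives the roughly 2,980% stated above.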
The costs of the experiment
It is hard to estimate the value of the time spent. Optimising a single entry took no more than an hour. Adding the selection of key phrases, occasional link building, and commissioning texts from copywriters, the whole experiment took approx. 20 hours of work.
Financially, excluding the cost of the server ($25/year), I invested 242 euros in copywriting. The rest of the entries optimised in this experiment were already available on the website, and as they were written by members of the non-profit organisation, it is not possible to estimate their cost.
Thoughts and plans
The end results were quite satisfying, especially considering the financial input and time spent. I already have some ideas for another test and a case study on the English-speaking market.
I am curious about your opinions! I hope the time you spent reading this article pays off when you apply our know-how.
Author: Tomasz Niezgoda
E-mail me! [email protected]