You know, getting a website to the top of Google (especially a young website) is a long and winding road, and only a few owners are actually ready and willing to go all the way with one and only freelancer (me). So, as a rule, they just give me a small task covering one side of the process. Below are the types of work I usually handle.

Technical audit

The aim of a technical audit is to make sure nothing prevents your website from being indexed and ranked by Google. It covers basic HTML correctness, speed, and mobile friendliness, plus the sitemap and robots.txt, though as a rule these last two are not critical – Googlebot is able to crawl your website on its own.
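One audit step can be sketched in code: checking whether robots.txt accidentally blocks Googlebot from a page. This is a minimal illustration using Python's standard library; the rules and paths are invented examples, not taken from any real site.

```python
from urllib.robotparser import RobotFileParser

def is_crawlable(robots_txt: str, path: str, agent: str = "Googlebot") -> bool:
    """Return True if the given robots.txt text allows `agent` to fetch `path`."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(agent, path)

# Example rules: everything is open except the /admin/ area.
rules = """User-agent: *
Disallow: /admin/
"""

print(is_crawlable(rules, "/blog/post"))    # a normal page is crawlable
print(is_crawlable(rules, "/admin/login"))  # the admin area is blocked
```

In a real audit the robots.txt would of course be fetched from the live site, and every important URL pattern would be checked against it.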

The quality of a website is reflected in Google Search Console – that is why I always recommend submitting a website to GSC right after the audit is complete.

One more point is to install a Google Analytics tag – it's useful to know how many visitors your website attracts, how they found you, and how they behave.

Last but not least, there are some unlisted kinds of work I will still handle: schema markup and Google snippets (the way your website appears on the SERP – search engine results page), or persuading Google to index more of your pages than it currently does.
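To make the schema point concrete, here is a hedged sketch of generating schema.org JSON-LD markup for a product rich snippet. Every field value below is a made-up placeholder; the `@context`/`@type` keys follow the schema.org JSON-LD convention.

```python
import json

# Placeholder product data for illustration only.
snippet = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Netbook Upgrade Kit",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.7",
        "reviewCount": "128",
    },
}

# Embed the structured data the way it is placed in a page's HTML.
markup = '<script type="application/ld+json">' + json.dumps(snippet) + "</script>"
print(markup)
```

Markup like this is what lets Google show star ratings and other rich elements directly in the snippet.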

Keyword research

In fact, this is the starting point of any website – it should actually come before you even launch the development phase. Every keyword has its own traffic within the target region, which tells you how many organic users your website will get IF it ranks in Google's top 10 for that keyword in that region.

The best place to start is Google Ads Keyword Planner, which is free (though it won't provide actual traffic data until you run an ad campaign – only a range: 10–100, 100–1K, etc.). The process is quite simple: enter a competitor's website and target region (your town, province, country, or even the whole world) and collect keyword ideas. For a town, the 10–100 range is quite good, though for a country it isn't enough.
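A small sketch of how those coarse ranges can still be used for sorting: map each range string to a rough midpoint estimate. The range labels follow Keyword Planner's format; the keywords and mapping values are invented examples.

```python
# Rough midpoint estimates for Keyword Planner's range labels (assumption).
range_estimate = {"10-100": 55, "100-1K": 550, "1K-10K": 5500}

# Invented example keywords with the range Keyword Planner reported.
keywords = [("pc upgrade near me", "10-100"), ("pc upgrade", "100-1K")]

# Sort keywords by their estimated traffic, highest first.
estimated = sorted(((kw, range_estimate[r]) for kw, r in keywords),
                   key=lambda pair: pair[1], reverse=True)
print(estimated)  # [('pc upgrade', 550), ('pc upgrade near me', 55)]
```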

To get exact traffic data, the best option is to register for a trial period with an online service. Keyword difficulty also matters (how hard it is to achieve a high ranking for the keyword, i.e. how competitive it is). Don't confuse it with Google Ads competitiveness – that relates solely to ads on the SERP.

The target keywords will define the whole future website – its URLs/structure, meta data, the content itself, and the anchor text of backlinks. Ideally, the domain name contains the primary keywords – like mine (:

SEO optimization (on-site SEO)

This is the substantial phase of getting your website to the top of Google. Usually the website owner realizes this phase is needed months after the website was launched, when there is still no organic traffic – which is natural, because the site is not ranked even in the top 100.

Basically, at this step I make sure your website's elements contain the required keywords and related/LSI queries in the required quantity. Strictly speaking, Google has not distinguished between keywords and LSI queries since at least 2019. The simplest way to find out what is really needed is to analyze the competitors ranked at the top of Google for your target keywords.

The target keywords are then mapped against pages, so it is clear which keyword each page should be optimized for. Often the website owner has no idea about the target keywords when starting this phase. Many owners create a bunch of tags for a handful of pages/posts, most of which are irrelevant and should be deleted during the SEO optimization phase. And vice versa: relevant blog categories may be missing and should be created anew.
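The mapping step itself is trivial to represent; what matters is spotting the gaps. Here is a sketch where all keywords and URLs are invented examples: a keyword left unmapped means a page still has to be created (or an existing one re-optimized).

```python
# Invented example: which page is optimized for which target keyword.
keyword_map = {
    "/": "seo freelancer",
    "/services/audit": "technical seo audit",
    "/services/keywords": "keyword research",
}

# The full set of target keywords from the keyword research phase.
target_keywords = {"seo freelancer", "technical seo audit",
                   "keyword research", "backlink building"}

# Keywords that still need a page assigned to them.
unmapped = target_keywords - set(keyword_map.values())
print(unmapped)  # {'backlink building'}
```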

Again, SEO optimization touches URLs/website structure; the meta title, description, and img alt attributes; the content itself, including H tags; plus backlink anchor text (the latter belongs to backlink building, which as a rule comes after optimization).

Backlink building

A website's value consists of two parts: the website itself and the links to it published on other sites. The more backlinks are published, the more relevant their anchor text, and the more relevant the sites they appear on, the higher the authority of the linked domain – and vice versa. That is why every website owner should know about the Google disavow tool.

A simple analogy from real life: you want to upgrade your netbook, and there is a handful of PC upgrade shops in your neighborhood. What you do is collect their users' reviews and figure out which one is the best – in fact, you are calculating their authority. The same goes for your website, except now it is Google that calculates your website's authority, and the users' reviews are the backlinks to your website.

Which domains are the best for publishing backlinks? The relevant ones. You may use free website catalogs, but they won't get you to the top of Google. As with SEO optimization, it makes sense to look at the ranked competitors' backlinks and try to imitate them. You can use the Megaindex link analysis tool, SEMrush, or anything similar.

Which domains are the worst? The ones related to viagra, easy money, porn, etc. – unless, of course, your website actually belongs to one of those niches.
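Links from domains like these are exactly what the disavow tool is for. Here is a sketch of building a disavow file; the domains and URL are placeholders, while the one-entry-per-line format with the `domain:` prefix follows Google's disavow file specification.

```python
# Placeholder spammy backlink sources found in a backlink export.
bad_domains = ["spam-pills.example", "easy-money.example"]
bad_urls = ["https://badblog.example/post-1"]

# Build the file: comments start with '#', whole domains use "domain:",
# individual pages are listed as full URLs.
lines = ["# disavow file generated for example.com"]
lines += [f"domain:{d}" for d in bad_domains]
lines += bad_urls
disavow_file = "\n".join(lines)
print(disavow_file)
```

The resulting text file is then uploaded via the disavow tool in Google Search Console.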

What about anchor text? Again, check your competitors. There are a few types of anchor text used most often:

  • the domain name itself
  • an image (the alt text serves as the anchor)
  • empty anchor
  • an anchor containing the keyword (directly or indirectly) or related to it, say "read more on keyword"

And last but definitely not least: the proportion of the different anchor types matters as well. The best approach is to copy the ranked competitors.
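Measuring those proportions from a competitor's backlink export can be sketched like this. The anchor list and the classification rules are invented examples of what such an export might contain; a real report would need more careful classification.

```python
from collections import Counter

# Invented sample of anchor texts pulled from a backlink export.
anchors = ["example.com", "example.com", "", "image", "best seo service", ""]

def classify(anchor: str) -> str:
    """Very rough anchor-type classification for illustration."""
    if anchor == "":
        return "empty"
    if anchor == "image":
        return "image"
    if "." in anchor:       # looks like a bare domain name
        return "domain"
    return "keyword"

counts = Counter(classify(a) for a in anchors)
total = len(anchors)
proportions = {kind: round(n / total, 2) for kind, n in counts.items()}
print(proportions)
```

Comparing your own distribution against a ranked competitor's shows which anchor types you are over- or under-using.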

Competitor Analysis

In fact, this is an analysis of the ranked competitors found in the SERP for your target keyword. The aim is to find out why Google ranked them at the top.

Initially, one has to identify the relevant competitors, skipping portals and Wikipedia-like monsters. Then it makes sense to short-list candidates by number of pages/posts and domain authority. Say, if your website is young, it is better to skip the established authorities and concentrate on equally young/lean ones.

The most important points are the content (number and quality of pages) and the quantity of backlinks (plus the relevancy of donor domains). It also matters how often new blog posts are published, whether their content is unique, and whether blog posts link to website pages. The backlinks will show you which anchor text is used and which types of donor domains (directories, forums, blogs, etc.) are present. The URL structure and meta data should also be taken into consideration.
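The short-listing step above can be sketched as a simple filter. All competitor data here is invented, and the thresholds are arbitrary assumptions for illustration, not recommended values.

```python
# Invented competitor data: domain, page count, domain authority.
competitors = [
    {"domain": "giant-portal.example", "pages": 50000, "authority": 90},
    {"domain": "lean-rival.example",   "pages": 40,    "authority": 12},
    {"domain": "midsize.example",      "pages": 300,   "authority": 35},
]

def shortlist(sites, max_authority=40, max_pages=1000):
    """Keep young/lean competitors comparable to a new website."""
    return [s["domain"] for s in sites
            if s["authority"] <= max_authority and s["pages"] <= max_pages]

print(shortlist(competitors))  # ['lean-rival.example', 'midsize.example']
```

The giant portal drops out, leaving only sites whose ranking recipe a young website can realistically imitate.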

Basically, it's always useful to learn from the best examples!

Language localization

Nowadays more and more Western companies are eyeing the Russian market. Some are brave enough to enter it, which is impossible without a proper Russian website. That is where they hire me to help with Russian keywords. You know, beyond the mysterious Cyrillic alphabet, Russians use words that are completely different from English ones.

So my work is to find Russian equivalents for English keywords, including traffic stats for both Google and Yandex. The latter generates comparable traffic, and obtaining its traffic stats manually is really not easy.

Which sources do I use? Competitors most of all – both Russian sites with an English mirror and foreign sites with a Russian mirror. Keyword extraction tools allow collecting every possible piece of valuable data; one only has to filter it very carefully, excluding irrelevant pieces using categories and stop-words.
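The stop-word filtering step can be sketched in a few lines. The stop-words and candidate keywords are invented English examples; the same filter applies unchanged to Russian or any other language.

```python
# Invented stop-words marking irrelevant intent (freebie hunters, piracy).
stop_words = {"free", "torrent"}

# Invented candidate keywords collected from competitors' mirrors.
candidates = ["buy detergent", "free detergent samples", "detergent price"]

# Keep only keywords that share no word with the stop-word list.
kept = [k for k in candidates if not (set(k.split()) & stop_words)]
print(kept)  # ['buy detergent', 'detergent price']
```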

The same principle works fine with any pair of languages: English–German, English–French, English–Spanish, etc. You don't need to speak German or French to do this work – basic knowledge is quite enough.

I have done keyword localization for such world-known companies as Pfizer, Pirelli, and Henkel.