Beware of the Google Over-Optimisation Update/Penalty for SEO Tricks

Many webmasters have been left scared after Matt Cutts announced an upcoming “over-optimisation” update and/or penalty. People all over the world are now scrambling to clean up their acts while others consider giving up SEO altogether out of pure fear. Some SEO experts have dismissed the fear as irrational, while others tried to explain what the update might be about.

It’s difficult to find a proper definition of “over-optimisation”; it sounds like an oxymoron, as you usually can’t over-optimise something. Optimisation is a synonym for improvement. Can you improve your website too much? No. The term rather refers to a few common SEO tricks that have plagued the Web for years now.

Luckily, Matt Cutts has already given us a few hints about what he has in mind and what the Google engineers are currently working on.

In this post I’d like to explain what Google most probably considers over-optimisation, and then suggest long-term SEO strategies that will make you immune to this over-optimisation update and future updates of this sort.

Risky SEO tricks Cutts mentioned include:

Keyword stuffing

For years, misguided webmasters and even some SEO experts have been using so-called “keyword density” to measure site optimisation. According to the keyword density logic, the more frequently a keyword is mentioned on a page, the better.

Keyword density is a remnant from the ancient past of SEO. It stems from a time before Google, when search engines judged website relevance mainly by the sheer number of keyword instances on a page. When Google appeared on the Web, the game changed: from then on, links became the crucial factor. To this day, keywords do have to appear on a page to make it relevant to a query, but keyword density itself is a completely irrelevant metric.
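To make the metric concrete: keyword density is simply the share of a page’s words that match the target term. A minimal sketch (the function and the sample text are my own illustration, not anything Google uses):

```python
def keyword_density(text: str, keyword: str) -> float:
    """Percentage of words in `text` that exactly match `keyword` (case-insensitive)."""
    words = text.lower().split()
    if not words:
        return 0.0
    hits = words.count(keyword.lower())
    return 100.0 * hits / len(words)

# 3 occurrences of "seo" in a 100-word page -> 3% density.
page = "seo tips seo tools and other seo advice " + "filler " * 92
print(round(keyword_density(page, "seo"), 1))  # -> 3.0
```

The point of the section stands regardless of the number this returns: pushing it higher does not improve rankings and, past a point, invites a keyword-stuffing penalty.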

Of course, you can’t avoid mentioning the actual thing you write about. Still, many people assume that mentioning a term more often will improve their search engine rankings. Google has been penalising overtly keyword-stuffed pages for years; even last year’s Panda update targeted this SEO trick.

So how can Google step up its efforts here? Page titles are one candidate: to this day, many websites feature nothing but keywords and keyphrases there. In contrast, journalists and average-Joe webmasters who don’t overdo SEO use natural language or whole sentences.

Link exchanges

For years, link exchange, or simply swapping links, has been one of the most popular link building tactics: I link to you so that you link to me. Of course, this tactic went over the top and Google had to act. The search engine has repeatedly discounted so-called reciprocal links and automated link networks, among others. People came up with “three-way link exchanges” and similar techniques to get around the penalties, but reciprocal links haven’t been as popular since.

Still, link exchanges have remained one of the standard link building techniques that many people, especially average webmasters, have used over the years. Some sites still rank with such links.

So what can Google do about it that hasn’t been done yet? Google can strengthen its bias towards brand signals. Exchanged links are in many cases “unnatural”, to use Google’s own term. Either they look unnatural by themselves because they use exact-match anchor text, or the pattern in which they appear on the Web looks suspicious. Gaining too many links in a short period of time can seem unnatural as well.

Brands, in contrast, get links all the time, whenever someone refers to them, and the people linking to them rarely use optimised anchor texts. Just think about it: when you write about Amazon, do you link the word Amazon, or do you use “auction”, “auctions” or “auction deals” to link to it?
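One way to see why exchanged links stand out is to measure what share of a site’s inbound anchors are exact-match commercial phrases rather than brand names or URLs. A hypothetical sketch (the anchor list, the “money terms” set, and the function name are invented for illustration; Google’s actual signals are not public):

```python
def exact_match_share(anchors: list[str], money_terms: set[str]) -> float:
    """Fraction of inbound anchor texts that are exact-match commercial phrases."""
    if not anchors:
        return 0.0
    hits = sum(1 for a in anchors if a.lower() in money_terms)
    return hits / len(anchors)

# Invented sample: a natural profile is dominated by brand and URL anchors.
anchors = ["Amazon", "amazon.com", "this site", "Amazon",
           "cheap auction deals", "cheap auction deals", "Amazon", "click here"]
money = {"cheap auction deals", "auction deals"}
print(round(exact_match_share(anchors, money), 2))  # -> 0.25
```

A profile where that share is high for one exact phrase looks like link exchanging or buying; a brand’s profile naturally keeps it low.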

SEO issues Cutts probably also has in mind:

  • anchor text
  • unnatural link profiles

What is considered not over-optimised and thus runs no risk of a penalty? “Great content” on a “great site”. This is as superficial as it gets. Sadly, Google still does not disclose its ranking factors, unlike competitor Blekko, where you can look them up right below the search results. So we can only guess what Google considers great content or a great site, based on past explanations and SEO tests.

What can you do to stay on top of the game despite recurring updates?

Think about people, not search engines. Google is always trying to mimic searchers, so you should mimic search users as well, not Google, which is only mimicking them. Think about humans first, and only then consider the current ways in which search engines in general, or Google in particular, attempt to fulfil their wishes.

Over-optimisation always happens when you think about search spiders first and search engine users second or even forget the actual users.

 

Combine your SEO efforts with CRO (conversion rate optimisation) and user experience design. Make sure that the people who arrive at your site from search find what they seek. Landing page optimisation, A/B testing and user testing make your sites usable for humans almost as a side effect, so any dreaded over-optimisation gets cleaned up in the process. Readable copy, white space, usable forms, to-the-point descriptions and striking calls to action are some of the main elements of a successful conversion-oriented website.
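A/B testing in particular is cheap to reason about. A minimal sketch of the standard two-proportion z-test behind most A/B tools (the conversion counts below are invented; real tools add corrections for peeking and multiple variants):

```python
import math

def ab_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int) -> tuple[float, float]:
    """Two-proportion z-test; returns (z score, two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Invented example: variant B converts 2.6% vs. the original's 2.0%.
z, p = ab_z_test(200, 10_000, 260, 10_000)
print(round(z, 2), round(p, 4))
```

If the p-value is below your threshold (conventionally 0.05), the new landing page genuinely converts better and isn’t just noise.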

 

Consider user intent and ideally provide content and solutions for all three common user intents: navigational, informational and transactional. Make sure your users find exactly what they need as expressed in the search query. Make sure they find enough background information on it by providing a glossary, an FAQ and other educational content. Make sure they find the product or service, or a link to a matching product or service. Old-school SEO often ended the moment the user clicked on the search result, which led to “over-optimised” sites.

 

Optimise your site as if Google didn’t exist. Make sure all your visitors, whether direct, from social media or from referrals, find what they need. Try user personas and traffic segmentation to offer each audience what they are after. The returning visitor wants to know what’s new. The casual social media visitor wants to know why s/he should come back next time. The visitor coming from a referring site wants to find exactly what the other site suggested s/he would find.

 

Write for social media users, not just searchers. Most search users expect to find the keyword they entered into the search box on the page; they ignore fluff and hyperbole and focus solely on the keyword. Social media users are often different. They don’t look for a particular result they have in mind. They are after inspiration, something they don’t know yet, so they can’t even search for it. Some bloggers add adjectives like “awesome” just to make social media users notice their articles. Find the middle ground between SEO-friendly and enticing, human-readable writing, and do not reduce title tags and file names to bare keywords.


Are you afraid of the over-optimisation penalty? Do you care? Do we have to beware of the upcoming update or just keep calm and keep on doing SEO as usual? Please share your opinion in the comment section!

Images courtesy of the Blind Five Year Old and Bart Heird

Comments

  1. Miguel Salcido

    It will be mostly based on over-optimised anchor text. But here’s the kicker: over-optimised internal linking will count too, but only from an anchor text POV.

  2. Fred

    In the “old days”, keyword density meant “insert the words you wish to rank for as many times as you can”; now there are measurements. For example, if a page about SEO uses that word 3 times and there are 100 words on the page, the keyword density is (3/100)*100, or 3 percent. A typical keyword density is between 1 and 3% for a page.

  3. SEO

    Nowadays, Google updates its rules and policies regularly. I read the complete article, and I have heard from many people that link exchange is best, but I read here that link exchange is considered spam and is not effective nowadays.

    Now I am confused about where to go.

  4. Arbtech

    Massive update with huge implications. The question is, if G still uses links to inform its algorithm, how do you attract links if you’re writing for a customer base that doesn’t care about ‘infographics’ or video, or anything else worth linking to? Plenty of industries, e.g. sellers of tins of paint, have mundane products that sell on price/convenience/product quality (i.e. NOT on the quality of the web user experience), and for these businesses, the only way to get any links is to buy them. If Google really wants to stop people buying spammy links, it needs to stop using links altogether as a vote of popularity/quality. Not saying that’s easy, or that there is a viable alternative right now, but that’s the situation, people.