Is Negative SEO Real?
I’ve heard a lot recently about negative SEO and the possibility that a competitor or someone could sabotage our website and make us lose traffic. Is this true and how likely is it to happen? Can we do anything to guard against it happening to our site? All our new business comes through Google and our website.
Deeho Replies:
With the introduction of the Penguin algorithm, Google created the possibility of negative SEO.
Prior to Penguin, back-links either helped to move you up the search rankings or had no effect at all. This meant that you could safely generate links from anywhere and they couldn’t harm your rankings, traffic or site performance.
When Google created the concept of bad link neighbourhoods and began to penalise websites that had links from them, it introduced the possibility that a competitor, black hat or other evildoer could actually damage your website in search.
Google has a webspam team that actively hunt down link networks, bad neighbourhoods and any linking services that are manipulating their search results. When they identify a network, they not only remove that network from their index (which in itself would just remove any link benefit from your site) but they also apply penalties to the sites that they find benefiting from the network.
In the short term this works as negative SEO, as it instantly penalises your site and you will see a drop in the SERPs.
How Could Someone Negatively SEO Your Site?
Implementing negative SEO has never been easier and is becoming a major problem for webmasters.
There have been high-profile examples of link networks harming rankings, so the easiest way to instantly damage a competitor’s rankings would be to generate a large volume of links from networks that Google has recently penalised.
Google tends not to continue these penalties for long, because its index is theoretically ‘self-cleaning’. When they find a spam network, they apply a penalty to the sites found benefiting from it until either a/ the sites remove the links, b/ the sites disassociate themselves from the network, or c/ the network is de-indexed and its value, positive or negative, is negated over time.
Does this mean that a new site which starts to use a known linking network that Google has already penalised and de-indexed would be penalised? The jury is out on that one. Although we haven’t tested the theory, I would suspect that the links would have no value once the network has been de-indexed.
“Back-link rule #1: Build links from authority sites that are in Google’s index.”
More Negative SEO Strategies:
The main ways to damage your own rankings also work to harm your competitors’ rankings, so the areas you could theoretically manipulate are:
1/ Link Text Density
2/ Link Networks
3/ Paid Links
4/ Duplicate Content
In theory, all you would have to do is create a large volume of duplicate content on low-quality sites, all linking to the target with the same exact-match link text, in order to affect a competitor’s rankings.
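If you are worried about being targeted this way, the most practical safeguard is to keep an eye on your own back-link profile for sudden concentrations of identical, exact-match anchor text. The sketch below is an illustration rather than a prescription: it assumes you have exported your back-links to a CSV with an anchor_text column, and the 25% threshold is an arbitrary example figure you would tune to your own profile.

```python
import csv
from collections import Counter


def suspicious_anchors(csv_path, threshold=0.25):
    """Flag anchor texts that make up an unusually large share of all
    back-links -- the pattern typical of exact-match link text attacks.

    Assumes a CSV export with an 'anchor_text' column; the column name
    and the threshold are illustrative, not a standard format.
    """
    counts = Counter()
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            anchor = row["anchor_text"].strip().lower()
            if anchor:
                counts[anchor] += 1

    total = sum(counts.values())
    if total == 0:
        return []

    # Return (anchor, count, share of profile) for anchors over the threshold.
    return [
        (anchor, count, count / total)
        for anchor, count in counts.most_common()
        if count / total >= threshold
    ]


if __name__ == "__main__":
    for anchor, count, share in suspicious_anchors("backlinks.csv"):
        print(f"{anchor!r}: {count} links ({share:.0%} of profile)")
```

Anything a check like this flags is worth reviewing by hand; if the links turn out to be ones you never built, document them before deciding how to respond.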
To date there haven’t been many high profile examples of negative SEO, but the practice does exist. See Rand Fishkin’s SEO post here, or SEO Round Table’s and SEOBook’s thoughts here.
Negative SEO is going to become more of an issue unless Google’s spam team addresses the problem. That isn’t easy for them, however, because they want to be able to apply negative ranking signals to sites that persist in consorting with undesirable rank-manipulating sites online.
Without that element of the Penguin algorithm, they would be reverting to pre-Penguin rankings, where low-quality links had no value rather than being harmful.
From Google’s point of view, the only way they can focus website owners’ minds on playing nicely and abiding by their stringent rules and best practice is to continue to apply penalties to sites that persist in trying to improve their rankings through manipulation.
We can expect to continue to see more networks penalised as Google isolates and de-indexes them. This is an ongoing issue for the web spam team and there is no chance that they are going to change tack and abandon this strategy.
By the same token, the SEO industry will continue to create new, more covert networks in the hope that Google can’t find them.
The reason that Google hates link networks so much is that they work as a low-cost and effective method of ranking manipulation.
From an SEO perspective, it is always worth remembering that:
“Google works hardest to catch webmasters manipulating the areas that have the biggest impact on rankings.”
Ask yourself why Google dedicates so much time, money and effort to finding and removing link networks from its index. Why is it penalising guest blog networks? Why is it focusing on these areas if they don’t have a significant impact on its SERPs?
In an ideal world, Google would flick a switch and instantly rule out specific types of links, but because of the nature of guest blogging, for example, it can’t.
If you obtain an editorial link on a high-authority themed blog, it is a good link to have; to the untrained eye, the same link on a networked guest blog looks just as valid, yet it is cheaper and faster to create.
The only way Google can differentiate between the two is to identify the blogging network and then penalise it, otherwise it would have to penalise every blog in its index.
For SEO companies, the challenge is how to create an undetectable network that will have longevity.
The biggest issue then becomes finding customers to use the network without attracting Google’s attention. If members of the public can find your network to join it, then Google can find it too.
The issue of negative SEO is not going to go away and I believe it will become more of an issue going forward.
What Google will do to address the issue is anyone’s guess at present, but with so much at stake for top rankings, there will always be webmasters who are prepared to do anything to improve their search engine traffic.