
Has Google Found the Answer to Negative SEO?

by admin on 20 July, 2012 at 17:33

A couple of days ago, Matt Cutts (head of Google’s webspam team) announced a new feature in Google Webmaster Tools that enables webmasters to download the most recent links to their website. Links can be downloaded as a CSV or Google Docs file from the “Links to your site” tab.


What does this mean for site owners and SEOs?

It is clearly a significant development on several levels.

Firstly, we now know exactly which links Google is finding. The ‘date’ column also tells us when the links were found – at this stage, to the day rather than the hour, but this is useful nonetheless.

There are of course existing tools that have enabled us to get a pretty good idea of this: Open Site Explorer, Majestic, and even free sites such as Ahrefs and Blekko. But none of these services has an index that is as large or indeed as fresh as Google’s. So far, we have only had access to ‘indicative’ rather than ‘exact’ data.
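If you would rather not eyeball the export row by row, a short script can do the summarising for you. Below is a minimal sketch in Python that reads the downloaded CSV and reports link velocity (new links found per day) and the domains the links come from; the column names (‘Links’ and ‘First discovered’), the date format and the file name are assumptions here, so adjust them to match the actual export.

```python
# A minimal sketch: summarise the GWT "latest links" CSV export.
# The column names ("Links", "First discovered") are assumptions -
# check them against the header row of your own download.

import csv
from collections import Counter
from urllib.parse import urlparse

def summarise_links(path):
    per_day = Counter()     # new links found per day (link velocity)
    per_domain = Counter()  # linking domains (a rough quality check)

    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            url = row.get("Links", "")
            found = row.get("First discovered", "")
            if url:
                per_domain[urlparse(url).netloc] += 1
            if found:
                per_day[found] += 1

    return per_day, per_domain

if __name__ == "__main__":
    per_day, per_domain = summarise_links("latest_links.csv")

    print("New links per day:")
    for day, count in sorted(per_day.items()):
        print(f"  {day}: {count}")

    print("Top linking domains:")
    for domain, count in per_domain.most_common(20):
        print(f"  {domain}: {count}")
```

A sudden spike in the daily counts, or a cluster of domains you have never heard of, is exactly the kind of pattern you would want to spot before Google does.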

This is all fantastic stuff. However, the potential effects of the recent links tool could be far more significant – it could help Google to solve a problem that has caused a great deal of anxious chatter in forums over the last few months:


Is this going to help Google kill “Negative SEO”?

Google is giving us this information for a reason. Clearly they want webmasters to know what kind of links are pointing to their site, where they come from, and at what velocity they are being built. Webmasters who remain vigilant can now spot when their site is being subjected to a vicious spam link campaign. And here’s the crucial bit: when they do identify such activity, they can take evasive action, perhaps in the form of a reconsideration request before Google sends out the dreaded ‘unnatural links’ message.

How would this end negative SEO?  Well, site owners, webmasters and SEOs are not going to report links that they have built on purpose.  They will report suspect links that they have not built.

Many people have talked about having a ‘disavow link’ option in GWT as a solution to negative SEO. However, the problem with this idea is that it provides a get-out clause for people willingly building dodgy links. Some have suggested a ‘three strikes and you’re out’ approach, whereby you get to disavow links the first two times you receive an unnatural links message, but a penalty is applied on the third. That does not particularly help websites that are subjected to a sustained campaign, so it is only marginally better than the current situation.

Giving webmasters the chance to actually see links, evaluate their quality and quantity, and then report spam before Google gives you the heads-up will surely go some way towards removing the opportunity for abuse. If you do not report the links within, say, a two-week period, Google then has grounds to suspect that you intended to build manipulative links. You then go through the existing reconsideration request process.

If this were the case, the new ‘download recent links’ tool would provide innocent webmasters with an additional buffer between the links being built and having to go through their backlink profile manually to remove them.


The Losers

It’s not a perfect solution – that would mean no innocent parties get hit. The victims here will be those who are not vigilant, either because they do not use Webmaster Tools – like the site described here – or because they simply do not bother to check. Sorry to have to say this, and please don’t launch a physical or verbal attack… but that’s kind of tough luck! If you do not use the tools and information Google has given you, then you can expect to surrender your right to complain when you do get hit. You need to be savvy to run a successful and fully protected website these days.

But what would you rather do? Surely the easier option is checking GWT once in a while for a) the number of links that are being built and b) the quality of the sites they come from. That requires far less ‘savviness’ than trying to understand why your site has plummeted unexpectedly, identifying problematic links, and then going through the long process of having them removed.


The Winners

The winners are ultimately the sites that build high-quality links from relevant resources (or just don’t build them at all!) and are also vigilant in checking Webmaster Tools. You might argue that they deserve to be saved, and that other sites should aspire to conduct themselves in a similar manner. Is that not better than leaving everybody at risk of negative SEO?

Let us know what you think by Tweeting @ampedsocial.